A Neural Network library written in Rust.
Test 1: Training on the MNIST data set with the following configuration (a ReLU/softmax sketch follows this list):
- Two hidden layers with 50 units each (ReLU activation)
- Softmax classifier
- Minibatch size 200
- Learning rate 0.05
- No regularization
- Minibatch optimizer
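
For reference, here is a minimal, self-contained sketch of the ReLU activation and the softmax classifier used in these tests. It is plain Rust, independent of this library's actual API:

```rust
/// ReLU applied element-wise: max(0, x).
fn relu(x: &[f64]) -> Vec<f64> {
    x.iter().map(|&v| v.max(0.0)).collect()
}

/// Numerically stable softmax: subtract the max logit before exponentiating.
fn softmax(logits: &[f64]) -> Vec<f64> {
    let max = logits.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = logits.iter().map(|&v| (v - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

fn main() {
    let hidden = relu(&[-1.0, 0.5, 2.0]);
    let probs = softmax(&hidden);
    println!("hidden = {:?}, class probabilities = {:?}", hidden, probs);
}
```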
Test 2: Training on the MNIST data set with the following configuration:
- Two hidden layers with 50 units each (ReLU activation)
- Softmax classifier
- Minibatch size 200
- Learning rate 0.01
- No regularization
- Adam optimizer (a sketch of the Adam update follows this list)
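
The Adam update used in Test 2 follows the standard formulation. Below is a minimal sketch with the usual defaults (beta1 = 0.9, beta2 = 0.999, eps = 1e-8); these defaults and the code itself are a generic illustration, not this library's internal implementation:

```rust
/// One Adam optimizer state for a single parameter vector.
struct Adam {
    lr: f64,
    beta1: f64,
    beta2: f64,
    eps: f64,
    m: Vec<f64>, // first-moment (mean) estimate
    v: Vec<f64>, // second-moment (uncentered variance) estimate
    t: u64,      // time step, used for bias correction
}

impl Adam {
    fn new(lr: f64, dim: usize) -> Self {
        Adam { lr, beta1: 0.9, beta2: 0.999, eps: 1e-8, m: vec![0.0; dim], v: vec![0.0; dim], t: 0 }
    }

    /// Apply one Adam step to `params` given the current `grads`.
    fn step(&mut self, params: &mut [f64], grads: &[f64]) {
        self.t += 1;
        for i in 0..params.len() {
            self.m[i] = self.beta1 * self.m[i] + (1.0 - self.beta1) * grads[i];
            self.v[i] = self.beta2 * self.v[i] + (1.0 - self.beta2) * grads[i] * grads[i];
            // Bias-corrected moment estimates.
            let m_hat = self.m[i] / (1.0 - self.beta1.powi(self.t as i32));
            let v_hat = self.v[i] / (1.0 - self.beta2.powi(self.t as i32));
            params[i] -= self.lr * m_hat / (v_hat.sqrt() + self.eps);
        }
    }
}

fn main() {
    let mut adam = Adam::new(0.01, 2);
    let mut weights = vec![0.5, -0.3];
    adam.step(&mut weights, &[0.1, -0.2]);
    println!("updated weights = {:?}", weights);
}
```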
Test results
- With the Adam optimizer we reach ~95% accuracy after ~385 iterations
- Without Adam we reach ~93% accuracy after ~1200 iterations
Thus the Adam optimizer reaches a slightly higher accuracy roughly 3.1 times faster (1200 / 385 ≈ 3.1).
The library supports:
- L2 regularization (a sketch of the L2 gradient term follows this list)
- Minibatch, Momentum, RMSProp and Adam optimizers
- Only Dense (fully connected) layers so far
- No GPU acceleration yet
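
As an illustration of how L2 regularization typically enters training, the sketch below adds the standard penalty gradient (lambda * w) to the data gradient. This is a generic example, not this library's internal implementation:

```rust
/// Adds the L2 penalty gradient (lambda * w) to each weight gradient in place
/// and returns the penalty term (lambda / 2) * ||w||^2 that would be added to the loss.
fn apply_l2(weights: &[f64], grads: &mut [f64], lambda: f64) -> f64 {
    let mut penalty = 0.0;
    for (g, &w) in grads.iter_mut().zip(weights.iter()) {
        *g += lambda * w;
        penalty += 0.5 * lambda * w * w;
    }
    penalty
}

fn main() {
    let weights = vec![0.5, -1.0];
    let mut grads = vec![0.1, 0.2];
    let penalty = apply_l2(&weights, &mut grads, 0.01);
    println!("regularized grads = {:?}, loss penalty = {}", grads, penalty);
}
```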