I wrote this toy tensor library for practice/fun after work. The Tensor class is built on numpy, and I implemented autograd on top of it. As of right now I've defined enough classes to train a simple fully-connected MNIST classifier.
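To give a feel for the idea (not the library's actual API -- names like `Tensor`, `_parents`, and `_backward` here are illustrative assumptions), a numpy-backed autograd tensor can be sketched roughly like this:

```python
import numpy as np

class Tensor:
    """Minimal sketch of a numpy-backed tensor with reverse-mode autograd."""
    def __init__(self, data, _parents=()):
        self.data = np.asarray(data, dtype=np.float64)
        self.grad = np.zeros_like(self.data)
        self._parents = _parents
        self._backward = lambda: None  # set by the op that produced this tensor

    def __add__(self, other):
        out = Tensor(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __matmul__(self, other):
        out = Tensor(self.data @ other.data, (self, other))
        def _backward():
            self.grad += out.grad @ other.data.T
            other.grad += self.data.T @ out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(t):
            if id(t) not in seen:
                seen.add(id(t))
                for p in t._parents:
                    visit(p)
                order.append(t)
        visit(self)
        self.grad = np.ones_like(self.data)  # d(out)/d(out) = 1
        for t in reversed(order):
            t._backward()

# Usage: y = x @ w, then y.backward() fills x.grad and w.grad.
x = Tensor([[1.0, 2.0]])
w = Tensor([[3.0], [4.0]])
y = x @ w      # y.data == [[11.0]]
y.backward()   # x.grad == [[3.0, 4.0]], w.grad == [[1.0], [2.0]]
```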
The library ships with SGD and Adam optimizers and two loss functions.
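As a rough sketch of the standard update rules (again assuming parameters expose `.data` and `.grad` like the hypothetical `Tensor` above, not necessarily the library's real optimizer classes):

```python
import numpy as np

class SGD:
    """Plain SGD over a list of parameters with .data and .grad."""
    def __init__(self, params, lr=0.01):
        self.params, self.lr = list(params), lr

    def step(self):
        for p in self.params:
            p.data -= self.lr * p.grad

    def zero_grad(self):
        for p in self.params:
            p.grad = np.zeros_like(p.data)

class Adam:
    """Adam: per-parameter first/second moment estimates with bias correction."""
    def __init__(self, params, lr=0.001, betas=(0.9, 0.999), eps=1e-8):
        self.params, self.lr = list(params), lr
        self.b1, self.b2, self.eps = betas[0], betas[1], eps
        self.t = 0
        self.m = [np.zeros_like(p.data) for p in self.params]
        self.v = [np.zeros_like(p.data) for p in self.params]

    def step(self):
        self.t += 1
        for i, p in enumerate(self.params):
            self.m[i] = self.b1 * self.m[i] + (1 - self.b1) * p.grad
            self.v[i] = self.b2 * self.v[i] + (1 - self.b2) * p.grad ** 2
            m_hat = self.m[i] / (1 - self.b1 ** self.t)
            v_hat = self.v[i] / (1 - self.b2 ** self.t)
            p.data -= self.lr * m_hat / (np.sqrt(v_hat) + self.eps)
```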
You'll never need PyTorch again!
TODO:
- Add CNNs
- Add BatchNorm
- Add Dropout
- Probably a million other features