This repository contains implementations of various optimization algorithms that you might have encountered while using Keras or [PyTorch](https://pytorch.org/docs/stable/optim.html).
It is an attempt to write these algorithms from scratch in a way that is easy to understand and use.
Here is the progress of the project:
- Batch Gradient Descent ✅
- Mini-Batch Gradient Descent ✅
- Stochastic Gradient Descent ✅
- SGD with Momentum
- SGD with weight decay
- SGD with Momentum, Nesterov, and weight decay
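The first three variants above differ only in how much data each update sees: batch gradient descent uses the full dataset, stochastic gradient descent a single sample, and mini-batch gradient descent something in between. Here is a minimal sketch of the mini-batch case for linear regression with a mean-squared-error loss; the function name and parameters are illustrative, not the repository's actual API. Setting `batch_size` to the dataset size recovers batch gradient descent, and setting it to 1 recovers SGD.

```python
import numpy as np

def minibatch_gd(X, y, lr=0.1, batch_size=16, epochs=100, seed=0):
    """Illustrative mini-batch gradient descent for linear regression (MSE loss)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        idx = rng.permutation(n)                  # shuffle once per epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            # Gradient of mean squared error on this mini-batch
            grad = 2 * Xb.T @ (Xb @ w - yb) / len(batch)
            w -= lr * grad                        # step against the gradient
    return w
```

For example, fitting noiseless data generated by `y = 3x` should recover a weight close to 3.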