
Optimization Algorithms

This repository contains implementations of various optimization algorithms that you might have encountered while using Keras or [PyTorch](https://pytorch.org/docs/stable/optim.html).

It is an attempt to write these algorithms from scratch in a way that is easy to understand and use.

Here is the progress of the project (a rough sketch of the update step follows the list):

  • Batch Gradient Descent ✅
  • Mini-Batch Gradient Descent ✅
  • Stochastic Gradient Descent ✅
  • SGD with Momentum
  • SGD with weight decay
  • SGD with Momentum, Nesterov, and weight decay
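
The variants above differ mainly in how each parameter update is computed. As a minimal sketch of the momentum and weight-decay update (not the code in this repository; the function name and defaults are illustrative assumptions):

```python
import numpy as np

def sgd_momentum_step(w, grad, v, lr=0.01, momentum=0.9, weight_decay=0.0):
    # Hypothetical helper, not part of this repository.
    # Weight decay (L2 regularization) adds weight_decay * w to the gradient.
    grad = grad + weight_decay * w
    # Momentum accumulates an exponentially decaying sum of past gradients.
    v = momentum * v - lr * grad
    # Move the parameters along the accumulated velocity.
    return w + v, v

# Example: one step on f(w) = sum(w^2), whose gradient is 2w.
w, v = np.array([1.0, -2.0]), np.zeros(2)
w, v = sgd_momentum_step(w, 2 * w, v, lr=0.1)
```

With `momentum=0.0` and `weight_decay=0.0` this reduces to plain (stochastic) gradient descent; the batch, mini-batch, and stochastic variants differ only in how much data is used to estimate `grad` per step.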
