
Adding SGD, Adam and RMSProp optimizers #42

Merged: 2 commits into master on Aug 10, 2017

Conversation

@pavanky (Member) commented Aug 9, 2017

@pavanky pavanky added this to the 0.1 milestone Aug 9, 2017
@pavanky pavanky requested a review from umar456 August 9, 2017 08:41
- Moved zeroGrad to be part of the Optimizer class
- Renamed perceptron.cpp to xor.cpp
- Modified the xor example to run with the SGD, Adam, or RMSProp optimizer (see the sketch below)
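
For context, here is a minimal sketch of the structure these changes describe: zeroGrad living on an Optimizer base class, with SGD, Adam, and RMSProp sharing one update interface. This is not the actual arrayfire-ml code; the `Param` struct, constructor signatures, and hyperparameter defaults are illustrative assumptions.

```cpp
// Minimal sketch, not the arrayfire-ml implementation.
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <utility>
#include <vector>

struct Param {
    std::vector<float> data;  // parameter values
    std::vector<float> grad;  // accumulated gradient, same length as data
};

class Optimizer {
  public:
    explicit Optimizer(std::vector<Param*> params) : params_(std::move(params)) {}
    virtual ~Optimizer() = default;

    // Per this PR, zeroGrad is part of the Optimizer class: clear every
    // parameter's gradient before the next backward pass.
    void zeroGrad() {
        for (Param* p : params_) std::fill(p->grad.begin(), p->grad.end(), 0.0f);
    }

    virtual void update() = 0;  // apply one optimization step

  protected:
    std::vector<Param*> params_;
};

// SGD: theta <- theta - lr * g
class SGD : public Optimizer {
  public:
    SGD(std::vector<Param*> params, float lr)
        : Optimizer(std::move(params)), lr_(lr) {}
    void update() override {
        for (Param* p : params_)
            for (std::size_t i = 0; i < p->data.size(); ++i)
                p->data[i] -= lr_ * p->grad[i];
    }
  private:
    float lr_;
};

// RMSProp: scale the step by a running RMS of past gradients.
class RMSProp : public Optimizer {
  public:
    RMSProp(std::vector<Param*> params, float lr, float rho = 0.9f, float eps = 1e-8f)
        : Optimizer(std::move(params)), lr_(lr), rho_(rho), eps_(eps) {
        for (Param* p : params_) sq_.emplace_back(p->data.size(), 0.0f);
    }
    void update() override {
        for (std::size_t j = 0; j < params_.size(); ++j)
            for (std::size_t i = 0; i < params_[j]->data.size(); ++i) {
                float g   = params_[j]->grad[i];
                sq_[j][i] = rho_ * sq_[j][i] + (1.0f - rho_) * g * g;
                params_[j]->data[i] -= lr_ * g / (std::sqrt(sq_[j][i]) + eps_);
            }
    }
  private:
    float lr_, rho_, eps_;
    std::vector<std::vector<float>> sq_;  // running mean of squared gradients
};

// Adam: bias-corrected first and second moment estimates.
class Adam : public Optimizer {
  public:
    Adam(std::vector<Param*> params, float lr,
         float b1 = 0.9f, float b2 = 0.999f, float eps = 1e-8f)
        : Optimizer(std::move(params)), lr_(lr), b1_(b1), b2_(b2), eps_(eps) {
        for (Param* p : params_) {
            m_.emplace_back(p->data.size(), 0.0f);
            v_.emplace_back(p->data.size(), 0.0f);
        }
    }
    void update() override {
        ++t_;
        for (std::size_t j = 0; j < params_.size(); ++j)
            for (std::size_t i = 0; i < params_[j]->data.size(); ++i) {
                float g  = params_[j]->grad[i];
                m_[j][i] = b1_ * m_[j][i] + (1.0f - b1_) * g;
                v_[j][i] = b2_ * v_[j][i] + (1.0f - b2_) * g * g;
                float mhat = m_[j][i] / (1.0f - std::pow(b1_, t_));
                float vhat = v_[j][i] / (1.0f - std::pow(b2_, t_));
                params_[j]->data[i] -= lr_ * mhat / (std::sqrt(vhat) + eps_);
            }
    }
  private:
    float lr_, b1_, b2_, eps_;
    int t_ = 0;  // step count for bias correction
    std::vector<std::vector<float>> m_, v_;  // first/second moment estimates
};
```

Presumably this is why the xor example can switch optimizers: its training loop would only call zeroGrad(), run the forward/backward pass, and call update(), so swapping SGD for Adam or RMSProp becomes a one-line change at construction time.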
@umar456 umar456 merged commit 7320d86 into master Aug 10, 2017