Note: this repository was archived by the owner on June 7, 2020, and is now read-only.


GoRANN

Recurrent and Artificial Neural Networks

The vanishing gradient problem of automatic differentiation or backpropagation in neural networks was partially overcome in 1992 by an early generative model called the neural history compressor, implemented as an unsupervised stack of recurrent neural networks... (Wikipedia)

Overview

GoRANN is a from-scratch implementation of artificial and recurrent neural networks in Go, inspired in part by the Stanford Machine Learning course.

Keywords

Gradient descent, SGD with mini-batching, L1/L2 regularization, the Adagrad, Adadelta, RMSprop, and Adam optimization algorithms, hyperparameters and other tunables, activation functions, and input/output normalization.
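To make the first two keywords concrete, here is a minimal sketch of a single SGD weight update with L2 regularization. The function name and signature are illustrative only; gorann's actual API may differ.

```go
package main

import "fmt"

// sgdStepL2 performs one stochastic-gradient-descent update with L2
// regularization (weight decay). Illustrative helper, not gorann's API.
func sgdStepL2(weights, grad []float64, alpha, lambda float64) {
	for i := range weights {
		// L2 regularization adds lambda*w[i] to the loss gradient.
		weights[i] -= alpha * (grad[i] + lambda*weights[i])
	}
}

func main() {
	w := []float64{0.5, -0.3}
	g := []float64{0.1, 0.2}
	sgdStepL2(w, g, 0.01, 0.001) // alpha=0.01, lambda=0.001
	fmt.Println(w)
}
```

The adaptive optimizers listed above (Adagrad, Adadelta, RMSprop, Adam) replace the fixed learning rate `alpha` with a per-weight rate derived from running statistics of past gradients.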

Plus:

  • RNNs: naive, fully connected ("unrolled"), and partially connected ("limited") - tested here
  • Super-NN: weighted aggregation of multiple neural networks - described here
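The Super-NN idea, aggregating several networks' outputs by weight, can be sketched as a normalized weighted average. Names and the weighting scheme below are illustrative, not gorann's actual implementation.

```go
package main

import "fmt"

// aggregate combines the output vectors of several trained networks into
// one prediction via a normalized weighted average. A minimal sketch of
// the "Super-NN" idea; illustrative only.
func aggregate(outputs [][]float64, weights []float64) []float64 {
	var wsum float64
	for _, w := range weights {
		wsum += w
	}
	res := make([]float64, len(outputs[0]))
	for k, out := range outputs {
		for i, v := range out {
			res[i] += (weights[k] / wsum) * v
		}
	}
	return res
}

func main() {
	// Two networks disagree; the better-performing one gets weight 3.
	outs := [][]float64{{1, 0}, {0, 1}}
	fmt.Println(aggregate(outs, []float64{3, 1})) // → [0.75 0.25]
}
```

In practice the weights would be derived from each network's recent validation error, so better-performing networks dominate the aggregate.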

And also:

  • Natural Evolution Strategies (NES)
  • Parallel computing: SGD and NES (initial support)
  • And more

Install

go get github.com/hqr/gorann

Test and run

See Makefile for test, lint, command-line help, and other useful targets.
