deepmat

WARNING: this is not my main code, and there is no warranty attached!

= Generative Stochastic Network =

  • A simple implementation of Generative Stochastic Networks (GSN), following Bengio et al. (2013)
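A trained GSN defines a Markov chain that alternates corruption and learned denoising; samples are drawn by running this chain. Below is a minimal illustrative sketch in Python (the repository itself is Matlab); `denoise` and `corrupt` are hypothetical stand-ins for the trained network and the noise process, not functions from this repo.

```python
import numpy as np

def gsn_chain(x0, denoise, corrupt, steps=5):
    """Run the sampling chain a trained GSN defines: repeatedly
    corrupt the current state, then map it back through the learned
    denoiser. `denoise` stands in for the trained network."""
    xs = [x0]
    for _ in range(steps):
        xs.append(denoise(corrupt(xs[-1])))
    return xs
```

The returned list is the whole trajectory; in practice only states after a burn-in period are kept as samples.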

= Convolutional Neural Network =

  • A naive implementation in pure Matlab
  • Pooling: max (Jonathan Masci's code) and average
  • Not for serious use!
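The two pooling modes listed above (max and average) reduce each non-overlapping block of a feature map to a single value. This is an illustrative Python/NumPy sketch of that operation, not code from the repository; `pool2d` is a hypothetical name.

```python
import numpy as np

def pool2d(x, size=2, mode="max"):
    """Non-overlapping 2-D pooling over an (H, W) feature map."""
    h, w = x.shape
    # trim so the map divides evenly into size x size blocks
    x = x[:h - h % size, :w - w % size]
    # view the map as a grid of size x size blocks, then reduce each block
    blocks = x.reshape(x.shape[0] // size, size, x.shape[1] // size, size)
    if mode == "max":
        return blocks.max(axis=(1, 3))
    return blocks.mean(axis=(1, 3))
```

Max pooling keeps the strongest activation per block; average pooling keeps the mean response.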

= Restricted Boltzmann Machine & Deep Belief Networks =

  • Binary/Gaussian Visible Units + Binary Hidden Units
  • Enhanced Gradient, Adaptive Learning Rate
  • Adadelta for RBM
  • Contrastive Divergence
  • (Fast) Persistent Contrastive Divergence
  • Parallel Tempering
  • DBN: Up-down Learning Algorithm
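Contrastive Divergence, listed above, approximates the RBM log-likelihood gradient by comparing statistics on the data with statistics after a short Gibbs chain. The sketch below shows one CD-1 step for a binary-binary RBM in Python; it is an illustrative reimplementation under standard conventions, not the Matlab code in this repo, and `cd1_update` is a hypothetical name.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cd1_update(v0, W, b, c, lr=0.1):
    """One CD-1 gradient step for a binary-binary RBM.
    v0: (n_samples, n_visible) batch; W: (n_visible, n_hidden)."""
    # positive phase: hidden probabilities and a sample given the data
    ph0 = sigmoid(v0 @ W + c)                      # P(h=1 | v0)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # negative phase: one Gibbs step back to the visibles and up again
    pv1 = sigmoid(h0 @ W.T + b)                    # P(v=1 | h0)
    ph1 = sigmoid(pv1 @ W + c)
    # gradient estimate: data statistics minus model statistics
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / n
    b += lr * (v0 - pv1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c
```

Persistent CD differs only in that the negative chain is continued across minibatches instead of being restarted at the data.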

= Deep Boltzmann Machine =

  • Binary/Gaussian Visible Units + Binary Hidden Units
  • (Persistent) Contrastive Divergence
  • Enhanced Gradient, Adaptive Learning Rate
  • Two-stage Pretraining Algorithm (example)
  • Centering Trick (fixed center variables only)
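The centering trick mentioned above subtracts fixed offsets (for example, mean activities) from the unit states inside the energy, which conditions the gradients better. An illustrative Python sketch of a centered CD-style weight gradient for one layer pair, under that standard formulation (not this repo's Matlab code; the function name is hypothetical):

```python
import numpy as np

def centered_gradients(v0, ph0, v1, ph1, beta, mu):
    """Centered gradients for one (visible, hidden) layer pair.
    (v0, ph0) are data-phase states, (v1, ph1) model-phase states.
    beta, mu are offsets (e.g. mean activities); in the 'fixed
    center' variant implemented here they are set once and never
    updated during training."""
    n = v0.shape[0]
    dW = ((v0 - beta).T @ (ph0 - mu) - (v1 - beta).T @ (ph1 - mu)) / n
    db = (v0 - v1).mean(axis=0)
    dc = (ph0 - ph1).mean(axis=0)
    return dW, db, dc
```

With zero offsets this reduces to the ordinary (uncentered) gradient.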

= Denoising Autoencoder (Tied Weights) =

  • Binary/Gaussian Visible Units + Binary(Sigmoid)/Gaussian Hidden Units
  • tanh/sigm/relu nonlinearities
  • Shallow: sparsity, contractive, soft-sparsity (log-cosh) regularization
  • Deep: stochastic backprop
  • Adagrad, Adadelta
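A denoising autoencoder with tied weights corrupts the input, reconstructs it, and shares one weight matrix between encoder and decoder, so the weight gradient sums contributions from both paths. The sketch below shows one SGD step for the sigmoid/binary case with masking noise; it is an illustrative Python version under these assumptions, not the repo's Matlab code, and `dae_step` is a hypothetical name.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dae_step(x, W, b, c, lr=0.1, noise=0.3):
    """One SGD step on a sigmoid denoising autoencoder with tied
    weights: the decoder reuses W.T, so only W, b, c are learned."""
    x_tilde = x * (rng.random(x.shape) >= noise)   # masking noise
    h = sigmoid(x_tilde @ W + c)                   # encoder
    y = sigmoid(h @ W.T + b)                       # tied-weight decoder
    # cross-entropy loss with sigmoid outputs: pre-activation delta = y - x
    dy = (y - x) / x.shape[0]
    dh = (dy @ W) * h * (1.0 - h)                  # backprop into encoder
    dW = x_tilde.T @ dh + dy.T @ h                 # tied weights: sum both paths
    W -= lr * dW
    b -= lr * dy.sum(axis=0)
    c -= lr * dh.sum(axis=0)
    return W, b, c
```

For Gaussian visible units the output nonlinearity is dropped and the loss becomes squared error; the tied-weight gradient keeps the same two-term structure.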

= Multi-layer Perceptron =

  • Stochastic Backpropagation, Dropout
  • tanh/sigm/relu nonlinearities
  • Adagrad, Adadelta
  • Balanced minibatches using crossvalind()
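Adadelta, listed for the autoencoder and MLP trainers above, scales each update from running averages of squared gradients and squared updates, so no global learning rate needs tuning. An illustrative Python sketch of the standard Adadelta rule (Zeiler, 2012), not taken from this repo:

```python
import numpy as np

def adadelta_update(param, grad, state, rho=0.95, eps=1e-6):
    """One Adadelta step: the step size adapts per parameter from
    running averages; `state` holds (E[g^2], E[dx^2])."""
    Eg2, Edx2 = state
    Eg2 = rho * Eg2 + (1 - rho) * grad**2          # accumulate gradient
    dx = -np.sqrt(Edx2 + eps) / np.sqrt(Eg2 + eps) * grad
    Edx2 = rho * Edx2 + (1 - rho) * dx**2          # accumulate update
    return param + dx, (Eg2, Edx2)
```

Because both accumulators start at zero, early steps are tiny (on the order of sqrt(eps)) and grow as statistics build up.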

= About =

  • Matlab code for Restricted/Deep Boltzmann Machines and Autoencoders
