# Neural Network Experiments

A collection of code and notebooks implementing various types of neural networks in NumPy. As expected, the code is not optimized and cannot take advantage of a GPU!

## Fundamentals of Neural Networks

1. Softmax Classification on MNIST
2. Stochastic, Mini-Batch, and Batch Gradient Descent, and Dataloaders
3. Optimizers (Momentum, RMSProp, Adam)
4. Regularization (L1, L2, Dropout, BatchNorm)
   - Code:
   - Notebook:
5. Convolutional Neural Networks (CNN) on MNIST
   - Code:
   - Notebook:
6. Recurrent Neural Networks (RNN) on MNIST
   - Code:
   - Notebook:
7. Generative Adversarial Networks (GANs) on MNIST
   - Code:
   - Notebook:
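To give a flavor of the fundamentals, here is a minimal NumPy softmax classifier trained with plain batch gradient descent. The data is random noise standing in for flattened 28x28 MNIST digits, and all names and hyperparameters are illustrative; this is a sketch of the technique, not code from this repository.

```python
import numpy as np

def softmax(z):
    # Shift by the row-wise max for numerical stability.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, y):
    # y holds integer class labels.
    n = y.shape[0]
    return -np.log(probs[np.arange(n), y] + 1e-12).mean()

# Random data standing in for flattened MNIST digits (784 features, 10 classes).
rng = np.random.default_rng(0)
X = rng.standard_normal((32, 784))
y = rng.integers(0, 10, size=32)

W = np.zeros((784, 10))
b = np.zeros(10)
lr = 0.05

for step in range(200):
    probs = softmax(X @ W + b)
    # Gradient of the mean cross-entropy w.r.t. the logits is (probs - one_hot(y)) / n.
    grad = probs.copy()
    grad[np.arange(32), y] -= 1.0
    grad /= 32
    W -= lr * (X.T @ grad)
    b -= lr * grad.sum(axis=0)
```

Swapping the full-batch update above for per-sample or mini-batch updates gives the stochastic and mini-batch variants covered in item 2.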

## Building Your Own Deep Learning Framework

1. Sequential Layers
   - Code:
   - Notebook:
2. Autograd
   - Code:
   - Notebook:
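The autograd idea can be sketched with a tiny scalar reverse-mode engine in the micrograd style. The `Value` class and its methods below are illustrative assumptions about the general technique, not this repository's actual API.

```python
class Value:
    """A scalar node in a computation graph, carrying data and a gradient."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # Addition routes the upstream gradient to both operands.
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # Product rule: each operand receives the other's value times the upstream gradient.
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then propagate gradients in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

# z = x*y + x, so dz/dx = y + 1 and dz/dy = x.
x, y = Value(3.0), Value(4.0)
z = x * y + x
z.backward()
```

A tensor-valued version of the same pattern, plus a layer abstraction, is essentially what the Sequential Layers and Autograd items build up.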

## Applications of Neural Networks

1. Linear and CNN Autoencoder on MNIST
   - Code:
   - Notebook:
2. Sentiment Classification and Word Embedding on the IMDB Movie Review Dataset
   - Code:
   - Notebook:
3. Character-Level RNN
   - Code:
   - Notebook:
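For the character-level RNN, the forward pass of a vanilla RNN cell over a string can be sketched as follows. The toy vocabulary and randomly initialized weights are assumptions for illustration; none of the names come from this repository.

```python
import numpy as np

rng = np.random.default_rng(1)
vocab = sorted(set("hello world"))
V, H = len(vocab), 16
char_to_ix = {c: i for i, c in enumerate(vocab)}

# Input-to-hidden, hidden-to-hidden, and hidden-to-output weights.
Wxh = rng.standard_normal((H, V)) * 0.01
Whh = rng.standard_normal((H, H)) * 0.01
Why = rng.standard_normal((V, H)) * 0.01
bh, by = np.zeros(H), np.zeros(V)

def one_hot(i):
    v = np.zeros(V)
    v[i] = 1.0
    return v

h = np.zeros(H)
logits = []
for ch in "hello":
    # Vanilla RNN step: h_t = tanh(Wxh x_t + Whh h_{t-1} + bh).
    h = np.tanh(Wxh @ one_hot(char_to_ix[ch]) + Whh @ h + bh)
    logits.append(Why @ h + by)
# Each logits[t] scores the next character; training would apply softmax
# cross-entropy against the shifted sequence and backpropagation through time.
```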

## Classical Machine Learning

1. Principal Component Analysis (PCA)
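PCA in NumPy is short enough to sketch in full. The SVD-based version below is one standard formulation and not necessarily this repository's implementation.

```python
import numpy as np

def pca(X, k):
    # Center the data; principal directions are the top right-singular vectors.
    Xc = X - X.mean(axis=0)
    # SVD of the centered data avoids explicitly forming the covariance matrix.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]            # principal directions, shape (k, d)
    projected = Xc @ components.T  # coordinates in the PCA basis, shape (n, k)
    return projected, components

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
Z, comps = pca(X, 2)
```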