
MNIST dataset classification without libraries

MNIST is a large database of handwritten digits. In this project the MNIST dataset is classified with a neural network, and no special-purpose libraries such as Theano or scikit-learn are used; instead the 'scipy.optimize.fmin_cg' function is used to minimize the cost function of the algorithm. scipy.optimize.fmin_cg takes two main inputs:
1. A function that returns the cost of the neural network at the given weights, and
2. A function that returns the gradient of the cost with respect to the weights. The backpropagation algorithm is used to calculate this gradient.
With special-purpose libraries such as Theano or scikit-learn, a function to calculate the gradient of the weights does not have to be given explicitly; here it is implemented by hand.
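
Below is a minimal, self-contained sketch (not the repository's code) of this calling pattern: one callable returns the scalar cost, a second returns its gradient, and fmin_cg ties them together. A toy quadratic stands in for the network's cost; in this project those two roles are played by CostFunction.py and WeightsGradient.py.

import numpy as np
from scipy.optimize import fmin_cg

def cost(w):
    # Toy cost ||w - 3||^2, standing in for the network's quadratic cost
    return np.sum((w - 3.0) ** 2)

def gradient(w):
    # Analytic gradient of the toy cost, standing in for backpropagation
    return 2.0 * (w - 3.0)

w0 = np.zeros(5)  # initial flattened weights
w_opt = fmin_cg(cost, w0, fprime=gradient, maxiter=150)
print(w_opt)  # converges to [3. 3. 3. 3. 3.]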
The neural network contains 4 layers:
1. Input layer: 784 neurons (the 28x28 image pixels)
2. Hidden layer 1: 300 units
3. Hidden layer 2: 300 units
4. Output layer: 10 neurons (one per digit class)
A quadratic cost function and sigmoid activation functions are used; the cost is minimized with the conjugate-gradient routine 'scipy.optimize.fmin_cg' described above.
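
As an illustration (a sketch under stated assumptions, not the repository's exact code), the forward pass and quadratic cost for this 784-300-300-10 architecture might look as follows, with a bias unit prepended to each layer and one-hot labels Y:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_cost(W1, W2, W3, X, Y):
    # X: (m, 784) images, Y: (m, 10) one-hot labels
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])        # input + bias, (m, 785)
    a2 = sigmoid(a1 @ W1.T)                     # hidden layer 1, (m, 300)
    a2 = np.hstack([np.ones((m, 1)), a2])
    a3 = sigmoid(a2 @ W2.T)                     # hidden layer 2, (m, 300)
    a3 = np.hstack([np.ones((m, 1)), a3])
    h = sigmoid(a3 @ W3.T)                      # output layer, (m, 10)
    return np.sum((h - Y) ** 2) / (2.0 * m)     # quadratic cost

# Under these assumptions: W1 is (300, 785), W2 is (300, 301), W3 is (10, 301)
rng = np.random.default_rng(0)
X = rng.random((4, 784))
Y = np.eye(10)[rng.integers(0, 10, 4)]
W1 = rng.normal(0, 0.01, (300, 785))
W2 = rng.normal(0, 0.01, (300, 301))
W3 = rng.normal(0, 0.01, (10, 301))
print(forward_cost(W1, W2, W3, X, Y))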

About files in this project

ExtractingData.py: Extracts training images and labels from the 'train-images-idx3-ubyte' and 'train-labels-idx1-ubyte' files respectively (see the IDX-parsing sketch after this list)
ExtractingTestData.py: Extracts test images and labels from the 't10k-images-idx3-ubyte' and 't10k-labels-idx1-ubyte' files respectively
CostFunction.py: Returns the cost of the neural network for the given weights
WeightsGradient.py: Returns the gradient of the cost with respect to the given weights
sigmoid.py: Returns the sigmoid activation function
sigmoidGradient.py: Returns the gradient of the sigmoid activation function
RandomInitializeWeights.py: Randomly initializes weights of a given size
ComputeNumericalGradient.py: Returns the numerical gradient of the weights, computed from the difference quotient dy/dx ≈ (y2 - y1)/(x2 - x1)
DebugInitialWeights.py: Returns randomly initialized weights of a given size; the weights generated by this program, as opposed to those generated by RandomInitializeWeights.py, are used to check whether the gradient returned by WeightsGradient.py is close to the gradient returned by ComputeNumericalGradient.py
CompareNeuralNetGradientWithNumericalGradient.py: Checks whether the gradient returned by WeightsGradient.py is close to the gradient returned by ComputeNumericalGradient.py (see the gradient-check sketch after this list)
MNISTTraining.py: Learns weights that minimize the cost function of the neural network, improving accuracy with each iteration; 'scipy.optimize.fmin_cg' is used to minimize the cost function
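
The MNIST files listed above use the IDX binary format: a big-endian integer header followed by raw unsigned bytes. A minimal sketch of the kind of parsing ExtractingData.py and ExtractingTestData.py perform (the helper names here are illustrative, not the repository's):

import struct
import numpy as np

def load_idx_images(path):
    with open(path, 'rb') as f:
        magic, n, rows, cols = struct.unpack('>IIII', f.read(16))  # big-endian header
        return np.frombuffer(f.read(), dtype=np.uint8).reshape(n, rows * cols)

def load_idx_labels(path):
    with open(path, 'rb') as f:
        magic, n = struct.unpack('>II', f.read(8))
        return np.frombuffer(f.read(), dtype=np.uint8)

And a minimal sketch of the gradient check that CompareNeuralNetGradientWithNumericalGradient.py performs, demonstrated here on a toy cost whose analytic gradient is known (the centered difference is one common form of the difference quotient above):

def numerical_gradient(cost, w, eps=1e-4):
    # Perturb one weight at a time and apply the difference quotient
    # as a centered difference: (cost(w+e) - cost(w-e)) / (2*eps)
    grad = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w)
        e[i] = eps
        grad[i] = (cost(w + e) - cost(w - e)) / (2.0 * eps)
    return grad

toy_cost = lambda w: np.sum(w ** 3)  # analytic gradient: 3 * w**2
w = np.array([1.0, -2.0, 0.5])
print(np.max(np.abs(numerical_gradient(toy_cost, w) - 3.0 * w ** 2)))  # ~1e-8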

Sample code to run the program

import MNISTTraining

Accuracy

An accuracy of 93.2% was attained after running the algorithm for 150 iterations. The weights with which this accuracy was achieved are stored in weights.npy.
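
A hedged sketch of how the stored weights could be reused for prediction, assuming weights.npy holds one flattened vector laid out as W1, W2, W3 with the shapes given earlier (the repository's actual layout may differ):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(weights, X):
    # Split the flat vector into the three layer matrices (assumed layout)
    s1, s2 = 300 * 785, 300 * 301
    W1 = weights[:s1].reshape(300, 785)
    W2 = weights[s1:s1 + s2].reshape(300, 301)
    W3 = weights[s1 + s2:].reshape(10, 301)
    a = X  # (m, 784) test images
    for W in (W1, W2, W3):
        a = np.hstack([np.ones((a.shape[0], 1)), a])  # prepend bias unit
        a = sigmoid(a @ W.T)
    return np.argmax(a, axis=1)  # predicted digit for each image

weights = np.load('weights.npy')
# accuracy = np.mean(predict(weights, X_test) == y_test)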
