
Machine-Learning-Neural-Networks

Coursera Machine Learning

Stanford University

Implement the backpropagation algorithm for neural networks and apply it to the task of handwritten digit recognition.

The development process consists of the following steps:

1. Random weight initialization
2. Forward propagation
3. Cost function computation
4. Back propagation to compute the partial derivatives
5. Gradient checking: compare the back-propagation partial derivatives against a numerical estimate of the cost function gradient
6. Disable the gradient-checking code
7. Gradient descent with back propagation to minimize the cost function
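The assignment itself is written in Octave/MATLAB; purely as an illustration of steps (1)–(4) and (7), here is a minimal Python/NumPy sketch of a one-hidden-layer sigmoid network (the layer sizes, `epsilon` range, and function names are illustrative choices, not the course's code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_weights(n_in, n_out, epsilon=0.12):
    # Step (1): small random weights break symmetry between hidden units.
    return np.random.uniform(-epsilon, epsilon, (n_out, n_in + 1))

def forward(Theta1, Theta2, X):
    # Step (2): forward propagation, prepending a bias unit at each layer.
    a1 = np.hstack([np.ones((X.shape[0], 1)), X])
    z2 = a1 @ Theta1.T
    a2 = np.hstack([np.ones((z2.shape[0], 1)), sigmoid(z2)])
    a3 = sigmoid(a2 @ Theta2.T)          # output layer activations
    return a1, z2, a2, a3

def cost(Theta1, Theta2, X, Y):
    # Step (3): unregularized cross-entropy cost over all outputs/examples.
    m = X.shape[0]
    _, _, _, h = forward(Theta1, Theta2, X)
    return -np.sum(Y * np.log(h) + (1 - Y) * np.log(1 - h)) / m

def backprop(Theta1, Theta2, X, Y):
    # Step (4): propagate the output error back to get partial derivatives.
    m = X.shape[0]
    a1, z2, a2, a3 = forward(Theta1, Theta2, X)
    d3 = a3 - Y                                   # output-layer error
    s2 = sigmoid(z2)
    d2 = (d3 @ Theta2[:, 1:]) * s2 * (1 - s2)     # hidden-layer error
    Theta1_grad = d2.T @ a1 / m
    Theta2_grad = d3.T @ a2 / m
    return Theta1_grad, Theta2_grad
```

Step (7) is then a simple update such as `Theta1 -= alpha * Theta1_grad` repeated until the cost converges (the full assignment instead hands the cost and gradients to an optimizer such as `fmincg`).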

Octave/MATLAB is used as the development environment.
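Step (5), gradient checking, can be sketched independently of the network: perturb each parameter by a small epsilon in both directions and approximate the derivative with a two-sided difference, then compare against the back-propagation gradients. The helper below is an illustrative Python/NumPy version (its name and signature are assumptions, not the course's `computeNumericalGradient.m`):

```python
import numpy as np

def numerical_gradient(J, theta, eps=1e-4):
    # Step (5): two-sided finite-difference estimate of dJ/dtheta,
    # (J(theta + eps) - J(theta - eps)) / (2 * eps) per parameter.
    grad = np.zeros_like(theta)
    it = np.nditer(theta, flags=['multi_index'])
    for _ in it:
        idx = it.multi_index
        orig = theta[idx]
        theta[idx] = orig + eps
        j_plus = J(theta)
        theta[idx] = orig - eps
        j_minus = J(theta)
        theta[idx] = orig                 # restore the parameter
        grad[idx] = (j_plus - j_minus) / (2 * eps)
    return grad
```

Because this loops over every parameter and evaluates the full cost twice per parameter, it is far too slow for training, which is why step (6) disables it once the analytic gradients agree with the numerical estimate.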