
This document develops the first principles required to build an L-deep neural network by implementing functions that perform forward propagation, backward propagation, gradient descent, and parameter updates.


escalante-cr/Building_L_deep_NN_from_scratch


This is part of DeepLearning.AI's Deep Learning Specialization. Use the document to review the procedures for setting up features, weights, and biases, and for performing forward propagation, backward propagation, gradient descent, and parameter updates. The network uses ReLU activations for the first L-1 layers and a sigmoid activation for the output layer. This example assumes a single output neuron.
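The procedures described above can be sketched end to end in NumPy. This is a minimal illustration under the stated assumptions (ReLU hidden layers, one sigmoid output neuron, binary cross-entropy cost); the function names (`init_params`, `forward`, `backward`, `update`) are illustrative conventions, not the repository's actual code.

```python
import numpy as np

def relu(Z):
    return np.maximum(0, Z)

def sigmoid(Z):
    return 1.0 / (1.0 + np.exp(-Z))

def init_params(layer_dims, seed=1):
    """Small random weights and zero biases; W[l] has shape (n_l, n_{l-1})."""
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        params[f"W{l}"] = rng.standard_normal((layer_dims[l], layer_dims[l - 1])) * 0.01
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params

def forward(X, params):
    """ReLU for the first L-1 layers, sigmoid for the single output neuron."""
    L = len(params) // 2
    A, caches = X, []
    for l in range(1, L + 1):
        Z = params[f"W{l}"] @ A + params[f"b{l}"]
        caches.append((A, Z))  # cache layer inputs for the backward pass
        A = sigmoid(Z) if l == L else relu(Z)
    return A, caches

def compute_cost(AL, Y):
    """Binary cross-entropy averaged over the m examples."""
    m = Y.shape[1]
    return float(-np.sum(Y * np.log(AL) + (1 - Y) * np.log(1 - AL)) / m)

def backward(AL, Y, caches, params):
    """Propagate gradients from the sigmoid output back through the ReLU layers."""
    grads, m, L = {}, Y.shape[1], len(caches)
    dZ = AL - Y  # sigmoid + cross-entropy simplifies to AL - Y
    for l in range(L, 0, -1):
        A_prev, _ = caches[l - 1]
        grads[f"dW{l}"] = dZ @ A_prev.T / m
        grads[f"db{l}"] = np.sum(dZ, axis=1, keepdims=True) / m
        if l > 1:
            Z_prev = caches[l - 2][1]
            dZ = (params[f"W{l}"].T @ dZ) * (Z_prev > 0)  # ReLU derivative
    return grads

def update(params, grads, lr=0.1):
    """One gradient-descent step on every weight and bias."""
    for l in range(1, len(params) // 2 + 1):
        params[f"W{l}"] -= lr * grads[f"dW{l}"]
        params[f"b{l}"] -= lr * grads[f"db{l}"]
    return params
```

A training loop then just repeats forward pass, cost, backward pass, and update until the cost converges.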
