Commit 9683a8b

author: Quinn Liu (committed)
1) new gradient descent notes 2) image of gradient descent in action reaching 2 different local minima
1 parent 0a8d848 commit 9683a8b

File tree

2 files changed: +1 −2 lines changed


supervisedLearning/README.md

+1 −2

@@ -13,5 +13,4 @@ supervisedLearning
 - math notation = minimize over theta_0, theta_1 the cost function J(theta_0, theta_1) also called the squared error function
 - WHOLE POINT explained using a picture: https://github.com/quinnliu/MachineLearning/blob/master/imagesForExplanation/CostFunctionExampleWithTheta_0AndTheta_1.jpg
 - Now plugging in the minimal theta_0 and theta_1 our function h(x) = theta_0 + theta_1 * x will predict h(x) = y by giving it an input x.
-
-- GRADIENT DESCENT = algorithm that lets us find a minimal theta_0 and theta_1
+- But how do we find the minimal theta_0 and theta_1?! GRADIENT DESCENT = algorithm that lets us find a minimal theta_0 and theta_1. It can also be used to minimize any arbitrary function J.
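The added line names gradient descent without showing the update rule. A minimal sketch of the idea in Python, minimizing the squared-error cost J(theta_0, theta_1) for h(x) = theta_0 + theta_1 * x from the surrounding notes; the data, learning rate, and iteration count here are illustrative assumptions, not from the commit:

```python
# Gradient descent for h(x) = theta_0 + theta_1 * x, minimizing the
# squared error cost J(theta_0, theta_1) = (1/2m) * sum((h(x_i) - y_i)^2).

def gradient_descent(xs, ys, alpha=0.01, iterations=5000):
    theta_0, theta_1 = 0.0, 0.0  # starting point (illustrative choice)
    m = len(xs)
    for _ in range(iterations):
        # prediction errors h(x_i) - y_i under the current parameters
        errors = [(theta_0 + theta_1 * x) - y for x, y in zip(xs, ys)]
        # partial derivatives of J with respect to theta_0 and theta_1
        grad_0 = sum(errors) / m
        grad_1 = sum(e * x for e, x in zip(errors, xs)) / m
        # simultaneous update: step downhill on both parameters at once
        theta_0 -= alpha * grad_0
        theta_1 -= alpha * grad_1
    return theta_0, theta_1

# toy data lying exactly on y = 1 + 2x, so the minimum is theta_0=1, theta_1=2
t0, t1 = gradient_descent([0, 1, 2, 3], [1, 3, 5, 7])
```

For this convex squared-error J there is a single global minimum; the commit's image of two different local minima corresponds to running the same update rule on a non-convex J, where the starting point decides which minimum is reached.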

0 commit comments
