Update notes
Jonas1312 committed Jul 8, 2024
1 parent ec068b4 commit 0a99539
Showing 3 changed files with 8 additions and 5 deletions.
2 changes: 2 additions & 0 deletions base/science-tech-maths/machine-learning/machine-learning.md
@@ -533,6 +533,8 @@ Many papers use the term long-tail learning to refer to class imbalance in multi

## Gradient descent

The gradient measures the direction and the fastest rate of increase of a function $f$ at a given point.

**Gradient always points in the direction of steepest ascent.**

$$w^{(t+1)}=w^{(t)} - \eta\frac{\partial L}{\partial w}$$
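The update rule above can be sketched on a 1-D quadratic loss (the loss function, learning rate, and step count below are illustrative choices, not from the notes):

```python
# Gradient descent sketch on L(w) = (w - 3)^2, whose minimizer is w = 3.

def grad_L(w):
    # dL/dw for L(w) = (w - 3)^2
    return 2 * (w - 3)

w = 0.0      # initial guess
eta = 0.1    # learning rate (eta in the update rule)
for _ in range(100):
    w = w - eta * grad_L(w)  # w_{t+1} = w_t - eta * dL/dw

print(round(w, 4))  # converges toward the minimizer w = 3
```

Each step moves against the gradient, i.e. in the direction of steepest descent.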
6 changes: 3 additions & 3 deletions base/science-tech-maths/machine-learning/metrics/metrics.md
@@ -15,8 +15,8 @@

## Precision-Recall curve vs ROC curve

-- Precision: how often the classifier is correct when it predicts positive $PRE = \frac{TP}{TP+FP}$
-- Recall: how often the classifier is correct for all positive instances
+- Precision: how often the classifier is correct when it predicts positive $PRE = \frac{TP}{TP+FP}$ "Of all the apples I picked from the basket, how many are actually good?"
+- Recall: how often the classifier is correct for all positive instances $REC = \frac{TP}{TP+FN}$ "Of all the good apples available, how many did I actually pick?"
- [The Precision-Recall Plot Is More Informative than the ROC Plot When Evaluating Binary Classifiers on Imbalanced Datasets](https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0118432)
- Indeed, ROC is useful when evaluating general-purpose classification, while AUPRC is the superior method when classifying rare events.
- <https://towardsdatascience.com/why-you-should-stop-using-the-roc-curve-a46a9adc728>
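The precision and recall formulas above can be computed directly from counts (the toy labels below are made up for illustration):

```python
# Precision and recall from raw counts, matching
# PRE = TP / (TP + FP) and REC = TP / (TP + FN).

y_true = [1, 1, 1, 0, 0, 1]  # ground truth (toy example)
y_pred = [1, 0, 1, 1, 0, 1]  # classifier output (toy example)

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))

precision = tp / (tp + fp)  # of everything predicted positive, the fraction that is right
recall = tp / (tp + fn)     # of everything actually positive, the fraction that was found

print(precision, recall)
```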
@@ -30,7 +30,7 @@ Different threshold values will give different precision and recall values, and

The AP is the area under the precision-recall curve.

-In object detection, the mAP is the mean of the average precision (AP) accross all classes.
+In object detection, the mAP is the mean of the average precision (AP) across all classes.
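A sketch of AP as the step-wise area under the precision-recall curve, $AP = \sum_n (R_n - R_{n-1}) P_n$ (the scores and labels below are toy assumptions); mAP would then be the mean of such per-class APs:

```python
# Average precision (AP) via the step-wise sum over the ranked predictions.

def average_precision(y_true, scores):
    order = sorted(range(len(scores)), key=lambda i: -scores[i])  # rank by score, descending
    total_pos = sum(y_true)
    tp = fp = 0
    ap = 0.0
    prev_recall = 0.0
    for i in order:
        if y_true[i] == 1:
            tp += 1
        else:
            fp += 1
        precision = tp / (tp + fp)
        recall = tp / total_pos
        ap += (recall - prev_recall) * precision  # width of recall step * height
        prev_recall = recall
    return ap

print(average_precision([1, 0, 1, 1], [0.9, 0.8, 0.7, 0.6]))
```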

## MCC

5 changes: 3 additions & 2 deletions base/science-tech-maths/maths/maths/maths.md
@@ -43,12 +43,13 @@ The second largest eigenvector is always orthogonal to the largest eigenvector,
If all eigenvalues of A are:

- positive: the matrix is positive definite.
- positive or zero: positive semi-definite.
- negative: the matrix is negative definite.
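The eigenvalue test above can be sketched with NumPy (this assumes a symmetric matrix so `eigvalsh` applies; the example matrix is arbitrary):

```python
# Classify a symmetric matrix by the signs of its eigenvalues.
import numpy as np

A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])

eigenvalues = np.linalg.eigvalsh(A)  # eigenvalues of a symmetric matrix, ascending
if np.all(eigenvalues > 0):
    verdict = "positive definite"
elif np.all(eigenvalues >= 0):
    verdict = "positive semi-definite"
elif np.all(eigenvalues < 0):
    verdict = "negative definite"
else:
    verdict = "indefinite"

print(eigenvalues, verdict)
```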

## Derivatives

-- the gradient of $f$ is the vector containing all of the partial derivatives
+- The gradient of $f$ is the vector containing all of the partial derivatives.
- The Jacobian is the matrix of all first-order partial derivatives of a vector-valued function (a function that takes a vector as an input and returns a vector as an output).
- The Hessian is the Jacobian of the gradient; it is symmetric (when the second partial derivatives are continuous).
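As a sketch of "vector of partial derivatives", the gradient of a scalar function can be approximated with central finite differences (the test function and step size below are illustrative):

```python
# Central finite differences for the gradient of f: R^n -> R.

def gradient(f, x, h=1e-5):
    g = []
    for i in range(len(x)):
        x_plus = list(x)
        x_plus[i] += h
        x_minus = list(x)
        x_minus[i] -= h
        # central difference for the i-th partial derivative
        g.append((f(x_plus) - f(x_minus)) / (2 * h))
    return g

f = lambda x: x[0] ** 2 + 3 * x[1]  # df/dx0 = 2*x0, df/dx1 = 3
print(gradient(f, [1.0, 2.0]))      # ≈ [2.0, 3.0]
```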

### Finite differences
