update entropy chapter
Tobias-Brock committed Nov 10, 2023
1 parent 0546ee7 commit f008138
Showing 10 changed files with 27 additions and 12 deletions.
Binary file modified: .DS_Store (not shown)
Binary file modified: content/.DS_Store (not shown)
Binary file modified: content/chapters/.DS_Store (not shown)
content/chapters/13_information_theory/13-02-entropy.md: 15 additions, 0 deletions
@@ -0,0 +1,15 @@
---
title: "Chapter 13.02: Entropy"
weight: 13002
---
We continue our discussion about entropy and introduce joint entropy, the uniqueness theorem and the maximum entropy principle.

<!--more-->

### Lecture video

{{< video id="" >}}

### Lecture slides

{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-info-entropy2.pdf" >}}
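For orientation, the quantities this new chapter page refers to can be summarized as follows (standard definitions in generic notation, not taken from the linked slides): for discrete random variables $X$ and $Y$ with pmf $p$ on a finite set $\mathcal{X}$,

$$
H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x), \qquad
H(X, Y) = -\sum_{x, y} p(x, y) \log p(x, y), \qquad
H(X) \le \log |\mathcal{X}|,
$$

where the last bound is attained exactly by the uniform distribution, which is the core of the maximum entropy principle in the unconstrained finite case.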
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 13.02: Differential Entropy"
-weight: 13002
+title: "Chapter 13.03: Differential Entropy"
+weight: 13003
 ---
 In this section, we extend the definition of entropy to the continuous case.

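For reference, the renamed chapter's subject, differential entropy, is the continuous analogue of the discrete sum above (standard definition, notation assumed here):

$$
h(X) = -\int p(x) \log p(x) \, dx .
$$

Unlike discrete entropy, $h(X)$ can be negative, one of the subtleties the continuous extension has to handle.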
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 13.03: Kullback-Leibler Divergence"
-weight: 13003
+title: "Chapter 13.04: Kullback-Leibler Divergence"
+weight: 13004
 ---
 The Kullback-Leibler divergence (KL) is an important quantity for measuring the difference between two probability distributions. We discuss different intuitions for KL and relate it to risk minimization and likelihood ratios.

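For reference, the discrete form of the divergence mentioned in this blurb (standard definition, notation assumed):

$$
D_{KL}(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)} = \mathbb{E}_{x \sim p}\!\left[\log \frac{p(x)}{q(x)}\right],
$$

i.e. the expected log-likelihood ratio under $p$, which is the reading that connects it to likelihood ratios and to risk minimization.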
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 13.04: Entropy and Optimal Code Length"
-weight: 13004
+title: "Chapter 13.05: Entropy and Optimal Code Length"
+weight: 13005
 ---
 In this section, we introduce source coding and discuss how entropy can be understood as optimal code length.

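The statement behind "entropy as optimal code length" is the classic source-coding bound (standard result, stated in generic notation): for an optimal uniquely decodable code over a $D$-ary alphabet, the expected codeword length $L^{*}$ satisfies

$$
H_{D}(X) \le L^{*} < H_{D}(X) + 1 ,
$$

so entropy is, up to less than one symbol, the best achievable average code length.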
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 13.05: Cross-Entropy, KL and Source Coding"
-weight: 13005
+title: "Chapter 13.06: Cross-Entropy, KL and Source Coding"
+weight: 13006
 ---
 We introduce cross-entropy as a further information-theoretic concept and discuss the connection between entropy, cross-entropy, and Kullback-Leibler divergence.

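The connection the blurb alludes to is the standard decomposition (notation assumed here):

$$
H(p, q) = -\sum_{x} p(x) \log q(x) = H(p) + D_{KL}(p \,\|\, q),
$$

so for fixed $p$, minimizing cross-entropy over $q$ is the same as minimizing the KL divergence from $p$ to $q$.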
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 13.06: Information Theory for Machine Learning"
-weight: 13006
+title: "Chapter 13.07: Information Theory for Machine Learning"
+weight: 13007
 ---
 In this section, we discuss how information-theoretic concepts are used in machine learning and demonstrate the equivalence of KL minimization and likelihood maximization, as well as how (cross-)entropy can be used as a loss function.

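A compact sketch of the equivalence mentioned in this blurb (generic notation; assuming i.i.d. data $x_1, \dots, x_n$ from $p$ and a model family $q_\theta$):

$$
\arg\min_{\theta} D_{KL}(p \,\|\, q_\theta)
= \arg\min_{\theta} \mathbb{E}_{x \sim p}\left[-\log q_\theta(x)\right]
\approx \arg\min_{\theta} \left( -\frac{1}{n} \sum_{i=1}^{n} \log q_\theta(x_i) \right),
$$

since $H(p)$ does not depend on $\theta$; the middle term is the cross-entropy of $q_\theta$ under $p$, and the right-hand side is the negative average log-likelihood, so minimizing it is exactly maximum likelihood estimation.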
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 13.07: Joint Entropy and Mutual Information"
-weight: 13007
+title: "Chapter 13.08: Joint Entropy and Mutual Information"
+weight: 13008
 ---
 Information theory also provides means of quantifying relations between two random variables that extend the concept of (linear) correlation. We discuss joint entropy, conditional entropy, and mutual information in this context.

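For orientation, the identities linking the three quantities in this blurb (standard definitions, notation assumed):

$$
H(X, Y) = H(X) + H(Y \mid X), \qquad
I(X; Y) = H(X) + H(Y) - H(X, Y) = H(X) - H(X \mid Y),
$$

so mutual information is the reduction in uncertainty about $X$ from observing $Y$; it is zero if and only if $X$ and $Y$ are independent, which is the sense in which it extends beyond linear correlation.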
