diff --git a/.DS_Store b/.DS_Store
index 2fb1c616..549743d3 100644
Binary files a/.DS_Store and b/.DS_Store differ
diff --git a/content/.DS_Store b/content/.DS_Store
index 03ddea92..5806f5dc 100644
Binary files a/content/.DS_Store and b/content/.DS_Store differ
diff --git a/content/chapters/.DS_Store b/content/chapters/.DS_Store
index 62505c6c..6cb5f2a1 100644
Binary files a/content/chapters/.DS_Store and b/content/chapters/.DS_Store differ
diff --git a/content/chapters/13_information_theory/13-02-entropy.md b/content/chapters/13_information_theory/13-02-entropy.md
new file mode 100644
index 00000000..942c4fd4
--- /dev/null
+++ b/content/chapters/13_information_theory/13-02-entropy.md
@@ -0,0 +1,15 @@
+---
+title: "Chapter 13.02: Entropy"
+weight: 13002
+---
+We continue our discussion of entropy and introduce joint entropy, the uniqueness theorem, and the maximum entropy principle.
+
+
+
+### Lecture video
+
+{{< video id="" >}}
+
+### Lecture slides
+
+{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-info-entropy2.pdf" >}}
\ No newline at end of file
diff --git a/content/chapters/13_information_theory/13-02-diffent.md b/content/chapters/13_information_theory/13-03-diffent.md
similarity index 82%
rename from content/chapters/13_information_theory/13-02-diffent.md
rename to content/chapters/13_information_theory/13-03-diffent.md
index 147d45de..333806df 100644
--- a/content/chapters/13_information_theory/13-02-diffent.md
+++ b/content/chapters/13_information_theory/13-03-diffent.md
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 13.02: Differential Entropy"
-weight: 13002
+title: "Chapter 13.03: Differential Entropy"
+weight: 13003
 ---
 In this section, we extend the definition of entropy to the continuous case.
 
diff --git a/content/chapters/13_information_theory/13-03-kl.md b/content/chapters/13_information_theory/13-04-kl.md
similarity index 86%
rename from content/chapters/13_information_theory/13-03-kl.md
rename to content/chapters/13_information_theory/13-04-kl.md
index 9e772be8..d894f021 100644
--- a/content/chapters/13_information_theory/13-03-kl.md
+++ b/content/chapters/13_information_theory/13-04-kl.md
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 13.03: Kullback-Leibler Divergence"
-weight: 13003
+title: "Chapter 13.04: Kullback-Leibler Divergence"
+weight: 13004
 ---
 The Kullback-Leibler divergence (KL) is an important quantity for measuring the difference between two probability distributions. We discuss different intuitions for KL and relate it to risk minimization and likelihood ratios.
 
diff --git a/content/chapters/13_information_theory/13-04-sourcecoding.md b/content/chapters/13_information_theory/13-05-sourcecoding.md
similarity index 81%
rename from content/chapters/13_information_theory/13-04-sourcecoding.md
rename to content/chapters/13_information_theory/13-05-sourcecoding.md
index d977103b..17b1ffb1 100644
--- a/content/chapters/13_information_theory/13-04-sourcecoding.md
+++ b/content/chapters/13_information_theory/13-05-sourcecoding.md
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 13.04: Entropy and Optimal Code Length"
-weight: 13004
+title: "Chapter 13.05: Entropy and Optimal Code Length"
+weight: 13005
 ---
 In this section, we introduce source coding and discuss how entropy can be understood as optimal code length.
 
diff --git a/content/chapters/13_information_theory/13-05-cross-entropy-kld.md b/content/chapters/13_information_theory/13-06-cross-entropy-kld.md
similarity index 83%
rename from content/chapters/13_information_theory/13-05-cross-entropy-kld.md
rename to content/chapters/13_information_theory/13-06-cross-entropy-kld.md
index 1a740699..0d5daed6 100644
--- a/content/chapters/13_information_theory/13-05-cross-entropy-kld.md
+++ b/content/chapters/13_information_theory/13-06-cross-entropy-kld.md
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 13.05: Cross-Entropy, KL and Source Coding"
-weight: 13005
+title: "Chapter 13.06: Cross-Entropy, KL and Source Coding"
+weight: 13006
 ---
 We introduce cross-entropy as a further information-theoretic concept and discuss the connection between entropy, cross-entropy, and Kullback-Leibler divergence.
 
diff --git a/content/chapters/13_information_theory/13-06-ml.md b/content/chapters/13_information_theory/13-07-ml.md
similarity index 84%
rename from content/chapters/13_information_theory/13-06-ml.md
rename to content/chapters/13_information_theory/13-07-ml.md
index 511e4b2c..6feb14ea 100644
--- a/content/chapters/13_information_theory/13-06-ml.md
+++ b/content/chapters/13_information_theory/13-07-ml.md
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 13.06: Information Theory for Machine Learning"
-weight: 13006
+title: "Chapter 13.07: Information Theory for Machine Learning"
+weight: 13007
 ---
 In this section, we discuss how information-theoretic concepts are used in machine learning and demonstrate the equivalence of KL minimization and maximum likelihood estimation, as well as how (cross-)entropy can be used as a loss function.
 
diff --git a/content/chapters/13_information_theory/13-07-mutual-info.md b/content/chapters/13_information_theory/13-08-mutual-info.md
similarity index 85%
rename from content/chapters/13_information_theory/13-07-mutual-info.md
rename to content/chapters/13_information_theory/13-08-mutual-info.md
index e563fa1d..2d5d6ea2 100644
--- a/content/chapters/13_information_theory/13-07-mutual-info.md
+++ b/content/chapters/13_information_theory/13-08-mutual-info.md
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 13.07: Joint Entropy and Mutual Information"
-weight: 13007
+title: "Chapter 13.08: Joint Entropy and Mutual Information"
+weight: 13008
 ---
 Information theory also provides means of quantifying relations between two random variables that extend the concept of (linear) correlation. We discuss joint entropy, conditional entropy, and mutual information in this context.
 