From 1eff9a39555a4fa8875c1b18d801e0ccfd7b6018 Mon Sep 17 00:00:00 2001
From: chriskolb <39267431+chriskolb@users.noreply.github.com>
Date: Mon, 15 Jan 2024 17:19:20 +0100
Subject: [PATCH] update video links for information theory

---
 content/chapters/13_information_theory/13-01-entropy.md       | 2 +-
 content/chapters/13_information_theory/13-02-entropy2.md      | 2 +-
 content/chapters/13_information_theory/13-03-diffent.md       | 2 +-
 content/chapters/13_information_theory/13-04-kl.md            | 2 +-
 .../chapters/13_information_theory/13-05-cross-entropy-kld.md | 2 +-
 5 files changed, 5 insertions(+), 5 deletions(-)

diff --git a/content/chapters/13_information_theory/13-01-entropy.md b/content/chapters/13_information_theory/13-01-entropy.md
index ac82b04..c8c4276 100644
--- a/content/chapters/13_information_theory/13-01-entropy.md
+++ b/content/chapters/13_information_theory/13-01-entropy.md
@@ -8,7 +8,7 @@ We introduce entropy, which expresses the expected information for discrete rand
 
 ### Lecture video
 
-{{< video id="9H-DkQN0nxM" >}}
+{{< video id="UWv2ZPnifvw" >}}
 
 ### Lecture slides
 
diff --git a/content/chapters/13_information_theory/13-02-entropy2.md b/content/chapters/13_information_theory/13-02-entropy2.md
index e985c53..ddfbe09 100644
--- a/content/chapters/13_information_theory/13-02-entropy2.md
+++ b/content/chapters/13_information_theory/13-02-entropy2.md
@@ -8,7 +8,7 @@ We continue our discussion about entropy and introduce joint entropy, the unique
 
 ### Lecture video
 
-{{< video id="" >}}
+{{< video id="aFYF459PE-w" >}}
 
 ### Lecture slides
 
diff --git a/content/chapters/13_information_theory/13-03-diffent.md b/content/chapters/13_information_theory/13-03-diffent.md
index 333806d..8ee20b3 100644
--- a/content/chapters/13_information_theory/13-03-diffent.md
+++ b/content/chapters/13_information_theory/13-03-diffent.md
@@ -8,7 +8,7 @@ In this section, we extend the definition of entropy to the continuous case.
 
 ### Lecture video
 
-{{< video id="nTCwnRPLIsU" >}}
+{{< video id="aeJzIzKNLWI" >}}
 
 ### Lecture slides
 
diff --git a/content/chapters/13_information_theory/13-04-kl.md b/content/chapters/13_information_theory/13-04-kl.md
index d894f02..3618d2d 100644
--- a/content/chapters/13_information_theory/13-04-kl.md
+++ b/content/chapters/13_information_theory/13-04-kl.md
@@ -8,7 +8,7 @@ The Kullback-Leibler divergence (KL) is an important quantity for measuring the
 
 ### Lecture video
 
-{{< video id="kC0XXQgC4_k" >}}
+{{< video id="7ZaY4fvuFg0" >}}
 
 ### Lecture slides
 
diff --git a/content/chapters/13_information_theory/13-05-cross-entropy-kld.md b/content/chapters/13_information_theory/13-05-cross-entropy-kld.md
index c35b872..e67a745 100644
--- a/content/chapters/13_information_theory/13-05-cross-entropy-kld.md
+++ b/content/chapters/13_information_theory/13-05-cross-entropy-kld.md
@@ -8,7 +8,7 @@ We introduce cross-entropy as a further information-theoretic concept and discus
 
 ### Lecture video
 
-{{< video id="V5nYGjhRfY0" >}}
+{{< video id="vtS6h0UYs4E" >}}
 
 ### Lecture slides