Commit

Merge pull request #316 from slds-lmu/swap-gp
remove chap 20 GP and re-add featsel as extra chapter
chriskolb authored Oct 18, 2024
2 parents 754ec79 + a35facc commit 6a78473
Showing 35 changed files with 116 additions and 117 deletions.
16 changes: 0 additions & 16 deletions content/chapters/20_gaussian_processes/20-01-bayes-lm.md

This file was deleted.

15 changes: 0 additions & 15 deletions content/chapters/20_gaussian_processes/20-02-basic.md

This file was deleted.

15 changes: 0 additions & 15 deletions content/chapters/20_gaussian_processes/20-03-covariance.md

This file was deleted.

15 changes: 0 additions & 15 deletions content/chapters/20_gaussian_processes/20-04-prediction.md

This file was deleted.

15 changes: 0 additions & 15 deletions content/chapters/20_gaussian_processes/20-05-training.md

This file was deleted.

4 changes: 0 additions & 4 deletions content/chapters/20_gaussian_processes/_index.md

This file was deleted.

@@ -1,6 +1,6 @@
 ---
-title: "Chapter 21.01: Introduction"
-weight: 21001
+title: "Chapter 20.01: Introduction"
+weight: 20001
 ---
 We define the phenomenon of imbalanced data sets and explain its consequences on accuracy. Furthermore, we introduce some techniques for handling imbalanced data sets.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 21.02: Performance Measures"
-weight: 21002
+title: "Chapter 20.02: Performance Measures"
+weight: 20002
 ---
 We introduce performance measures other than accuracy and explain their advantages over accuracy for imbalanced data. In addition, we introduce extensions of these measures for multiclass settings.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 21.05: Cost-Sensitive Learning 3"
-weight: 21005
+title: "Chapter 20.05: Cost-Sensitive Learning 3"
+weight: 20005
 ---
 We explain the concepts of instance specific costs and cost-sensitive OVO.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 21.06: Cost Curves 1"
-weight: 21006
+title: "Chapter 20.06: Cost Curves 1"
+weight: 20006
 ---
 We introduce cost curves for misclassification error and explain the duality between ROC points and cost lines.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 21.07: Cost Curves 2"
-weight: 21007
+title: "Chapter 20.07: Cost Curves 2"
+weight: 20007
 ---
 We explain cost curves with cost matrices and how to compare classifiers. In addition, we do a wrap-up comparison to ROC.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 21.08: Sampling Methods 1"
-weight: 21008
+title: "Chapter 20.08: Sampling Methods 1"
+weight: 20008
 ---
 We introduce the idea of sampling methods for dealing with imbalanced data. In addition, we explain certain undersampling techniques.
 <!--more-->
@@ -1,4 +1,4 @@
 ---
-title: "Chapter 21: Imbalanced Learning"
+title: "Chapter 20: Imbalanced Learning"
 ---
 This chapter introduces techniques for learning on imbalanced datasets.
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 22.01: Introduction"
-weight: 22001
+title: "Chapter 21.01: Introduction"
+weight: 21001
 ---
 In this chapter we emphasize the practical relevance of multi-target prediction problems. In addition, we name some special cases of multi-target prediction and establish the differences between transductive and inductive learning problems.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 22.02: Loss functions"
-weight: 22002
+title: "Chapter 21.02: Loss functions"
+weight: 21002
 ---
 In this chapter we introduce loss functions for multi-target prediction problems, explain the differences between instance-wise and decomposable losses and introduce the risk minimizer for both the Hamming loss and the subset 0/1 loss.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 22.03: Methods for Multi-target Prediction 1"
-weight: 22003
+title: "Chapter 21.03: Methods for Multi-target Prediction 1"
+weight: 21003
 ---
 In this chapter we introduce the concepts of independent models for targets, mean regularization, stacking and weight sharing in DL.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 22.04: Methods for Multi-target Prediction 2"
-weight: 22004
+title: "Chapter 21.04: Methods for Multi-target Prediction 2"
+weight: 21004
 ---
 In this chapter we introduce Kronecker kernel ridge regression, graph relations in targets, probabilistic classifier chains and low-rank approximations.
 <!--more-->
@@ -1,4 +1,4 @@
 ---
-title: "Chapter 22: Multitarget Learning"
+title: "Chapter 21: Multitarget Learning"
 ---
 This chapter introduces multitarget learning techniques.
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 23.01: Introduction"
-weight: 23001
+title: "Chapter 22.01: Introduction"
+weight: 22001
 ---
 In this chapter we explain the differences between online and batch learning, the extended learning protocol in online learning and the strategies to measure performance in online learning.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 23.02: Simple Online Learning Algorithm"
-weight: 23002
+title: "Chapter 22.02: Simple Online Learning Algorithm"
+weight: 22002
 ---
 In this chapter we introduce the formalization of online learning algorithms and the FTL algorithm.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 23.03: Follow the Leader on OLO problems"
-weight: 23003
+title: "Chapter 22.03: Follow the Leader on OLO problems"
+weight: 22003
 ---
 In this chapter we introduce OLO problems and explain why FTL might fail on these problems.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 23.04: Follow the regularized Leader"
-weight: 23004
+title: "Chapter 22.04: Follow the regularized Leader"
+weight: 22004
 ---
 In this chapter we introduce FTRL as a stable alternative to FTL.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 23.05: Follow the Leader on OQO problems"
-weight: 23005
+title: "Chapter 22.05: Follow the Leader on OQO problems"
+weight: 22005
 ---
 In this chapter we prove that FTL works for online quadratic optimization (OQO) problems.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 23.06: Online Convex optimization 1"
-weight: 23006
+title: "Chapter 22.06: Online Convex optimization 1"
+weight: 22006
 ---
 In this chapter we introduce the class of online convex optimization problems and derive the online gradient descent as a suitable learning algorithm for such cases.
 <!--more-->
@@ -1,6 +1,6 @@
 ---
-title: "Chapter 23.07: Online Convex optimization 2"
-weight: 23007
+title: "Chapter 22.07: Online Convex optimization 2"
+weight: 22007
 ---
 In this chapter we explain the connection between OGD and FTRL via linearization of convex functions and how this implies regret bounds for OGD.
 <!--more-->
@@ -1,4 +1,4 @@
 ---
-title: "Chapter 23: Online Learning"
+title: "Chapter 22: Online Learning"
 ---
 This chapter introduces online learning.
14 changes: 14 additions & 0 deletions content/chapters/30_feature_selection/30-01-introduction.md
@@ -0,0 +1,14 @@
+---
+title: "Chapter 30.01: Introduction"
+weight: 30001
+---
+We motivate feature selection and discuss how it differs from feature extraction.
+
+<!--more-->
+### Lecture video
+
+{{< video id="xiVB1EmlU9A" >}}
+
+### Lecture slides
+
+{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-fs-introduction.pdf" >}}
15 changes: 15 additions & 0 deletions content/chapters/30_feature_selection/30-02-motivating-examples.md
@@ -0,0 +1,15 @@
+---
+title: "Chapter 30.02: Motivating Examples"
+weight: 30002
+---
+In this section, we explain the practical importance of feature selection and show that models with
+integrated selection do not always work.
+
+<!--more-->
+### Lecture video
+
+{{< video id="1BwgTptjDs4" >}}
+
+### Lecture slides
+
+{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-fs-motivating-examples.pdf" >}}
15 changes: 15 additions & 0 deletions content/chapters/30_feature_selection/30-03-filters1.md
@@ -0,0 +1,15 @@
+---
+title: "Chapter 30.03: Filter Methods I"
+weight: 30003
+---
+We introduce how filter methods work and how they can be used for feature selection.
+
+<!--more-->
+
+### Lecture video
+
+{{< video id="RcDyvExpCSg" >}}
+
+### Lecture slides
+
+{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-fs-filters1.pdf" >}}
15 changes: 15 additions & 0 deletions content/chapters/30_feature_selection/30-04-filters2.md
@@ -0,0 +1,15 @@
+---
+title: "Chapter 30.04: Filter Methods II (Examples and Caveats)"
+weight: 30004
+---
+In this section, we discuss how filter methods can be misleading and show how they work in practical applications.
+
+<!--more-->
+
+### Lecture video
+
+{{< video id="X3FpzGnGA7o" >}}
+
+### Lecture slides
+
+{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-fs-filters2.pdf" >}}
15 changes: 15 additions & 0 deletions content/chapters/30_feature_selection/30-05-wrapper.md
@@ -0,0 +1,15 @@
+---
+title: "Chapter 30.05: Wrapper Methods"
+weight: 30005
+---
+This section explains wrapper methods and how they can aid feature selection.
+
+<!--more-->
+
+### Lecture video
+
+{{< video id="XmvlHUCGNbc" >}}
+
+### Lecture slides
+
+{{< pdfjs file="https://github.com/slds-lmu/lecture_sl/raw/main/slides-pdf/slides-fs-wrapper.pdf" >}}
5 changes: 5 additions & 0 deletions content/chapters/30_feature_selection/_index.md
@@ -0,0 +1,5 @@
+---
+title: "Extra Chapter: Feature Selection"
+---
+This chapter introduces feature selection, i.e., finding a well-performing, hopefully small set of
+features for a task.
