From 60a762a8e81b5325b716b70348a34a57c3071ae6 Mon Sep 17 00:00:00 2001
From: Santiago Castro
Date: Thu, 25 May 2023 13:58:37 -0400
Subject: [PATCH] Fix typo in an expression from the Soft-Nearest Neighbors
 Loss in the CL post
---
 posts/2021-05-31-contrastive/index.html | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/posts/2021-05-31-contrastive/index.html b/posts/2021-05-31-contrastive/index.html
index dd8a54a..0b50753 100644
--- a/posts/2021-05-31-contrastive/index.html
+++ b/posts/2021-05-31-contrastive/index.html
@@ -486,7 +486,7 @@

InfoNCE

Soft-Nearest Neighbors Loss (Salakhutdinov & Hinton 2007, Frosst et al. 2019) extends it to include multiple positive samples.

-Given a batch of samples, $\{\mathbf{x}_i, y_i)\}^B_{i=1}$ where $y_i$ is the class label of $\mathbf{x}_i$ and a function $f(.,.)$ for measuring similarity between two inputs, the soft nearest neighbor loss at temperature $\tau$ is defined as:

+Given a batch of samples, $\{(\mathbf{x}_i, y_i)\}^B_{i=1}$ where $y_i$ is the class label of $\mathbf{x}_i$ and a function $f(.,.)$ for measuring similarity between two inputs, the soft nearest neighbor loss at temperature $\tau$ is defined as:

$$ \mathcal{L}_\text{snn} = -\frac{1}{B}\sum_{i=1}^B \log \frac{\sum_{i\neq j, y_i = y_j, j=1,\dots,B} \exp(- f(\mathbf{x}_i, \mathbf{x}_j) / \tau)}{\sum_{i\neq k, k=1,\dots,B} \exp(- f(\mathbf{x}_i, \mathbf{x}_k) /\tau)} $$
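The loss above can be sketched numerically. Here is a minimal NumPy sketch (not code from the post); it assumes $f$ is squared Euclidean distance, which the post leaves generic, and that every class in the batch has at least two samples so the numerator is never empty:

```python
import numpy as np

def soft_nearest_neighbor_loss(x, y, tau=1.0):
    """Soft-nearest neighbor loss (Frosst et al. 2019) for a batch x of
    shape (B, D) with integer labels y of shape (B,).

    Assumes f(x_i, x_j) is squared Euclidean distance and that each
    class appears at least twice in the batch (otherwise log(0)).
    """
    # Pairwise f(x_i, x_j): squared Euclidean distances, shape (B, B).
    d = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    w = np.exp(-d / tau)
    np.fill_diagonal(w, 0.0)  # exclude the i == j (and i == k) terms

    same = (y[:, None] == y[None, :]).astype(float)
    num = (w * same).sum(axis=1)  # positives: same label, j != i
    den = w.sum(axis=1)           # all k != i
    return -np.mean(np.log(num / den))
```

With two well-separated clusters labeled consistently, the ratio inside the log is close to 1 for every sample and the loss is near zero; mislabeling the same points drives the numerator toward zero and the loss up, matching the intuition that the loss rewards samples whose nearest neighbors share their label.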