From e8c2501608f6db53ddc78ad48b4aa7874e446b53 Mon Sep 17 00:00:00 2001
From: Sam Duffield <s@mduffield.com>
Date: Tue, 5 Nov 2024 11:36:24 +0000
Subject: [PATCH] Add aux to Going Bayesian doc

---
 docs/log_posteriors.md | 19 ++++++++++++++++++-
 1 file changed, 18 insertions(+), 1 deletion(-)

diff --git a/docs/log_posteriors.md b/docs/log_posteriors.md
index 1d46ada1..df7d0bb2 100644
--- a/docs/log_posteriors.md
+++ b/docs/log_posteriors.md
@@ -98,9 +98,26 @@ either $N$ or $n$ increase.
         log_prior = diag_normal_log_prob(params, sd=1., normalize=False)
         mean_log_lik = Categorical(logits=logits).log_prob(batch['labels']).mean()
         mean_log_post = log_prior / num_data + mean_log_lik
-        return mean_log_post
+        return mean_log_post, torch.tensor([])
     ```
 
+    See [auxiliary information](#auxiliary-information) for why we return an
+    additional empty tensor.
+
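+    For example, rather than an empty tensor, the auxiliary slot can carry
+    something useful. As a minimal sketch (reusing the function above, with a
+    hypothetical `model` forward pass standing in for however `logits` is
+    computed), we might return the logits for later inspection:
+
+    ```python
+    def log_posterior(params, batch):
+        logits = model(batch['inputs'])  # hypothetical forward pass
+        log_prior = diag_normal_log_prob(params, sd=1., normalize=False)
+        mean_log_lik = Categorical(logits=logits).log_prob(batch['labels']).mean()
+        mean_log_post = log_prior / num_data + mean_log_lik
+        return mean_log_post, logits  # aux: any extra info, here the logits
+    ```
+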
 The issue with running Bayesian methods (such as VI or SGHMC) on this mean log posterior
 function is that naive application will result in approximating the tempered posterior