Add aux to Going Bayesian doc
SamDuffield committed Nov 5, 2024
1 parent 44b96d6 commit e8c2501
Showing 1 changed file with 4 additions and 1 deletion.
docs/log_posteriors.md: 5 changes (4 additions & 1 deletion)
@@ -98,9 +98,12 @@ either $N$ or $n$ increase.
log_prior = diag_normal_log_prob(params, sd=1., normalize=False)
mean_log_lik = Categorical(logits=logits).log_prob(batch['labels']).mean()
mean_log_post = log_prior / num_data + mean_log_lik
- return mean_log_post
+ return mean_log_post, torch.tensor([])
```

+ See [auxiliary information](#auxiliary-information) for why we return an
+ additional empty tensor.

The issue with running Bayesian methods (such as VI or SGHMC) on this mean log posterior
function is that naive application will result in approximating the tempered posterior

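As a brief sketch of why (assuming i.i.d. data, a full dataset of size $N$ and minibatches sampled uniformly): in expectation over batches, the mean log posterior above equals

$$
\frac{1}{N}\log p(\theta) + \frac{1}{N}\sum_{i=1}^{N}\log p(y_i \mid \theta) = \frac{1}{N}\log p(\theta, y_{1:N}),
$$

so exponentiating it gives a density proportional to $p(\theta \mid y_{1:N})^{1/N}$, the posterior tempered by the dataset size rather than the posterior itself.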
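For context, a minimal sketch of what the full mean log posterior function touched by this commit might look like. The function name `mean_log_posterior`, the `model` and `num_data` definitions, the `batch['inputs']` key, the functional-call forward pass and the `posteriors` import path are assumed stand-ins; the remaining lines follow the diff above.

```python
import torch
import torch.nn as nn
from torch.distributions import Categorical
from posteriors import diag_normal_log_prob  # assumed import path for the utility used above

model = nn.Linear(10, 3)  # hypothetical classifier
num_data = 1000           # hypothetical full dataset size N

def mean_log_posterior(params, batch):
    # Forward pass with the candidate parameters (assumed functional-call style)
    logits = torch.func.functional_call(model, params, batch['inputs'])
    # Lines below follow the diff: Gaussian prior scaled by dataset size plus mean log likelihood
    log_prior = diag_normal_log_prob(params, sd=1., normalize=False)
    mean_log_lik = Categorical(logits=logits).log_prob(batch['labels']).mean()
    mean_log_post = log_prior / num_data + mean_log_lik
    # Empty tensor as the auxiliary output mentioned in the added note
    return mean_log_post, torch.tensor([])
```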
