Use expm1 for log-normal transforms
Summary: Replace `exp(x) - 1` with the numerically more stable `expm1`, which avoids catastrophic cancellation for small arguments.

Differential Revision: D62789111
SebastianAment authored and facebook-github-bot committed Sep 17, 2024
1 parent e9ce11f commit 081b979
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions botorch/models/transforms/utils.py
@@ -60,7 +60,7 @@ def norm_to_lognorm(mu: Tensor, Cov: Tensor) -> tuple[Tensor, Tensor]:
     diag = torch.diagonal(Cov, dim1=-1, dim2=-2)
     b = mu + 0.5 * diag
     mu_ln = torch.exp(b)
-    Cov_ln = (torch.exp(Cov) - 1) * torch.exp(b.unsqueeze(-1) + b.unsqueeze(-2))
+    Cov_ln = torch.special.expm1(Cov) * torch.exp(b.unsqueeze(-1) + b.unsqueeze(-2))
     return mu_ln, Cov_ln
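The change matters because for small `x`, `exp(x)` rounds to a double very close to 1, and the subsequent subtraction of 1 cancels most of the significant digits; `expm1` evaluates `exp(x) - 1` directly without that intermediate rounding. A minimal stdlib sketch of the effect (using `math.expm1`, which behaves like `torch.special.expm1` on scalars):

```python
import math

x = 1e-12  # a tiny covariance entry, as might appear in Cov above

# Naive form: exp(x) rounds to a double near 1 + 1e-12, so subtracting 1
# cancels most of the significant digits of the result.
naive = math.exp(x) - 1

# expm1 computes exp(x) - 1 directly, avoiding the cancellation.
stable = math.expm1(x)

print(naive, stable)
```

For `x = 1e-12` the naive form retains only a few correct digits, while `expm1` is accurate to essentially full double precision.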


@@ -88,7 +88,7 @@ def norm_to_lognorm_variance(mu: Tensor, var: Tensor) -> Tensor:
         The `batch_shape x n` variance vector of the log-Normal distribution.
     """
     b = mu + 0.5 * var
-    return (torch.exp(var) - 1) * torch.exp(2 * b)
+    return torch.special.expm1(var) * torch.exp(2 * b)


def expand_and_copy_tensor(X: Tensor, batch_shape: torch.Size) -> Tensor:
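For the variance helper, the fix keeps results accurate as `var → 0`. A scalar sketch of the same computation using only the standard library (the real `norm_to_lognorm_variance` operates on Tensors; this hypothetical scalar version is for illustration):

```python
import math

def norm_to_lognorm_variance_scalar(mu: float, var: float) -> float:
    """Scalar sketch: if X ~ N(mu, var), then Var[exp(X)] = (exp(var) - 1) * exp(2*mu + var)."""
    b = mu + 0.5 * var
    # expm1 keeps the (exp(var) - 1) factor accurate when var is tiny.
    return math.expm1(var) * math.exp(2 * b)

mu, var = 0.0, 1e-14
naive = (math.exp(var) - 1) * math.exp(2 * mu + var)   # cancellation-prone
stable = norm_to_lognorm_variance_scalar(mu, var)      # close to var here
```

For `mu = 0` and tiny `var`, the true log-normal variance is approximately `var` itself; the `expm1` form tracks this limit, while the naive form's relative error grows roughly like machine epsilon divided by `var`.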
