Update collections/_posts/2025-01-18-GradientStepDenoiser.md
Co-authored-by: Pierre Rougé <[email protected]>
Tmodrzyk and PierreRouge authored Jan 23, 2025
1 parent 4004ecd commit 0552f47
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion collections/_posts/2025-01-18-GradientStepDenoiser.md
@@ -124,7 +124,7 @@ What we really want is the best of both worlds:
- good reconstructions using neural networks
- some constraints with respect to the observations

Plug-and-Play methods are exactly this compromise. They use the traditional variational formulation of inverse problems and replace the hand-crafted regularization $g$ with a Gaussian denoiser $$D_\sigma$$.
Plug-and-Play methods are exactly this compromise. They use the traditional variational formulation of inverse problems and replace the hand-crafted regularization $$g$$ with a Gaussian denoiser $$D_\sigma$$.
Let's take forward-backward splitting: its Plug-and-Play version now simply becomes:

$$x^{n+1} = D_\sigma \circ \left( \text{Id} - \tau \nabla f \right) (x^{n})$$
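For concreteness, the iteration above can be sketched in a few lines of Python. The box-filter "denoiser" and the toy denoising setup below are illustrative assumptions, not the trained Gaussian denoiser $$D_\sigma$$ from the post; `A`/`At` stand for the forward operator and its adjoint.

```python
import numpy as np

def pnp_forward_backward(y, A, At, denoiser, tau=0.5, n_iters=20):
    """Plug-and-Play forward-backward splitting.

    Each iteration takes a gradient step on the data-fidelity term
    f(x) = 0.5 * ||A x - y||^2, then applies a denoiser in place of
    the proximal operator of a hand-crafted regularizer g:
        x^{n+1} = D_sigma((Id - tau * grad f)(x^n))
    """
    x = At(y)  # initialize with the back-projection of the data
    for _ in range(n_iters):
        grad_f = At(A(x) - y)           # grad f(x) = A^T (A x - y)
        x = denoiser(x - tau * grad_f)  # denoising step replaces prox of g
    return x

def box_denoiser(x, k=3):
    """Hypothetical stand-in for D_sigma: a k x k box filter."""
    pad = k // 2
    xp = np.pad(x, pad, mode="edge")
    out = np.zeros_like(x)
    for i in range(k):
        for j in range(k):
            out += xp[i:i + x.shape[0], j:j + x.shape[1]]
    return out / (k * k)

# Toy denoising problem: the forward operator A is the identity.
rng = np.random.default_rng(0)
clean = np.zeros((32, 32))
clean[8:24, 8:24] = 1.0
y = clean + 0.2 * rng.standard_normal(clean.shape)
x_hat = pnp_forward_backward(y, A=lambda v: v, At=lambda v: v,
                             denoiser=box_denoiser)
```

Swapping `box_denoiser` for a pretrained network gives the usual PnP scheme; the step size `tau` must still respect the Lipschitz constant of $$\nabla f$$ for the gradient step to be stable.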