Update collections/_posts/2025-01-18-GradientStepDenoiser.md
Co-authored-by: Nathan Painchaud <[email protected]>
Tmodrzyk and nathanpainchaud authored Jan 23, 2025
1 parent 330b293 commit 9de0d5f
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion collections/_posts/2025-01-18-GradientStepDenoiser.md
@@ -117,7 +117,7 @@ Let us consider an ill-posed linear inverse problem, for instance super-resoluti

 However reconstruction methods that use **only** neural networks also have a lot of drawbacks:
 - no data-fidelity constraints, meaning that we have no theoretical guarantees on their performances
-- very sensitive to domain-shift, so applying them to datas way out of their training distribution is risky
+- very sensitive to domain-shift, so applying them to data way out of their training distribution is risky
 - typically require to be re-trained as soon as the degradation operator $$A$$ changes
 
 What we really want is the best of both worlds:
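The changed passage refers to the degradation operator $$A$$ of an ill-posed linear inverse problem such as super-resolution. As a minimal sketch (hypothetical code, not from the post or this commit), here is why such a linear operator admits no unique inverse: distinct high-resolution signals can map to the same low-resolution observation.

```python
import numpy as np

def A(x):
    """Toy 2x-downsampling degradation operator (hypothetical example):
    averages non-overlapping pairs of samples. This is a linear map."""
    return x.reshape(-1, 2).mean(axis=1)

x1 = np.array([1.0, 3.0, 5.0, 7.0])
x2 = np.array([2.0, 2.0, 6.0, 6.0])  # a different high-res signal

# Both signals produce the same observation, so A is not injective
# and inverting y = A(x) has no unique solution without a prior.
print(np.allclose(A(x1), A(x2)))  # → True
```

This non-injectivity is exactly why pure data-fidelity methods underdetermine the solution and why, conversely, retraining is needed whenever $$A$$ changes in network-only approaches.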
