Commit

fix ref name
jmeziere committed Aug 24, 2024
1 parent 9e9c7bf commit 95b8d19
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions src/Operators.jl
@@ -31,7 +31,7 @@ support constraint to the object, then returns.
Gradient descent is an alternative way to view the ER algorithm because
ER is equivalent to gradient descent with a step size of 0.5.
- More information about the ER algorithm can be found in [Fienup1978](@cite) and [Marchesini2007](@cite).
+ More information about the ER algorithm can be found in [Fienup1978,Marchesini2007](@cite).
"""
struct ER <: Operator
end
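Since ER is just alternating projection onto the Fourier-modulus and support constraints, one iteration can be sketched in a few lines. The sketch below is an illustrative numpy version, not this package's API; the helper name `er_update` and its arguments are hypothetical.

```python
import numpy as np

def er_update(rho, measured_amplitude, support):
    """One Error Reduction (ER) iteration (hypothetical helper, not the
    package's API): enforce the measured Fourier amplitude, then zero
    everything outside the support."""
    F = np.fft.fftn(rho)
    # Modulus constraint: replace the amplitude with the measurement,
    # keeping the current phase estimate.
    F = measured_amplitude * np.exp(1j * np.angle(F))
    rho_new = np.fft.ifftn(F)  # note: result is complex
    # Support constraint: keep values only inside the support.
    return np.where(support, rho_new, 0.0)
```

Viewed as gradient descent on the modulus error with step size 0.5, this update recovers exactly the projection step above.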
@@ -61,7 +61,7 @@ ER(\\rho_i) & \\rho \\in support \\\\
Marchesini [Marchesini2007](@cite) has shown that the HIO algorithm is
equivalent to a mini-max problem.
- More information about the HIO algorithm can be found in [Fienup1978](@cite) and [Marchesini2007](@cite).
+ More information about the HIO algorithm can be found in [Fienup1978,Marchesini2007](@cite).
```
"""
struct HIO <: Operator
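The piecewise HIO update quoted above (ER inside the support, a feedback step outside it) can be sketched as follows. This is an illustrative numpy version under the standard Fienup formulation, not this package's implementation; `hio_update` and `beta` are assumed names.

```python
import numpy as np

def hio_update(rho, measured_amplitude, support, beta=0.9):
    """One HIO iteration (hypothetical sketch, not the package API):
    inside the support, take the modulus-projected estimate (the ER step);
    outside it, take the feedback step rho - beta * projected."""
    F = np.fft.fftn(rho)
    projected = np.fft.ifftn(measured_amplitude * np.exp(1j * np.angle(F)))
    return np.where(support, projected, rho - beta * projected)
```

The feedback term outside the support is what distinguishes HIO from ER and underlies its interpretation as a mini-max (saddle-point) iteration.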
@@ -84,7 +84,7 @@ using `sigma` as the width of the Gaussian. The support is then created
from everything above the `threshold` times maximum value of the blurred
object.
- Further information about the shrinkwrap algorithm can be found in [Marchesini2003aa](@cite)
+ Further information about the shrinkwrap algorithm can be found in [Marchesini2003a](@cite)
"""
struct Shrink{T} <: Operator
threshold::Float64
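The shrinkwrap step described in the docstring (blur the object with a Gaussian of width `sigma`, then threshold at `threshold` times the blurred maximum) can be sketched with numpy alone. Both helper names below are hypothetical, and the FFT-based blur stands in for whatever Gaussian filter the package actually uses.

```python
import numpy as np

def gaussian_blur_fft(x, sigma):
    """Illustrative Gaussian blur via the Fourier transfer function
    exp(-2 * pi^2 * sigma^2 * k^2); sigma is in samples."""
    freqs = [np.fft.fftfreq(n) for n in x.shape]
    grids = np.meshgrid(*freqs, indexing="ij")
    k2 = sum(g ** 2 for g in grids)
    kernel = np.exp(-2.0 * (np.pi * sigma) ** 2 * k2)
    return np.fft.ifftn(np.fft.fftn(x) * kernel).real

def shrinkwrap_support(obj, sigma, threshold):
    """Hypothetical sketch of a shrinkwrap support update: blur |obj|,
    then keep everything above threshold * max of the blurred object."""
    blurred = gaussian_blur_fft(np.abs(obj), sigma)
    return blurred > threshold * blurred.max()
```

Because the threshold is relative to the blurred maximum, the support tightens around the brightest region of the current estimate as the reconstruction progresses.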
