Eccentricity computation differences with skimage #31

Open

joseantonio6894 opened this issue Mar 14, 2023 · 2 comments

@joseantonio6894 commented Mar 14, 2023

Hi!

I was inspecting the eccentricity property of connected regions and found some discrepancies in the results compared to scikit-image (I was, in fact, hoping to use this package to avoid calling into Python). For example, take this very simple region, which is a straight line, i.e. a fully degenerate ellipse:

a = [0 0 0 0 0; 0 0 1 0 0; 0 1 0 0 0; 0 0 0 0 0; 0 0 0 0 0]
5×5 Matrix{Int64}:
 0  0  0  0  0
 0  0  1  0  0
 0  1  0  0  0
 0  0  0  0  0
 0  0  0  0  0

a_np = pyjl(a).to_numpy()
Python ndarray:
array([[0, 0, 0, 0, 0],
       [0, 0, 1, 0, 0],
       [0, 1, 0, 0, 0],
       [0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0]])

region_props = skimage.measure.regionprops(a_np)
Python list: [<skimage.measure._regionprops.RegionProperties object at 0x7fdd988eb7d0>]

eccentricity_skimage = pyconvert(Float64, region_props[0].eccentricity)
1.0

eccentricity_jl = analyze_components(a, EllipseRegion()).eccentricity[1]
0.925820099772551

The results differ by almost 10%. Moreover, if we rotate this line by 45°, the skimage eccentricity stays at 1.0 (as expected, since rotation should not change eccentricity), but here the result diverges even further:

a = [0 0 0 0 0; 0 0 0 0 0; 0 1 1 0 0; 0 0 0 0 0; 0 0 0 0 0]
5×5 Matrix{Int64}:
 0  0  0  0  0
 0  0  0  0  0
 0  1  1  0  0
 0  0  0  0  0
 0  0  0  0  0

a_np = pyjl(a).to_numpy()
Python ndarray:
array([[0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0],
       [0, 1, 1, 0, 0],
       [0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0]])

region_props = skimage.measure.regionprops(a_np)
Python list: [<skimage.measure._regionprops.RegionProperties object at 0x7fdd988eb7d0>]

eccentricity_skimage = pyconvert(Float64, region_props[0].eccentricity)
1.0

eccentricity_jl = analyze_components(a, EllipseRegion()).eccentricity[1]
0.8660254037844379
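For what it's worth, scikit-image's result can be reproduced from first principles: its eccentricity comes from the eigenvalues of the normalized second central moments, treating each pixel as a point mass, so any set of collinear pixels yields a zero minor eigenvalue and an eccentricity of exactly 1. A minimal NumPy sketch of that computation (written from the formulas, not copied from skimage's source):

```python
import numpy as np

def eccentricity_from_moments(img):
    """Eccentricity from the eigenvalues of the normalized second
    central moments, with pixels treated as point masses (the
    point-mass model scikit-image appears to use)."""
    rows, cols = np.nonzero(img)
    r = rows - rows.mean()
    c = cols - cols.mean()
    # normalized second central moments (no pixel-extent correction)
    mu20, mu02, mu11 = (r**2).mean(), (c**2).mean(), (r * c).mean()
    # eigenvalues of the 2x2 moment matrix [[mu20, mu11], [mu11, mu02]]
    common = np.sqrt(((mu20 - mu02) / 2)**2 + mu11**2)
    l1 = (mu20 + mu02) / 2 + common  # major eigenvalue
    l2 = (mu20 + mu02) / 2 - common  # minor eigenvalue
    return np.sqrt(1 - l2 / l1)

a1 = np.zeros((5, 5), int); a1[1, 2] = a1[2, 1] = 1  # diagonal line
a2 = np.zeros((5, 5), int); a2[2, 1] = a2[2, 2] = 1  # horizontal line
print(eccentricity_from_moments(a1), eccentricity_from_moments(a2))  # 1.0 1.0
```

For both orientations the minor eigenvalue is exactly zero, which is why skimage reports 1.0 regardless of rotation.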

I've noticed there are some differences in how the eccentricity is calculated, but I would expect the results to be at least broadly similar?

Thanks!

@zygmuntszpak (Owner)

Hi,

Thank you for taking the time to raise the issue and provide feedback. To be honest, I'm actually surprised and intrigued by the scikit-image results. I would not necessarily have expected an eccentricity of 1.

Suppose one took a picture of a bright light source against a dark background (imagine an ellipse-shaped torch in a dark room). The way in which this ellipse appears in the image depends on the image formation process and involves the camera resolution, pixel size, quantization of the photons, etc.

If you were to model a pixel as a little rectangle, then the amount of light hitting a pixel would correspond to the area of intersection between the pixel and the ellipse region. If you kept everything fixed and just rotated your torch, your picture would change in some subtle ways (it would not just be a simple rotation of the previous picture), because the intersection and area of overlap between your little rectangles and the ellipse would change in subtle ways at the border of the ellipse.

The degree to which your image would change would also depend on how large your pixels were (i.e. your spatial resolution). You could imagine changing the aspect ratio of your ellipse somewhat and not actually observing any difference in the acquired image, given your quantization and pixel size. Conversely, you could imagine rotating your elliptic torch somewhat but actually observing quite a significant change in the generated image (if you have poor spatial resolution, for instance).

Normally, we don't think about such image formation issues. But they become salient whenever you try to infer something about a continuous entity (such as an ellipse region) from quantized discrete samples.

If you just observe a line segment in an image such as the example you gave, you don't actually know precisely what the eccentricity of the underlying ellipse is because depending on the assumptions you make about the image formation process, a whole bunch of very eccentric ellipses would give rise to exactly the same image.

I implemented the ellipse region estimation based on the modelling outlined in the following paper:

  1. M. R. Teague, “Image analysis via the general theory of moments,” Journal of the Optical Society of America, vol. 70, no. 8, p. 920, Aug. 1980.

I'm genuinely curious to understand how this differs from what scikit-image implemented. I had a quick glance over their code, and it also seems to use second-order moments. We'd have to dig in and compare the values they get for their moments with what we get here.

Perhaps you could also post the values they get for the semi-major and semi-minor axes? They must be getting different values than we do.

I believe Matlab implements the same algorithm as I have done here under the hood. I ran the example you gave through Matlab and get exactly the same values as what my code produces:

test_image_1 = [0 0 0 0 0;
                0 0 1 0 0;
                0 1 0 0 0;
                0 0 0 0 0;
                0 0 0 0 0];

test_image_2 = [0 0 0 0 0;
                0 0 0 0 0;
                0 1 1 0 0;
                0 0 0 0 0;
                0 0 0 0 0];

image_list = {test_image_1, test_image_2};

for i = 1:length(image_list)
    sprintf("Image %i \n", i)
    cc = bwconncomp(image_list{i});
    stats = regionprops(cc, 'MajorAxisLength', 'MinorAxisLength', 'Eccentricity', 'Orientation', 'Centroid');
    sprintf("Centroid %f \n", stats.Centroid);
    sprintf("MinorAxisLength %f \n", stats.MinorAxisLength)
    sprintf("MajorAxisLength %f \n", stats.MajorAxisLength)
    sprintf("Eccentricity %f \n", stats.Eccentricity)
    sprintf("Orientation %f \n", stats.Orientation);
end
Image 1
  MinorAxisLength 1.154701
  MajorAxisLength 3.055050
  Eccentricity    0.925820

Image 2
  MinorAxisLength 1.154701
  MajorAxisLength 2.309401
  Eccentricity    0.866025
I'll try to find out what Mathematica would produce for this particular example as well.

@joseantonio6894 (Author)

Hi,

Thanks for the detailed explanation; I found it quite instructive and I do see your point. However, for these examples of rotated straight lines I would still expect an eccentricity of 1, independently of the formation process. As you mention, many factors can affect image formation, but since we don't (and can't) know them, assuming the scene differs from what was captured can be misleading: it could differ in many ways, or it could be exactly as captured. Additionally, when working with synthetic data and generating a line, the result should be consistent, in my opinion, and a difference of almost 0.15 in the eccentricity of a theoretically straight line seems too large to me. That said, my view may be wrong given my limited knowledge of the topic, or perhaps both approaches are valid, in which case this is not a real issue.

Regarding the MinorAxisLength and MajorAxisLength values, they do in fact differ. For Image 1 the axes are [2.83, 0.0] and for Image 2 [2.0, 0.0], so scikit-image estimates the ellipse as a pure line. And indeed the moment matrices, which are the starting point for the rest of the computations, are slightly different, if I'm interpreting the results correctly (the implementations differ substantially, so I can't be sure they are equivalent in the end):

>>> test_image_1 = np.array([[0,0,0,0,0],[0,0,1,0,0],[0,1,0,0,0],[0,0,0,0,0],[0,0,0,0,0]])
>>> test_image_1
array([[0, 0, 0, 0, 0],
       [0, 0, 1, 0, 0],
       [0, 1, 0, 0, 0],
       [0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0]])
>>> skimage.measure.moments(test_image_1)
array([[ 2.,  3.,  5.,  9.],
       [ 3.,  4.,  6., 10.],
       [ 5.,  6.,  8., 12.],
       [ 9., 10., 12., 16.]])
julia> test_image_1 = [0 0 0 0 0; 0 0 1 0 0; 0 1 0 0 0; 0 0 0 0 0; 0 0 0 0 0]
5×5 Matrix{Int64}:
 0  0  0  0  0
 0  0  1  0  0
 0  1  0  0  0
 0  0  0  0  0
 0  0  0  0  0
julia> measurements_1 = analyze_components(test_image_1, EllipseRegion())
1×11 DataFrame
 Row │ l      M₀₀      M₁₀      M₀₁      M₁₁      M₂₀      M₀₂      centroid    semiaxes            orientation  eccentricity 
     │ Int64  Float64  Float64  Float64  Float64  Float64  Float64  SArray…     SArray…             Float64      Float64      
─────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
   1 │     1      2.0      5.0      5.0     12.0  13.1667  13.1667  [2.5, 2.5]  [0.57735, 1.52753]        -45.0       0.92582
>>> test_image_2 = np.array([[0,0,0,0,0],[0,0,0,0,0],[0,1,1,0,0],[0,0,0,0,0],[0,0,0,0,0]])
>>> test_image_2
array([[0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0],
       [0, 1, 1, 0, 0],
       [0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0]])
>>> skimage.measure.moments(test_image_2)
array([[ 2.,  3.,  5.,  9.],
       [ 4.,  6., 10., 18.],
       [ 8., 12., 20., 36.],
       [16., 24., 40., 72.]])
julia> test_image_2 = [0 0 0 0 0; 0 0 0 0 0; 0 1 1 0 0; 0 0 0 0 0; 0 0 0 0 0]
5×5 Matrix{Int64}:
 0  0  0  0  0
 0  0  0  0  0
 0  1  1  0  0
 0  0  0  0  0
 0  0  0  0  0
julia> measurements_2 = analyze_components(test_image_2, EllipseRegion())
1×11 DataFrame
 Row │ l      M₀₀      M₁₀      M₀₁      M₁₁      M₂₀      M₀₂      centroid    semiaxes           orientation  eccentricity 
     │ Int64  Float64  Float64  Float64  Float64  Float64  Float64  SArray…     SArray…            Float64      Float64      
─────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
   1 │     1      2.0      5.0      6.0     15.0  13.1667  18.1667  [3.0, 2.5]  [0.57735, 1.1547]          0.0      0.866025
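If I'm decoding the conventions correctly, the disagreement between the two moment tables comes from two sources: 1-based vs 0-based pixel coordinates, and the 1/12-per-pixel extent term. A quick sanity check of that hypothesis for the M₂₀ entry of the first image (variable names here are just for illustration):

```python
import numpy as np

img = np.array([[0, 0, 0, 0, 0],
                [0, 0, 1, 0, 0],
                [0, 1, 0, 0, 0],
                [0, 0, 0, 0, 0],
                [0, 0, 0, 0, 0]])
rows, cols = np.nonzero(img)
n = len(rows)

# raw second moment with 0-indexed, point-mass pixels
# (matches the 5 in the skimage moments table above)
m20_pointmass = (rows**2).sum()               # 1 + 4 = 5
# same moment with 1-indexed coordinates plus 1/12 per pixel for
# the extent of a unit square (matches the Julia M₂₀ column above)
m20_square = ((rows + 1)**2).sum() + n / 12   # 13 + 1/6 ≈ 13.1667
print(m20_pointmass, m20_square)
```

So the raw data agree; the packages diverge only in the underlying pixel model (point masses vs unit squares) and the indexing convention.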
