Hi, first of all thanks for the great tool. It seems to be designed to do exactly what I'm looking for!
However, I think there is an odd bug in the false positive detection. I've attached an example excerpt of some segmentations:

As you can see, the initial segmentation has some small spotty detections in the top right that aren't in the ground truth. This shows up in the segmentation metrics: n_true_labels is 32 and n_pred_labels is 40, yet n_false_positives is only 2. Running area_opening to remove these spots (as shown below) and then re-running umetrics correctly reduces the number of predicted labels, bringing it closer to the number of true labels, but n_false_positives doesn't change and neither does the Jaccard index. I am passing the ground truth to umetrics.calculate first and the prediction second, as expected. Interestingly, swapping them around seems to make things work, but the spurious detections are then reported as false negatives instead.
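For reference, this is roughly what I'm running. It's a minimal sketch: the dummy arrays and the area_opening threshold are placeholders standing in for my actual images and settings:

```python
import numpy as np
import umetrics
from skimage.morphology import area_opening

# Stand-ins for my actual data: 2D integer label images.
true_labels = np.zeros((64, 64), dtype=int)
true_labels[10:30, 10:30] = 1

pred_labels = true_labels.copy()
pred_labels[2:4, 60:62] = 2  # small spurious detection, like the ones in my top right

# Ground truth first, prediction second.
result = umetrics.calculate(true_labels, pred_labels)

# Remove the small spotty detections; the area threshold is just an example value.
cleaned = area_opening(pred_labels, area_threshold=50)

# Re-running on the cleaned prediction drops n_pred_labels as expected,
# but n_false_positives and the Jaccard index stay the same.
result_cleaned = umetrics.calculate(true_labels, cleaned)
```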
Any idea what's going on here?
Many thanks,
Craig