Precision result different from that obtained with sklearn #1041
Unanswered
camilomarino asked this question in Classification
Replies: 3 comments
- Did you rectify the problem?
- Did you find out why? I am facing the same issue with regression metrics.
- With version 1.4 it seems to be correct:

import torch
from torchmetrics.functional import precision as tm_precision
from sklearn.metrics import precision_score as sklearn_precision
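
# First case: integer preds and integer target, i.e. hard class labels.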
preds = torch.tensor([0, 0, 1, 0, 0], dtype=torch.int)
target = torch.tensor([0, 0, 1, 1, 1], dtype=torch.int)
print("TM: ", tm_precision(preds=preds, target=target, task="binary"))
print("SK: ", sklearn_precision(y_true=target, y_pred=preds))
preds = torch.tensor([0, 0, 1, 0, 0], dtype=torch.float)
target = torch.tensor([0, 0, 1, 1, 1], dtype=torch.int)
print("TM: ", tm_precision(preds=preds, target=target, task="binary"))
print("SK: ", sklearn_precision(y_true=target, y_pred=preds)) |
- I wanted to know why, when calculating "precision" with "preds" of type int, I get different results than when using sklearn or doing the math on paper. Below is a code snippet that shows more clearly what my query is:

Result:
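
The snippet and result referred to above are not reproduced in the thread; the following is only a minimal sketch of the kind of comparison being asked about, using illustrative tensor values and the same tm_precision / sklearn_precision aliases as in the reply above.

```python
# Illustrative sketch only -- not the original poster's snippet.
import torch
from torchmetrics.functional import precision as tm_precision
from sklearn.metrics import precision_score as sklearn_precision

# Hard integer predictions and targets (illustrative values).
preds = torch.tensor([1, 0, 1, 1, 0], dtype=torch.int)
target = torch.tensor([1, 1, 1, 0, 0], dtype=torch.int)

# Precision by hand: TP / (TP + FP).
tp = int(((preds == 1) & (target == 1)).sum())
fp = int(((preds == 1) & (target == 0)).sum())
print("by hand:     ", tp / (tp + fp))
print("sklearn:     ", sklearn_precision(y_true=target, y_pred=preds))
print("torchmetrics:", tm_precision(preds=preds, target=target, task="binary"))
```

With these values all three should report 2/3 on a recent torchmetrics release; the question concerns a setup where the torchmetrics value came out different.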