# Torsion angle loss #81
base: main
#### `__init__.py` (new file, +5)

```python
from ._torsion_angle_loss import torsion_angle_loss

__all__ = [
    "torsion_angle_loss",
]
```
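With this re-export in place, callers can reach the function as `beignet.nn.functional.torsion_angle_loss`, which is how the test at the end of this diff imports it.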
#### `_torsion_angle_loss.py` (new file, +36)

```python
from typing import Tuple

import torch
from torch import Tensor


def torsion_angle_loss(input, target: Tuple[Tensor, Tensor]) -> Tensor:
```
**Review thread on the function signature:**

> Can we call this something like …

> Of course. I will add a docstring! As far as renaming, are there multiple torsion angle losses?

> I think you can imagine doing some different things. If you predict an angle instead of a 2-D vector, then the anglenorm part of this doesn't make sense and you would just take the cosine of the difference. You could also do something like the NLL of a wrapped normal.
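For illustration, the first alternative mentioned above (predicting an angle directly and penalizing the cosine of the difference) might look like the sketch below. This is not part of the PR; the function name and the assumption that angles are given in radians are hypothetical.

```python
import torch
from torch import Tensor


def cosine_angle_loss(input: Tensor, target: Tensor) -> Tensor:
    # Hypothetical sketch: `input` and `target` are angles in radians.
    # 1 - cos(delta) is 0 when the angles agree and 2 when they differ
    # by pi, and it handles wrap-around automatically. Since no 2-D
    # vector is predicted, there is no anglenorm term to add.
    return torch.mean(1.0 - torch.cos(input - target))
```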
The function body in the diff continues:

```python
    """

    Parameters
    ----------
    input
    target

    Returns
    -------

    """
    # Normalize each predicted 2-D vector to unit length.
    a = input / torch.norm(input, dim=-1, keepdim=True)
```
**Review comment on the normalization line:**

> Should this be clamped? 🤔
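If the norm were clamped as the comment suggests, the line might instead read as follows. This is a hypothetical variant, and the epsilon value is an assumption, not something specified in the PR:

```python
# Hypothetical clamped variant: keeping the norm away from zero avoids
# division by zero (and NaN gradients) if a predicted vector collapses
# to the origin. The epsilon here is an assumed value.
a = input / torch.clamp(torch.norm(input, dim=-1, keepdim=True), min=1e-12)
```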
The diff then continues:

```python
    b, c = target

    # Squared distance from the normalized prediction to the nearer of
    # the two ground-truth alternatives, averaged over the last two
    # remaining dimensions.
    x = torch.mean(
        torch.minimum(
            torch.square(torch.norm(a - b, dim=-1)),
            torch.square(torch.norm(a - c, dim=-1)),
        ),
        dim=[-1, -2],
    )
```

**Review comment on `x = torch.mean(...)`:**

> Can we call …
```python
    # Penalty encouraging the unnormalized predictions to have norm 1
    # (the anglenorm term mentioned in the review above), weighted by 0.02.
    y = torch.mean(
        torch.abs(torch.norm(input, dim=-1) - 1),
        dim=[-1, -2],
    )

    return x + 0.02 * y
```
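For context, here is a hypothetical call sketching the shapes involved. The interpretation (a trailing dimension of 2 holding an unnormalized sin/cos-style vector per torsion angle, with two ground-truth alternatives) is an assumption based on the code and the test below, not documented by the PR:

```python
import torch

from beignet.nn.functional import torsion_angle_loss

# Assumed shapes: batch of 4, 10 residues, 7 torsion angles each,
# represented as unnormalized 2-D vectors.
input = torch.randn(4, 10, 7, 2)

# Two ground-truth alternatives; random here purely for illustration.
target = torch.randn(4, 10, 7, 2), torch.randn(4, 10, 7, 2)

loss = torsion_angle_loss(input, target)
print(loss.shape)  # torch.Size([4]); the last two dims are averaged out
```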
#### Test file (new file, +17)

```python
import beignet.nn.functional
import torch


def test_torsion_angle_loss():
    input = torch.ones([1, 1, 7, 2])

    target = torch.zeros([1, 1, 7, 2]), torch.zeros([1, 1, 7, 2])

    output = beignet.nn.functional.torsion_angle_loss(input, target)

    torch.testing.assert_close(
        output,
        torch.tensor([1.0]),
        rtol=0.01,
        atol=0.01,
    )
```

**Author comment on the test:**

> @kleinhenz Just a smoke test. I will revisit with improved unit tests after I implement the scaffolding for all the losses.
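As a sanity check on the expected value (plain arithmetic, not part of the PR): normalizing `input = torch.ones(...)` gives unit vectors a = (1/√2, 1/√2) for each torsion angle, and with both targets zero each squared distance ‖a − 0‖² is exactly 1, so the first term is 1. The norm penalty is |√2 − 1| ≈ 0.414, contributing 0.02 × 0.414 ≈ 0.008, for a total of ≈ 1.008, which passes against `torch.tensor([1.0])` at `rtol=0.01, atol=0.01`.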
**Review thread on the `Tuple[Tensor, Tensor]` annotation:**

> We could use PEP 585 instead, so the hint would be `tuple[Tensor, Tensor]`. I find it slightly cleaner, but not a big deal either way.

> Looking into this more, it looks like those `typing` hints are actually being deprecated and will be removed once Python 3.9 is EOL.
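For reference, the PEP 585 spelling being discussed would change only the annotation; a sketch of the signature, not a change made in this PR:

```python
from torch import Tensor


# PEP 585: the built-in `tuple` can be parameterized directly on
# Python 3.9+, so `typing.Tuple` is no longer needed.
def torsion_angle_loss(input: Tensor, target: tuple[Tensor, Tensor]) -> Tensor:
    ...
```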