Hashsum of label models inconsistent between runs #156

Open
hlibbabii opened this issue May 27, 2021 · 0 comments
Labels
bug Something isn't working p1-important

Comments

@hlibbabii
Member

Commit example:
giganticode/bohr@3dfd353

Even if the dependencies of the label model training stage don't change, rerunning the training results in a label model with a different hash. The root cause is that when the model is serialized, some of its attributes (PyTorch tensors) are serialized into a different sequence of bytes on every run; the reason is still unclear. I asked about this here:
https://discuss.pytorch.org/t/tensor-pickling-inconsistent-between-runs/122533
https://stackoverflow.com/questions/67710297/pytorch-tensor-pickling-inconsistent-between-runs
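
A minimal sketch of the kind of check involved, assuming the hash is computed over the pickled bytes of the model's tensors (the names and hashing scheme here are illustrative, not the actual BOHR code):

```python
import hashlib
import pickle

import torch

# Illustrative attribute of a trained label model: a weight tensor.
weights = torch.zeros(3, 3)

# Hash the pickled bytes, as a content-addressed pipeline (e.g. DVC) would.
digest_1 = hashlib.sha256(pickle.dumps(weights)).hexdigest()
digest_2 = hashlib.sha256(pickle.dumps(weights)).hexdigest()

# Within a single process the two digests normally match; the problem
# reported here is that the pickled bytes (and hence the digest) differ
# between separate runs of the training stage, even with identical inputs.
print(digest_1)
print(digest_2)
```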

@hlibbabii hlibbabii added bug Something isn't working p1-important labels May 27, 2021