This repository has been archived by the owner on Aug 1, 2024. It is now read-only.
**Question:**

I note that, on loading the msa-transformer model/weights, the layers and weights for both
`contact_head.regression.weight`
and `contact_head.regression.bias`
are present if I list the layers. Do I need to remove these layers from the model to use it without the contact head, or is it sufficient to just set `return_contacts=False` when running a forward pass of an instance of `MSATransformer`?

Replies: 2 comments

- These weights, and the computation, are super minimal, so they shouldn't be in the way from a GPU memory perspective. Just …

- Great, thanks
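The mechanism behind the reply can be sketched in plain PyTorch: a head whose parameters live in the module (and so appear in `state_dict()` / layer listings) only enters the computation when the corresponding flag is set. The `TinyMSATransformer` below is a hypothetical stand-in, not the real `MSATransformer` from the esm library; it just illustrates why the contact-head weights can be left in place when `return_contacts=False`.

```python
import torch
import torch.nn as nn


class TinyMSATransformer(nn.Module):
    """Hypothetical stand-in: a trunk plus a contact head that is
    only invoked when return_contacts=True."""

    def __init__(self, dim: int = 8):
        super().__init__()
        self.trunk = nn.Linear(dim, dim)       # stands in for the transformer layers
        self.contact_head = nn.Linear(dim, 1)  # stands in for contact_head.regression

    def forward(self, x: torch.Tensor, return_contacts: bool = False) -> dict:
        reps = self.trunk(x)
        out = {"representations": reps}
        if return_contacts:
            # The head participates in the computation only on this branch.
            out["contacts"] = self.contact_head(reps)
        return out


torch.manual_seed(0)
model = TinyMSATransformer()
x = torch.randn(2, 8)

# The head's parameters are listed even though we never use them...
assert any(name.startswith("contact_head") for name, _ in model.named_parameters())

# ...but with return_contacts=False they do not affect the output:
before = model(x, return_contacts=False)
nn.init.zeros_(model.contact_head.weight)  # mangle the head's weights
nn.init.zeros_(model.contact_head.bias)
after = model(x, return_contacts=False)
assert torch.equal(before["representations"], after["representations"])
assert "contacts" not in after
```

The assertions show that the head's parameters, while present in the module, are inert for a `return_contacts=False` forward pass, so there is no need to delete the layers; their memory footprint is just the (small) parameter storage.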