LogitsConfig.__init__() got an unexpected keyword argument 'ith_hidden_layer' #186
After spending several hours comparing the source code from pip install esm with this repo, I found that installing esm3 directly via pip install esm causes this problem, because the esm package on PyPI does not seem to have been updated yet.
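(As a quick sanity check, which is my own addition and not from the tutorial, you can inspect whether the installed LogitsConfig already accepts the new argument; this assumes LogitsConfig is importable from esm.sdk.api, as in the embedding tutorial.)

import inspect
from esm.sdk.api import LogitsConfig

# If this prints False, the installed esm is the older PyPI release that
# predates ith_hidden_layer; the version in this repo accepts it.
params = inspect.signature(LogitsConfig.__init__).parameters
print("ith_hidden_layer supported:", "ith_hidden_layer" in params)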
For embedding a single sequence with ESMC_6B, you can use the following example:

# Modify the LogitsConfig and the model setting to ESMC_6B first
from esm.sdk import client
from esm.sdk.api import ESM3InferenceClient, ESMProtein, LogitsConfig, LogitsOutput
model = client(
model="esmc-6b-2024-12", url="https://forge.evolutionaryscale.ai", token=YOUR_TOKEN
)
# Suppose you want to extract the embedding of the last hidden layer (layer 80) in ESMC_6B
ESMC_6B_EMBEDDING_CONFIG = LogitsConfig(sequence=True, return_embeddings=True, return_hidden_states=True, ith_hidden_layer=79)
def embed_sequence(model: ESM3InferenceClient, sequence: str) -> LogitsOutput:
    # ESMC's error messages can be hard to track down, so uncomment the prints below to inspect each step
    protein = ESMProtein(sequence=sequence)
    # print(protein)
    protein_tensor = model.encode(protein)
    # print(protein_tensor)
    output = model.logits(protein_tensor, ESMC_6B_EMBEDDING_CONFIG)
    # print(output)
    return output
sequence="AAAAA"
logits_output = embed_sequence(model, sequence)
#print(logits_output.logits, logits_output.embeddings, logits_output.hidden_states)
# Check if the hidden_states can be successfully extracted
print(logits_output.hidden_states) I think this will solve your problem. Cheers, |
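P.S. If you need one vector per sequence rather than the full hidden state, mean-pooling over residue positions is a common choice. The sketch below is my own rough addition, not from the tutorial; it assumes hidden_states is a torch tensor whose last two dimensions are (sequence length including BOS/EOS, hidden dim), so check .shape on your side first.

hidden = logits_output.hidden_states
print(hidden.shape)  # verify the layout before pooling
# Drop the BOS/EOS positions and average over the remaining residues
per_sequence_embedding = hidden[..., 1:-1, :].mean(dim=-2)
print(per_sequence_embedding.shape)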
Thank you for the detailed instructions! The issue was resolved after I followed them. I really appreciate your help!
Happy to help! 🎉🎉🎉
How can I extract ESMC_6B embeddings for my sequences?
I tried to extract protein embeddings, following the instructions in https://github.com/evolutionaryscale/esm/blob/main/cookbook/tutorials/2_embed.ipynb.
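Roughly, what I ran, matching the traceback below, was the following (the esm.sdk.api import path is the one the tutorial uses):

from esm.sdk.api import LogitsConfig

ESMC_6B_EMBEDDING_CONFIG = LogitsConfig(return_hidden_states=True, ith_hidden_layer=55)

and it raised: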
TypeError Traceback (most recent call last)
Cell In[40], line 1
----> 1 ESMC_6B_EMBEDDING_CONFIG = LogitsConfig(return_hidden_states=True, ith_hidden_layer=55)
TypeError: LogitsConfig.__init__() got an unexpected keyword argument 'ith_hidden_layer'