
Generate a new MSA #67

Answered by tomsercu
damiano-sg asked this question in Q&A
Apr 20, 2021 · 1 comment · 6 replies

> How do I give a masked input to the model?

Yes, you'd substitute the amino acids with alphabet.mask_idx at the desired positions.
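
For concreteness, here is a minimal sketch of masking a position, assuming the fair-esm package with ESM-1b (for the MSA Transformer the token tensor has an extra alignment dimension, but the masking pattern is the same); the sequence name, sequence, and masked position below are made up for illustration:

```python
import torch
import esm

# Load a pretrained model and its alphabet (ESM-1b as an example).
model, alphabet = esm.pretrained.esm1b_t33_650M_UR50S()
model.eval()
batch_converter = alphabet.get_batch_converter()

# A made-up example sequence.
data = [("protein1", "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")]
_, _, tokens = batch_converter(data)  # shape (1, L + 2): BOS + residues + EOS

# Mask residue 5; the +1 offset skips the prepended BOS token.
pos = 5
masked_tokens = tokens.clone()
masked_tokens[0, pos + 1] = alphabet.mask_idx
```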

> Once I get the output embeddings, is the standard method to transform them back into the original tokens just to softmax the logits?

You want to look at the output of the forward pass's logits field, which is not the same as the embedding (the relationship is: logits = embedding @ model.embed_out). But yes, to get a distribution over the tokens the model predicts at that position, you'd apply F.softmax(logits, dim=-1) -- it is likely that the original token will have high probability under the model; that's how it was trained, after all.
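
A sketch of that readout, continuing the hypothetical example above (reusing `model`, `alphabet`, `masked_tokens`, and `pos` from the earlier snippet):

```python
import torch
import torch.nn.functional as F

with torch.no_grad():
    out = model(masked_tokens)

logits = out["logits"]  # shape: (1, tokens_len, vocab_size)

# Distribution over the vocabulary at the masked position
# (same +1 offset for the BOS token as when masking).
probs = F.softmax(logits[0, pos + 1], dim=-1)

pred_idx = probs.argmax().item()
print(alphabet.get_tok(pred_idx))  # most likely token under the model
```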
