Generate a new MSA #67
-
Hi, congratulations on your fantastic work!
-
Yes, you'd substitute the amino acids with `alphabet.mask_idx` at the desired positions. You want to look at the output of the forward pass's `logits` field, which is not the same as the embedding (rather, the relationship is `logits = embedding @ model.embed_out`). But yes, to get a distribution over the tokens the model predicts at that position, you'd want to do `F.softmax(logits)` -- it is likely that the original token will have high probability under the model; that's how it was trained, after all.
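A minimal sketch of that recipe, using toy numpy tensors only — in practice `embedding` and `model.embed_out` come from the ESM model's forward pass, and the masked position is wherever you placed `alphabet.mask_idx`; all shapes and values below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for a forward pass: a sequence of 8 tokens, embedding
# dim 16, and a 33-token vocabulary (random values, shapes only).
seq_len, embed_dim, vocab_size = 8, 16, 33
embedding = rng.normal(size=(seq_len, embed_dim))     # final hidden states
embed_out = rng.normal(size=(embed_dim, vocab_size))  # output projection

# The relationship described above: logits = embedding @ model.embed_out
logits = embedding @ embed_out

def softmax(x, axis=-1):
    """Plain equivalent of F.softmax, with the usual max-shift for stability."""
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

masked_pos = 3  # a position where alphabet.mask_idx was substituted
probs = softmax(logits[masked_pos])  # distribution over the vocabulary
```

`probs` is then a length-`vocab_size` vector summing to 1; its argmax is the token the model considers most likely at the masked position.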