This repository has been archived by the owner on Aug 1, 2024. It is now read-only.

Why is the input size of the embedding 33? #679

Open
fulacse opened this issue Apr 19, 2024 · 1 comment

Comments


fulacse commented Apr 19, 2024

I want to do transfer learning with the embedding, so I need the vocabulary-to-token table. Thanks!

@garykbrixi

The token table can be accessed from the `Alphabet` object. Try `alphabet.to_dict()`

https://github.com/facebookresearch/esm/blob/2b369911bb5b4b0dda914521b9475cad1656b2ac/esm/data.py#L133C2-L134C38
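To illustrate where 33 comes from, here is a minimal, self-contained sketch of how `Alphabet` in `esm/data.py` assembles its vocabulary for ESM-1b-style models (my reading of the linked code, not an official explanation): 25 residue/ambiguity tokens plus 4 prepended special tokens, padded with `<null_*>` placeholders up to a multiple of 8, then `<mask>` appended at the end. The exact `<null_*>` naming is an assumption; in practice you should call `alphabet.to_dict()` on a real `Alphabet` instance instead.

```python
# Hypothetical sketch of the ESM-1b vocabulary construction (after esm/data.py).
proteinseq_toks = list("LAGVSERTIDPKQNFYMHWCXBUZO")  # 25 residue/ambiguity tokens
prepend_toks = ["<cls>", "<pad>", "<eos>", "<unk>"]   # 4 special tokens
append_toks = ["<mask>"]

all_toks = prepend_toks + proteinseq_toks
# Pad the vocabulary to a multiple of 8 with placeholder tokens
while len(all_toks) % 8 != 0:
    all_toks.append(f"<null_{len(all_toks) + 1}>")
all_toks += append_toks

# This mapping is what alphabet.to_dict() returns on a real Alphabet object
tok_to_idx = {tok: i for i, tok in enumerate(all_toks)}
print(len(all_toks))  # 33 — the embedding input size
```

With the actual library, the equivalent would be roughly `alphabet = esm.Alphabet.from_architecture("ESM-1b"); alphabet.to_dict()`, which also yields a 33-entry table.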
