Problem with tokenizer? #63
Comments
Reverting to transformers==4.33.0 resolved the problem.

The Python version is too high; switch to Python 3.10.
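A likely explanation for why the downgrade helps, inferred from the traceback below rather than anything stated in the thread: starting with transformers 4.34, `PreTrainedTokenizer.__init__` calls `self._add_tokens(...)`, which calls `self.get_vocab()`. `ChatGLMTokenizer.__init__` only assigns `self.tokenizer` after it has called `super().__init__(...)`, so `get_vocab()` dereferences an attribute that does not exist yet. Pinning the earlier release, e.g. `pip install transformers==4.33.0`, keeps the old constructor behavior that never calls `get_vocab()` during initialization.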
I am writing to ask for your help with a problem I am having with the tokenizer; I have been trying to solve it for a while now without success.

When I construct the tokenizer in eval.py, I get the following traceback:

```
Traceback (most recent call last):
  File "/content/baby-llama2-chinese/eval.py", line 81, in <module>
    tokenizer=ChatGLMTokenizer(vocab_file='./chatglm_tokenizer/tokenizer.model')
  File "/content/baby-llama2-chinese/chatglm_tokenizer/tokenization_chatglm.py", line 68, in __init__
    super().__init__(padding_side=padding_side, clean_up_tokenization_spaces=clean_up_tokenization_spaces, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/tokenization_utils.py", line 367, in __init__
    self._add_tokens(
  File "/usr/local/lib/python3.10/dist-packages/transformers/tokenization_utils.py", line 467, in _add_tokens
    current_vocab = self.get_vocab().copy()
  File "/content/baby-llama2-chinese/chatglm_tokenizer/tokenization_chatglm.py", line 112, in get_vocab
    vocab = {self._convert_id_to_token(i): i for i in range(self.vocab_size)}
  File "/content/baby-llama2-chinese/chatglm_tokenizer/tokenization_chatglm.py", line 107, in vocab_size
    return self.tokenizer.n_words
AttributeError: 'ChatGLMTokenizer' object has no attribute 'tokenizer'. Did you mean: 'tokenize'?
```
I would be very grateful if you could help me solve this problem. I am available to answer any questions you may have.
Thank you for your time and consideration.
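If you would rather stay on a newer transformers release than pin 4.33.0, a commonly suggested fix is to reorder `ChatGLMTokenizer.__init__` so that `self.tokenizer` is assigned before `super().__init__()` runs. The sketch below is an illustration based on the layout implied by the traceback, not a verified patch; `SPTokenizer` is the SentencePiece wrapper assumed to be defined earlier in the same tokenization_chatglm.py file, and the argument list is abbreviated:

```python
# Sketch of a possible reordering in chatglm_tokenizer/tokenization_chatglm.py.
from transformers import PreTrainedTokenizer


class ChatGLMTokenizer(PreTrainedTokenizer):
    def __init__(self, vocab_file, padding_side="left",
                 clean_up_tokenization_spaces=False, **kwargs):
        self.name = "GLMTokenizer"
        self.vocab_file = vocab_file
        # Assign self.tokenizer *before* calling super().__init__():
        # transformers >= 4.34 calls self._add_tokens() -> self.get_vocab()
        # -> self.vocab_size -> self.tokenizer.n_words from inside the
        # base-class constructor, so the attribute must already exist.
        self.tokenizer = SPTokenizer(vocab_file)  # assumed helper from this file
        super().__init__(
            padding_side=padding_side,
            clean_up_tokenization_spaces=clean_up_tokenization_spaces,
            **kwargs,
        )
```

With that reordering, the call from eval.py, `tokenizer = ChatGLMTokenizer(vocab_file='./chatglm_tokenizer/tokenizer.model')`, should construct without the AttributeError.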