Error while performing Inference with llama-3.2 3B checkpoint #1749
Comments
Hey @Vattikondadheeraj. One quick thing to try: could you change … I'd also recommend giving our generate_v2 recipe a try.
Hey @SalmanMohammadi, I tested with the generate_v2 file, but I am getting another error, which I guess is related to something missing in the config file. I have attached the error below.
Hey,
I am trying to perform inference with the Llama 3.2 3B Instruct checkpoint, but I am running into some errors. I haven't dug into the code or tried to debug it yet; I am posting here in the hope of a quick resolution. I have attached the error below.
Here is my config file:
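(The reporter's config file did not survive the page capture. For orientation only, a minimal torchtune generation config for a Llama 3.2 3B Instruct checkpoint might look like the sketch below; the component paths, checkpoint file names, and `model_type` value are assumptions based on torchtune's config conventions, not the reporter's actual file.)

```yaml
# Hypothetical sketch of a torchtune generation config for Llama 3.2 3B Instruct.
# All filesystem paths are placeholders; component names follow torchtune conventions.
model:
  _component_: torchtune.models.llama3_2.llama3_2_3b

checkpointer:
  _component_: torchtune.training.FullModelHFCheckpointer
  checkpoint_dir: /path/to/Llama-3.2-3B-Instruct/
  checkpoint_files:
    - model-00001-of-00002.safetensors
    - model-00002-of-00002.safetensors
  model_type: LLAMA3_2
  output_dir: /path/to/output/

tokenizer:
  _component_: torchtune.models.llama3.llama3_tokenizer
  path: /path/to/Llama-3.2-3B-Instruct/original/tokenizer.model

device: cuda
dtype: bf16
seed: 1234

# Generation arguments
prompt: "Tell me a joke."
max_new_tokens: 300
temperature: 0.6
top_k: 300
```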