Replies: 1 comment
Hey, we just integrated better support for this. Looking at your issue, however: did you perhaps reuse the llama2 tokenizer for llama3?
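One way to catch this kind of mismatch early is to check whether the tokenizer actually defines the Llama 3 control tokens, since the Llama 2 (SentencePiece) tokenizer does not have them. This is a hypothetical sketch, not part of axolotl; `looks_like_llama3_tokenizer` is an illustrative helper, and you would pass it `tokenizer.get_vocab()` from your loaded tokenizer:

```python
# Llama 3 defines these special control tokens in its vocabulary;
# the Llama 2 tokenizer has none of them.
LLAMA3_SPECIAL_TOKENS = {
    "<|begin_of_text|>",
    "<|start_header_id|>",
    "<|end_header_id|>",
    "<|eot_id|>",
}

def looks_like_llama3_tokenizer(vocab: dict[str, int]) -> bool:
    """Return True if all Llama 3 control tokens exist in the vocab."""
    return LLAMA3_SPECIAL_TOKENS.issubset(vocab)
```

In practice you would call this with `looks_like_llama3_tokenizer(tokenizer.get_vocab())` right after loading; if it returns False while you are targeting a Llama 3 model, you have loaded the wrong tokenizer.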
Hi guys, I have customized https://github.com/OpenAccess-AI-Collective/axolotl/blob/main/src/axolotl/prompt_strategies/llama2_chat.py to work with my own chat template. Now I want to adapt it to llama3; what should I change to make it work? Right now, if I use llama2_chat directly, I get the problem shown in the image below. Do you have any suggestions?
![image](https://private-user-images.githubusercontent.com/33374938/344997301-8a47ec17-fe99-4048-afdf-9701302b1dfc.png)