How to generate Bible data for LLaMA? #18
Hi @paulocoutinhox! We may train the HF version with Bible data:
modify hf-training-example.py
run training
After such a long training run, I hope the LLaMA model will be able to supply us with new AI commandments. The prompt in hf-inference-example.py may be: "And "
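The data-prep side of "modify hf-training-example.py" could be sketched roughly like this. This is only an illustration, not the repo's actual script: the helper name `chunk_text` and the 512-word block size are assumptions, and a real run would tokenize with the model's tokenizer rather than split on whitespace:

```python
def chunk_text(text: str, block_size: int = 512):
    """Split a long corpus into fixed-size blocks for causal-LM
    fine-tuning. Splits on whitespace for simplicity; block_size
    is an assumed value, not taken from hf-training-example.py."""
    words = text.split()
    return [" ".join(words[i:i + block_size])
            for i in range(0, len(words), block_size)]

# Hypothetical usage with a Bible text file:
# corpus = open("kjv.txt", encoding="utf-8").read()
# training_blocks = chunk_text(corpus)
```

Each block would then be tokenized and fed to the training loop as one sample.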
My other question: is this algorithm just for inference, or can I use it for GPT-chat-style Bible questions?
@paulocoutinhox Chat is just an imitation of chat; really, both modes are just inference. Chat simply means the last 2048 tokens of the dialogue with LLaMA are passed as the prompt for further inference. You may easily ask questions to LLaMA using just a prompt. I'll try to add an HF chat example soon.
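The 2048-token sliding window described above can be sketched as a small helper. The turn format and the `encode` callback are placeholders for the real tokenizer, not code from this repo:

```python
def build_chat_prompt(turns, encode, max_ctx=2048):
    """Concatenate dialogue turns into one string, tokenize it with
    the supplied encode() callback, and keep only the most recent
    max_ctx tokens as the prompt for the next inference step.
    2048 matches LLaMA's context window; the "Speaker: text" turn
    format is an assumption for illustration."""
    dialogue = "\n".join(f"{speaker}: {text}" for speaker, text in turns)
    token_ids = encode(dialogue)
    return token_ids[-max_ctx:]
```

On every user message, the whole history is re-encoded and truncated from the left, so the model always sees at most the last 2048 tokens of the conversation.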
@paulocoutinhox Added a chat example for the HF version: https://github.com/randaller/llama-chat/blob/main/hf-chat-example.py
Hi,
For a more realistic scenario: if I want to input all the Bible text into LLaMA, how can I achieve that?
Example of Bible data:
https://raw.githubusercontent.com/tushortz/variety-bible-text/master/bibles/kjv.txt
Thanks.
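Loading a verse file like the one linked above into a single training corpus might look like the sketch below. The regex that strips a trailing "Book 1:1"-style reference is an assumption about the file layout; verify it against the actual kjv.txt before relying on it:

```python
import re

def load_bible_corpus(path: str) -> str:
    """Read a one-verse-per-line file into one training string,
    dropping blank lines and (assumed) trailing verse references."""
    verses = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            # Drop a trailing "<Book> <chapter>:<verse>" reference, if present.
            line = re.sub(r"\s*[A-Za-z0-9 ]+\s\d+:\d+\s*$", "", line)
            verses.append(line)
    return "\n".join(verses)
```

The returned string could then be chunked and tokenized for fine-tuning.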