Notice: this code builds on llama2.c to deploy Llama 3 for inference; convert-tokenizer-llama3.py comes from distributed-llama. An explanation of the code can be found on this page.
For a Chinese version, see 知乎.
This is my first open-source release, so please point out any errors.

1. Download the model

Download the Meta-Llama-3-8B original checkpoint (including tokenizer.model). For users in China, click here.

2. Clone the code

git clone https://github.com/guoguo1314/llama3_learn.c.git
cd llama3_learn.c

3. Convert the tokenizer

python convert-tokenizer-llama3.py tokenizer.model
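Since this project builds on llama2.c, the conversion presumably serializes Llama 3's tiktoken-based vocabulary into llama2.c's tokenizer.bin layout. A minimal sketch of that layout, assuming llama2.c's format (an int32 max_token_length, then per token a float32 score, an int32 byte length, and the raw token bytes) — the field layout is an assumption, not the repo's verified format:

```python
import struct, io

# Hedged sketch of llama2.c's tokenizer.bin layout, which
# convert-tokenizer-llama3.py presumably mirrors; an assumption
# based on llama2.c, not the repo's verified format.

def write_tokenizer_bin(f, tokens, scores):
    # int32 max token byte-length, used by the C reader to size buffers
    f.write(struct.pack("i", max(len(t) for t in tokens)))
    for tok, score in zip(tokens, scores):
        # per token: float32 score, int32 byte length, raw bytes
        f.write(struct.pack("fi", score, len(tok)))
        f.write(tok)

def read_tokenizer_bin(f):
    max_len = struct.unpack("i", f.read(4))[0]
    tokens, scores = [], []
    while (hdr := f.read(8)):
        score, n = struct.unpack("fi", hdr)
        scores.append(score)
        tokens.append(f.read(n))
    return max_len, tokens, scores

# Tiny round trip with made-up tokens.
buf = io.BytesIO()
write_tokenizer_bin(buf, [b"hi", b"hello"], [0.0, 1.0])
buf.seek(0)
max_len, toks, scs = read_tokenizer_bin(buf)
```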

4. Convert the model

python convert-llama3.py Meta-Llama-3-8B/original llama3_8.bin
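The converter presumably packs the checkpoint into a flat binary the C runtime can mmap, in the style of llama2.c: a small fixed header of int32 config fields followed by the float32 weights. A sketch of such a header round trip, assuming llama2.c's Config field order (the field names, order, and the illustrative Llama-3-8B-like values are assumptions, not the repo's verified format):

```python
import struct, io

# Hedged sketch: a llama2.c-style binary model header (seven int32
# fields) that convert-llama3.py presumably writes before the weights.
# Field names and order follow llama2.c's Config struct; assumed, not
# verified against this repo.

FIELDS = ("dim", "hidden_dim", "n_layers", "n_heads",
          "n_kv_heads", "vocab_size", "max_seq_len")

def write_header(f, cfg):
    f.write(struct.pack("7i", *(cfg[k] for k in FIELDS)))

def read_header(f):
    return dict(zip(FIELDS, struct.unpack("7i", f.read(28))))

# Round trip with illustrative Llama-3-8B-like values.
cfg = dict(dim=4096, hidden_dim=14336, n_layers=32, n_heads=32,
           n_kv_heads=8, vocab_size=128256, max_seq_len=8192)
buf = io.BytesIO()
write_header(buf, cfg)
buf.seek(0)
restored = read_header(buf)
```

On the C side, a reader can unpack these seven ints directly into a config struct and then mmap the remainder of the file as the weight tensors.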

5. Run

make run
./run llama3_8.bin

6. Acknowledgements

Finally, thanks to karpathy and b4rtaz for open-sourcing their work; I don't know whether you'll see this, but thank you.