Meta releasing a new llama model soon #2226
- llama2 is out. Is support planned?
- @TheBloke already provides GGML versions of the 7b and 13b Llama 2 variants. There is also a new 70b variant, close to what 65b was, but nothing near the former 33b. We will see whether 70b requires further work before it runs in llama.cpp, or whether it just takes more time to convert. The context size of these new models is 4k from the start, without any tricks.
- Yep, llama-2-7b-chat.ggmlv3.q4_0.bin works for me!
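  For anyone wanting to reproduce this, here is a minimal sketch of loading that same GGML file through the llama-cpp-python bindings (an assumption on my part; the thread is about llama.cpp itself, and the prompt below is only an example). The `n_ctx=4096` matches the 4k context mentioned above.

  ```python
  # Sketch only: assumes llama-cpp-python (a GGML-era build) is installed and
  # the model file sits in the working directory.
  from llama_cpp import Llama

  # Load TheBloke's q4_0 GGML file; n_ctx=4096 matches Llama 2's
  # out-of-the-box 4k context window.
  llm = Llama(model_path="llama-2-7b-chat.ggmlv3.q4_0.bin", n_ctx=4096)

  # Example prompt, just to confirm the model loads and generates.
  output = llm("Q: What is the capital of France? A:", max_tokens=32, stop=["Q:"])
  print(output["choices"][0]["text"])
  ```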
- Here's a detailed writeup from TheBloke's community Discord, if it helps clarify anything.
- There are reports going around that Meta is releasing a new and improved llama model soon. It will be "available for commercial use".

  "The competitive landscape of AI is going to completely change in the coming months, in the coming weeks maybe, when there will be open source platforms that are actually as good as the ones that are not [open source]," Meta vice-president and chief AI scientist Yann LeCun said at a conference in Aix-en-Provence last Saturday.