- Congrats!
- How quickly is it generating tokens for you?
- I was expecting it to have a 32k token context length, not just 8k. As for applications, I assume the usual: content generation.
- I have successfully deployed the Grok model and tested it on several cases. Given that it is a pre-trained model without instruction tuning, its conversational capabilities are somewhat limited. Beyond that, what other applications could it be used for?