Support running a local model #60
MrDowntempo started this conversation in Ideas
Replies: 1 comment
-
This is tricky because the current codebase only supports online APIs, so it would need a major refactor. It might take a while, but it is on the future plans.
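
For illustration, here is a minimal sketch of one shape such a refactor could take: a small backend interface that both the existing online APIs and a future local model implement, so the rest of the app doesn't care which one is active. All names here (`ChatBackend`, `OnlineApiBackend`, `LocalGgufBackend`) are hypothetical and not part of the project's actual code.

```python
# Hypothetical sketch, not the project's real architecture: one interface
# that both the current online APIs and a future local model implement.
from abc import ABC, abstractmethod

class ChatBackend(ABC):
    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return the model's reply to `prompt`."""

class OnlineApiBackend(ChatBackend):
    def __init__(self, api_key: str):
        self.api_key = api_key

    def complete(self, prompt: str) -> str:
        # Existing behavior: call the hosted API over HTTP (details omitted).
        raise NotImplementedError("wire up the current online API call here")

class LocalGgufBackend(ChatBackend):
    def __init__(self, model_path: str):
        from llama_cpp import Llama  # pip install llama-cpp-python
        self.llm = Llama(model_path=model_path, n_ctx=2048)

    def complete(self, prompt: str) -> str:
        out = self.llm(prompt, max_tokens=128)  # plain completion call
        return out["choices"][0]["text"]
```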
-
This could be done with a llama.cpp instance running a GGUF file the user downloads. There are plenty of tiny models that a phone can run, and this would work even when the internet is out.
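
As a rough sketch of the idea, this is how a downloaded GGUF model can be run through the llama-cpp-python bindings. The model filename is just an example; any small chat-tuned GGUF on local storage would work fully offline.

```python
# Sketch only: run a small local GGUF model with llama-cpp-python
# (pip install llama-cpp-python). The file path below is an example.
from llama_cpp import Llama

llm = Llama(
    model_path="models/tinyllama-1.1b-chat.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=2048,  # modest context window to fit phone-class memory
)

result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Say hello in five words."}],
    max_tokens=64,
)
print(result["choices"][0]["message"]["content"])
```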