Use Ollama API to start the right model #89
When the user changes the Ollama model, we should communicate with the Ollama API to actually change the model.
For now, we just let the user pick a model through Ollama (and removed the model selection from Void).
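For context, here is a minimal sketch of what "actually change the model" could look like against Ollama's HTTP API (illustrative only; the function name and default endpoint are assumptions, not Void's code). Ollama loads a model on demand when a request names it, and the API docs note that a generate request with no prompt simply loads that model into memory:

```ts
// Sketch only: ask the local Ollama server to load the model the user just
// picked, so the next completion uses it. Assumes the default port 11434.
export async function preloadOllamaModel(
  model: string,
  endpoint = 'http://localhost:11434'
): Promise<void> {
  const res = await fetch(`${endpoint}/api/generate`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // Per the Ollama API docs, a generate request without a prompt (or with an
    // empty prompt) loads the model into memory without generating any text.
    body: JSON.stringify({ model }),
  });
  if (!res.ok) {
    throw new Error(`Failed to load model "${model}": ${res.status} ${res.statusText}`);
  }
}
```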
@andrewpareles can I look into this?
Definitely! Just submit a PR when you're done :) |
Can you please clarify the requirements? Should we use the Ollama API to get the models that are installed and only display those models to select from?
I think starting with launching Ollama, showing the installed models, and letting the user select one will be OK. Installing a local Ollama model from Void and selecting it is the ultimate goal, I think.
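For anyone picking this up, here is a rough sketch of the "show installed models" part using Ollama's `GET /api/tags` endpoint, which lists locally installed models (the function name and types below are made up for illustration and are not Void's actual code):

```ts
// Sketch only: list the models already installed locally so the model picker
// can show just those. Assumes Ollama is running on its default port, 11434.
interface OllamaTagsResponse {
  models: { name: string; size: number; modified_at: string }[];
}

export async function listInstalledOllamaModels(
  endpoint = 'http://localhost:11434'
): Promise<string[]> {
  const res = await fetch(`${endpoint}/api/tags`);
  if (!res.ok) {
    // Most likely Ollama isn't running (or is on a non-default port).
    throw new Error(`Ollama not reachable: ${res.status} ${res.statusText}`);
  }
  const data = (await res.json()) as OllamaTagsResponse;
  return data.models.map((m) => m.name);
}

// Usage: feed the returned names into the model-selection dropdown.
// listInstalledOllamaModels().then((names) => console.log(names));
```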
Hey, I made a pull request, #140. Please review.
This issue is still open, because ideally we'd let users pick a model straight from inside Void (instead of just performing a check to make sure they have the model). If anyone wants to look into this, feel free! |
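For the "pick (and install) a model straight from inside Void" part, here is a hedged sketch of how the download could work against Ollama's `POST /api/pull` endpoint, which streams newline-delimited JSON status objects while the model downloads (the function name, progress callback, and `model` request field are assumptions for illustration; older Ollama versions used `name` instead of `model`):

```ts
// Sketch only: pull a model the user chose inside Void and report progress as
// Ollama streams status lines back. Not Void's implementation.
export async function pullOllamaModel(
  model: string,
  onProgress: (status: string) => void,
  endpoint = 'http://localhost:11434'
): Promise<void> {
  const res = await fetch(`${endpoint}/api/pull`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model }), // older Ollama versions used { name: model }
  });
  if (!res.ok || !res.body) {
    throw new Error(`Pull failed: ${res.status} ${res.statusText}`);
  }

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffered = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    // Each complete line is a JSON object such as
    // {"status":"pulling manifest"} or {"status":"success"}.
    const lines = buffered.split('\n');
    buffered = lines.pop() ?? '';
    for (const line of lines) {
      if (!line.trim()) continue;
      const msg = JSON.parse(line) as { status?: string; error?: string };
      if (msg.error) throw new Error(msg.error);
      if (msg.status) onProgress(msg.status);
    }
  }
}

// Usage: pullOllamaModel('llama3', (s) => console.log(s));
```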