
Use Ollama API to start the right model #89

Closed
jcommaret opened this issue Oct 5, 2024 · 9 comments
Labels
good first issue Good for newcomers

Comments

@jcommaret
Contributor

jcommaret commented Oct 5, 2024

No description provided.

@jcommaret jcommaret changed the title from "chore :" to "feat : start Ollama when the user select is model." Oct 5, 2024
@andrewpareles andrewpareles added the good first issue Good for newcomers label Oct 15, 2024
@jcommaret jcommaret changed the title from "feat : start Ollama when the user select is model." to "feat : start Ollama when the user select this model." Oct 15, 2024
@andrewpareles andrewpareles changed the title from "feat : start Ollama when the user select this model." to "Use Ollama API to start the right model" Oct 18, 2024
@andrewpareles
Contributor

When the user changes the Ollama model, we should communicate with the Ollama API to actually change the model.
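(For illustration, a minimal sketch of what "change the model via the API" could look like. It relies on Ollama's documented behavior that a `/api/generate` request with no prompt loads the named model into memory; the `OLLAMA_URL` constant and the `switch_model` helper name are assumptions, not anything from this thread or Void's codebase.)

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local port (assumed default setup)

def load_request(model: str) -> dict:
    """Build the /api/generate payload that preloads a model.

    Per the Ollama API docs, a generate request with no prompt
    loads the named model into memory without generating output.
    """
    return {"model": model}

def switch_model(model: str) -> None:
    # Hypothetical helper: POST the preload request so the model the
    # user just selected is warmed up and ready for the next completion.
    body = json.dumps(load_request(model)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        json.load(resp)  # response includes "done": true once loaded
```

In an editor like Void, `switch_model` would be wired to the model-picker's change event.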

@andrewpareles
Contributor

For now, we just let the user pick a model through Ollama (and we removed the model selection from Void).

@andrewpareles andrewpareles added the backlog We might come back to this later label Oct 23, 2024
@PRANJALRANA11

@andrewpareles can I look into this?

@andrewpareles
Contributor

> @andrewpareles can I look into this?

Definitely! Just submit a PR when you're done :)

@andrewpareles andrewpareles removed the backlog We might come back to this later label Oct 28, 2024
@Madhav160804
Contributor

Can you please clarify the requirements? Should we use the Ollama API to get the models that are installed and only display those models to select from,
OR,
when the user selects a model that is not present on the system, just add better error handling to show that the model doesn't exist?

@jcommaret
Contributor Author

jcommaret commented Nov 4, 2024

I think starting by launching Ollama, showing the installed models, and letting the user select one would be fine.

Installing Ollama models locally from Void and selecting them is the ultimate goal, I think.

@Madhav160804
Contributor

Hey, I made pull request #140, please review.

@andrewpareles
Contributor

This issue is still open, because ideally we'd let users pick a model straight from inside Void (instead of just performing a check to make sure they have the model). If anyone wants to look into this, feel free!

@andrewpareles
Contributor

#172 is the updated version of this.

We might want to let users pull models - that's now part of #165.
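(For illustration of the "pull models" idea, a hedged sketch against Ollama's `/api/pull` endpoint, which streams newline-delimited JSON progress lines ending in a `"status": "success"` record. Field names follow my reading of the current Ollama API docs and should be verified; the helper names are hypothetical.)

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # assumed default Ollama port

def pull_request(model: str) -> dict:
    """Payload for /api/pull; stream=True yields NDJSON progress lines."""
    return {"model": model, "stream": True}

def last_status(ndjson: str) -> str:
    """Return the final 'status' field from a streamed pull response."""
    lines = [ln for ln in ndjson.splitlines() if ln.strip()]
    return json.loads(lines[-1])["status"]

def pull_model(model: str) -> str:
    # Ask the local daemon to download a model (network call); Void could
    # surface the intermediate status lines as a progress indicator.
    body = json.dumps(pull_request(model)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/pull",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return last_status(resp.read().decode())
```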
