
--configure not giving options for local models / Ollama #106

Open
sammcj opened this issue Aug 19, 2024 · 3 comments

@sammcj

sammcj commented Aug 19, 2024

Following the readme to configure local models, --configure does not give an option to enter a local LLM endpoint / Ollama:

[Screenshots: the --configure prompt offering only Claude and OpenAI as providers]

Readme:

[Screenshot of the relevant README section on local models]

I also tried in the fancy new UI (very cool idea) and the only options there are also Claude and "Open"AI.

@ThunderSMP-Rules

I have the same problem. Did you find a fix?

@ThunderSMP-Rules

You need to uncomment some of the code in models.ts and models.py; not sure why it's commented out.

@vakandi

vakandi commented Oct 18, 2024

@ThunderSMP-Rules I just spent an hour trying to modify the code to handle local AI (Ollama), using both the UI and the terminal UI, without success, so trust me, uncommenting the lines that take care of Ollama is unfortunately far from enough.
Did you find a solution yourself?
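
For anyone digging into this: a minimal sketch of reaching a local Ollama server through its OpenAI-compatible endpoint, assuming Ollama's default port 11434 and a locally pulled model. The snippet and model name are illustrative only; this is not the project's models.py/models.ts code, and as noted above the project likely needs more wiring than just pointing a client at the endpoint.

```python
# Minimal sketch (not the project's code): Ollama exposes an OpenAI-compatible
# API at http://localhost:11434/v1, so an existing OpenAI client can target it.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # default Ollama port
    api_key="ollama",  # Ollama ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model="llama3.1",  # any model already pulled with `ollama pull`
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(response.choices[0].message.content)
```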
