@ThunderSMP-Rules I just spent an hour trying to modify the code to handle a local AI (Ollama), using both the UI and the terminal UI, without success. So trust me, uncommenting the lines that take care of Ollama is unfortunately far from enough.
Did you find a solution yourself?
Following the README to configure local models, running `--configure` does not give an option to enter a local LLM endpoint / Ollama:
Readme:
I also tried the fancy new UI (very cool idea), and the only options there are also Claude and "Open"AI.
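In case it helps anyone stuck on the same thing: Ollama exposes an OpenAI-compatible API at `http://localhost:11434/v1`, so as a workaround you can often redirect a tool's "OpenAI" provider there without any code changes, provided the tool respects the standard OpenAI environment variables. This is a sketch of that generic approach, not specific to this project's `--configure` flow; the model name and whether the tool honors these variables are assumptions you'd need to verify.

```shell
# Workaround sketch (assumes the tool uses the standard OpenAI SDK env vars):
# point the OpenAI-compatible client at a locally running Ollama server.

# Ollama's OpenAI-compatible endpoint (default port 11434).
export OPENAI_BASE_URL="http://localhost:11434/v1"

# Ollama ignores the key, but most OpenAI clients require a non-empty value.
export OPENAI_API_KEY="ollama"

# The model must already be pulled locally, e.g.:
#   ollama pull llama3
# Then select "llama3" (or your pulled model) as the model name in the tool.
```

If the tool hardcodes its base URL instead of reading `OPENAI_BASE_URL`, this won't work without patching the code, which may be exactly the problem described above.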