Add support for local mode via Ollama #6
Conversation
This initial draft is fully functional as far as I know, but I have not updated any documentation yet. We should probably update the various markdown files as well as the JupyterLab setting description for the service URL.
Adds support for running against local models by supporting the Ollama API in addition to the Qiskit Code Assistant service API. This allows users to input an Ollama API URL instead of a Qiskit Code Assistant service URL; the server extension will detect which API is set and call the correct endpoints.
@cbjuan @vabarbosa per our discussion today and Va's updates to the VS Code extension, I've made the updates to switch to using the OpenAI API endpoints. As far as I know this is good to go as far as the feature addition goes; if I understand correctly, the other issues we discussed should be handled in separate feature PRs.
Still no doc updates though; I'll need some guidance on how we want to document this support.
Just a couple of minor comments from a quick glance. I will be downloading and running it locally later today.
I was able to successfully run this and switch between a local Ollama deployment and the remote code assistant service 👍🏽
thank you!
@cbjuan I just pushed the doc updates for local mode if you want to take a look at the language and share feedback. I like how I updated the requirements section, but I'm on the fence about my wording for the JupyterLab settings section. If I understand correctly, these are the only two doc updates needed according to your comments in today's call.
Looks good to me. So far, just a minor comment about the minimal docs (not a blocker).
Thanks
Minor edit proposal
Adds support for running against local models by supporting the OpenAI API in addition to the Qiskit Code Assistant service API.
This allows users to input any OpenAI-compatible API URL, such as Ollama's, instead of a Qiskit Code Assistant service URL; the server extension will detect which API is set and call the correct endpoints.
Fixes #9
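For readers curious what "detect which API is set and call the correct endpoints" could look like, below is a minimal, hypothetical sketch of such URL probing logic in Python. It is not the extension's actual implementation: the Qiskit Code Assistant service path used in the fallback branch is a placeholder assumption, and only the OpenAI-compatible routes (`/v1/models`, `/v1/completions`), which Ollama exposes through its OpenAI compatibility layer, come from a public API.

```python
# Hypothetical sketch of URL-based API detection; not the extension's actual code.
import requests


def is_openai_compatible(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if the configured URL exposes an OpenAI-compatible API (e.g. Ollama)."""
    try:
        resp = requests.get(f"{base_url.rstrip('/')}/v1/models", timeout=timeout)
        return resp.ok
    except requests.RequestException:
        return False


def complete(base_url: str, model: str, prompt: str) -> str:
    """Request a completion, routing to whichever API the configured URL serves."""
    if is_openai_compatible(base_url):
        # Local mode: use the OpenAI-compatible completions route that Ollama provides.
        resp = requests.post(
            f"{base_url.rstrip('/')}/v1/completions",
            json={"model": model, "prompt": prompt},
            timeout=30,
        )
    else:
        # Otherwise assume the Qiskit Code Assistant service API.
        # The endpoint path below is a placeholder assumption, not the real route.
        resp = requests.post(
            f"{base_url.rstrip('/')}/model/{model}/completions",
            json={"prompt": prompt},
            timeout=30,
        )
    resp.raise_for_status()
    return resp.json()["choices"][0]["text"]
```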