
Add support for local mode via Ollama #6

Merged 7 commits into main on Nov 11, 2024

Conversation

@ajbozarth (Collaborator) commented Oct 8, 2024

Adds support for running against local models by supporting the OpenAI API in addition to the Qiskit Code Assistant service API.

This allows users to input any OpenAI-compatible API URL, such as one served by Ollama, instead of a Qiskit Code Assistant service URL; the server extension detects which API is configured and calls the correct endpoints.

Fixes #9
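The detect-and-route idea described above can be sketched roughly as follows. This is my illustration, not the extension's actual handlers.py code: the helper name and the Qiskit Code Assistant service path are assumptions, while the `/v1/completions` route does match Ollama's OpenAI-compatible API.

```python
def completion_url(base_url: str, api_type: str, model_id: str) -> str:
    """Build the completion endpoint for the detected API type (sketch only)."""
    base = base_url.rstrip("/")
    if api_type == "openai":
        # OpenAI-compatible servers such as Ollama expose /v1/completions
        return f"{base}/v1/completions"
    # placeholder path standing in for the Qiskit Code Assistant service API
    return f"{base}/model/{model_id}/prompt"
```

For example, with Ollama's default local URL, `completion_url("http://localhost:11434", "openai", "granite")` returns `http://localhost:11434/v1/completions`.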

@ajbozarth ajbozarth added the enhancement New feature or request label Oct 8, 2024
@ajbozarth ajbozarth self-assigned this Oct 8, 2024
@ajbozarth (Collaborator, Author) commented:
This initial draft is fully functional afaik, though I have not updated any documentation yet. We should probably update the various markdown files as well as the JupyterLab setting description for the service URL.

Adds support for running against local models by supporting the Ollama
API in addition to the Qiskit Code Assistant service API.

This allows users to input an Ollama API URL instead of a Qiskit Code
Assistant service URL and the server extension will detect which API
is set and call the correct endpoints.
@ajbozarth (Collaborator, Author) commented:

@cbjuan @vabarbosa per our discussion today and Va's updates to the VSCode extension I've made the updates to switch to using the OpenAI API endpoints.

Afaik this is g2g based on the feature add; IIUC the other issues we discussed should be handled in separate feature PRs.

@ajbozarth (Collaborator, Author) commented:

> Afaik this is g2g based on the feature add

Still no doc updates though, I'll need some guidance on how we want to document this support

@vabarbosa (Collaborator) left a comment:


Just a couple of minor comments from a quick glimpse; I will be downloading and running it locally later today.

Two review threads on qiskit_code_assistant_jupyterlab/handlers.py (outdated, resolved)
@vabarbosa (Collaborator) left a comment:


I was able to successfully run this and switch between a local Ollama deployment and the remote code assistant service 👍🏽
Thank you!

@ajbozarth (Collaborator, Author) commented:

@cbjuan I just pushed the doc updates for local mode if you want to take a look at the language and share feedback. I like how I updated the requirements section, but I'm on the fence about my wording for the JupyterLab settings section. IIUC these are the only two doc updates needed according to your comments in today's call.
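For readers following along: pointing the extension at a local Ollama deployment means entering its base URL in the service URL setting. A hypothetical settings fragment (the key name is illustrative, not the extension's actual setting key; 11434 is Ollama's default port):

```json
{
  "serviceUrl": "http://localhost:11434"
}
```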

@cbjuan (Member) left a comment:


Looks good to me. So far just a minor comment about the minimal docs (not a blocker)

Thanks

Review thread on README.md (outdated, resolved)
@cbjuan (Member) left a comment:


Minor edit proposal

Two review threads on README.md (outdated, resolved)
@ajbozarth ajbozarth merged commit 8b7d3a7 into main Nov 11, 2024
6 checks passed
@ajbozarth ajbozarth deleted the ollama branch November 11, 2024 23:24
Labels: enhancement (New feature or request)
Linked issue (may be closed by merging): Enable the extension to work in local mode
3 participants