
generator: Ollama #876

Merged
merged 12 commits into leondz:main on Sep 30, 2024

Conversation

martinebl
Contributor

Add two Ollama generators, using either the chat or generate function from the Ollama package.
Tests are included; they are skipped when no Ollama server can be found.
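As a rough sketch of the difference between the two call styles the PR's generators wrap (assuming the official `ollama` Python package; the helper names and model name below are illustrative, not taken from the PR):

```python
# Hypothetical helpers showing how the two Ollama API styles are fed:
# chat() takes a message list, generate() takes a raw prompt string.

def build_chat_messages(prompt: str) -> list[dict]:
    """Wrap a plain prompt into the message list that ollama.chat() expects."""
    return [{"role": "user", "content": prompt}]


def build_generate_kwargs(model: str, prompt: str) -> dict:
    """Keyword arguments for ollama.generate(), which takes a raw prompt."""
    return {"model": model, "prompt": prompt}


# With a local Ollama server running, the two generators would differ roughly as:
#   ollama.chat(model="llama3", messages=build_chat_messages("Hello"))
#   ollama.generate(**build_generate_kwargs("llama3", "Hello"))
```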
Contributor

github-actions bot commented Sep 3, 2024

DCO Assistant Lite bot: All contributors have signed the DCO ✍️ ✅

@martinebl
Contributor Author

I have read the DCO Document and I hereby sign the DCO

@martinebl
Contributor Author

recheck

github-actions bot added a commit that referenced this pull request Sep 3, 2024
@leondz leondz added labels: generators (Interfaces with LLMs), new plugin (Describes an entirely new probe, detector, generator or harness) Sep 4, 2024
@leondz
Owner

leondz commented Sep 4, 2024

Thanks for this! Will take a look.

Collaborator

@jmartin-tech jmartin-tech left a comment


Looks pretty good; this should work for a local service, and the change to access instance parameters will enable config.

Tests need a bit of work to have proper value.
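The skip-when-unreachable behavior the PR describes could be sketched like this (a minimal illustration, not the PR's actual test code; it assumes Ollama's default port 11434 on localhost):

```python
# Probe whether an Ollama server appears to be listening before running tests.
import socket


def ollama_reachable(host: str = "127.0.0.1", port: int = 11434) -> bool:
    """Return True if something is listening on the given host/port."""
    try:
        with socket.create_connection((host, port), timeout=1):
            return True
    except OSError:
        return False


# With pytest, the check above becomes a module- or test-level skip marker:
#   @pytest.mark.skipif(not ollama_reachable(), reason="no Ollama server found")
#   def test_generator_smoke():
#       ...  # instantiate the generator and call it
```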

Review threads (outdated, resolved) on:
garak/generators/ollama.py (×4)
tests/generators/test_ollama.py
@leondz leondz linked an issue Sep 13, 2024 that may be closed by this pull request
@leondz leondz changed the title from "Ollama generator" to "generator: Ollama" Sep 24, 2024
@jmartin-tech jmartin-tech merged commit aadc9aa into leondz:main Sep 30, 2024
8 checks passed
@github-actions github-actions bot locked and limited conversation to collaborators Sep 30, 2024
Labels
generators (Interfaces with LLMs), new plugin (Describes an entirely new probe, detector, generator or harness)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

generator: ollama
3 participants