
Ollama AI #1299

Open

wants to merge 1 commit into main
Conversation

github-actions[bot]
Contributor

Description

This extension lets your project easily send requests to an Ollama AI server and receive its responses.

How to host your own server

  1. Go to https://ollama.com/download
  2. Choose your platform
  3. Install Ollama following the instructions for your operating system
  4. Open your command prompt and run: ollama pull llama3 (Note: llama3 is the newest model and the one I recommend, but you can select any other model offered on the Ollama website; just replace llama3 with any other model name listed at https://ollama.com/library)
  5. Once the download is done, you can start the server and run the model with ollama run llama3 (Note: again, you can use any model name you have installed instead of llama3)
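To check that the server from the steps above is actually running, you can query it over HTTP. This is a minimal sketch assuming Ollama's default port 11434 and its documented /api/tags endpoint, which lists the models you have pulled:

```python
import json
import urllib.request

def tags_url(base_url):
    """Normalize the base URL and append Ollama's /api/tags path."""
    return base_url.rstrip("/") + "/api/tags"

def list_models(base_url="http://localhost:11434"):
    """Return the names of all models installed on the server."""
    with urllib.request.urlopen(tags_url(base_url)) as resp:
        data = json.loads(resp.read())
    return [m["name"] for m in data.get("models", [])]

# Example (requires a running server):
# print(list_models())
```

If the call fails with a connection error, the server is not up yet; if it succeeds but your model is missing from the list, rerun the ollama pull step.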

If you are stuck, NetworkChuck has a really helpful video explaining everything you need to know about hosting your own server; I followed it to get mine working: https://youtu.be/Wjrdr0NU4Sk?t=182

How to customize models

You can read the official documentation for this on the official Ollama GitHub repo: https://github.com/ollama/ollama?tab=readme-ov-file#customize-a-prompt
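As a quick illustration of what that documentation covers, a custom model is defined in a plain-text Modelfile. The sketch below follows the example in the Ollama README; the persona and prompt text are just illustrative:

```
FROM llama3

# Higher temperature gives more creative answers, lower gives more coherent ones
PARAMETER temperature 1

# The system prompt that shapes every response
SYSTEM """
You are Mario from Super Mario Bros. Answer as Mario, the assistant, only.
"""
```

You would then build and run it with ollama create mario -f ./Modelfile followed by ollama run mario.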

How to use the extension

Create a simple action that sends the following data to an Ollama AI server:

  • URL (The server's URL, including the port)
  • Model (The model that should generate the response)
  • Prompt (The prompt you want the server to reply to)
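The three values above map directly onto Ollama's documented /api/generate endpoint. This is a minimal sketch, independent of the extension itself, showing the request the action boils down to; the model name and prompt are placeholders:

```python
import json
import urllib.request

def build_payload(model, prompt):
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    body = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(body).encode("utf-8")

def ask_ollama(url, model, prompt):
    """POST the prompt to the server and return the generated text."""
    req = urllib.request.Request(
        url.rstrip("/") + "/api/generate",
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running server):
# print(ask_ollama("http://localhost:11434", "llama3", "Why is the sky blue?"))
```

With "stream": False the server returns a single JSON object containing the full response, which is the simplest shape for an extension action to consume.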

Here is an example action, which I also used in the example project:

(screenshot of the example action)

Caution

The speed of the response generation depends on the hardware the server is hosted on.

Checklist

  • I've followed all of the best practices.
  • I confirm that this extension can be integrated into this GitHub repository, distributed, and MIT licensed.
  • I am aware that the extension may be updated by anyone, and that my explicit consent is not needed for that.

What tier of review do you aim for your extension?

Reviewed

Example file

OllamaAIExample.zip

Extension file

OllamaAI.zip

@github-actions github-actions bot added the ✨ New extension A new extension label May 28, 2024
@github-actions github-actions bot requested a review from a team as a code owner May 28, 2024 01:58
@github-actions github-actions bot mentioned this pull request May 28, 2024