
Enhanced Model Management and SystemPrompt Integration in OllamaAgent #351

Open
cnupy wants to merge 10 commits into main

Conversation

@cnupy commented Mar 5, 2025

PR Summary

Feature 1: Support for Multiple Models

Description

OllamaAgent can now handle and switch between multiple models seamlessly, listing the available models dynamically via the Ollama API.

Implementation Details

  • Modified the agent to include a model selection mechanism that allows it to iterate through the available models.
  • Introduced a `model` command to manage switching between the available models (see the sketch below).
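
A minimal sketch of the listing/switching flow, assuming OllamaSharp's `OllamaApiClient` (the client type and its `ListLocalModelsAsync`/`SelectedModel` members are OllamaSharp's API; the endpoint URL and model tag are example values, and the PR's actual command code may differ):

```csharp
using OllamaSharp;

// Point the client at a local Ollama server.
var client = new OllamaApiClient(new Uri("http://localhost:11434"));

// Ask the server which models are available locally.
var models = await client.ListLocalModelsAsync();
foreach (var model in models)
{
    Console.WriteLine(model.Name);
}

// Switching models is a matter of re-targeting the client.
client.SelectedModel = "llama3.1:8b-instruct-q5_K_M";
```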

Feature 2: Integration of SystemPrompt

Description

A SystemPrompt can now be configured alongside the selected model in OllamaAgent, providing users with a toolkit for task-oriented conversations.

Implementation Details

  • Modified the agent to accommodate Ollama's SystemPrompt functionality.
  • Introduced a `system-prompt` command to manage the setting (see the sketch below).
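
On the wire, this corresponds to the `system` field of Ollama's generate endpoint. Here is a sketch of the equivalent raw request (the `/api/generate` endpoint and its `model`/`prompt`/`system`/`stream` fields come from Ollama's documented REST API; the concrete values are illustrative):

```csharp
using System.Net.Http;
using System.Net.Http.Json;

var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

// The optional "system" field overrides the system prompt baked into the
// model's Modelfile for this request; stream = false returns one JSON
// object instead of a token stream.
var response = await http.PostAsJsonAsync("/api/generate", new
{
    model = "llama3.1:8b-instruct-q5_K_M",
    prompt = "Show the three largest files in the current directory.",
    system = "You are a PowerShell expert. Reply with a single command only.",
    stream = false
});

Console.WriteLine(await response.Content.ReadAsStringAsync());
```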

Feature 3: Support for Predefined ModelConfig Sets

Description

OllamaAgent now supports the use of predefined model configurations, allowing users to easily switch between different models and system prompts based on specific requirements.

Implementation Details

  • Introduced a `ModelConfig` record to encapsulate the data for each configuration.
  • Introduced a `config` command to switch between the available predefined configurations (see the sketch below).
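
A sketch of what such a record could look like (the field names below are illustrative assumptions, not necessarily the PR's exact definition):

```csharp
// One predefined model/system-prompt pairing; the `config` command
// switches the agent between instances of this record.
internal record ModelConfig(
    string Name,          // preset name shown to the user
    string ModelName,     // Ollama model tag, e.g. "phi4-mini:3.8b-q4_K_M"
    string SystemPrompt,  // system prompt applied while this preset is active
    string Description);  // short help text shown when listing presets
```

Because C# records are immutable and compare by value, switching presets is a simple reference swap, with no risk of one preset's state leaking into another.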

PR Context

This pull request enhances OllamaAgent with improved model management. By allowing seamless switching between multiple models and supporting predefined configurations, it offers users greater flexibility. The addition of SystemPrompt support gives users a more task-oriented conversational toolkit and streamlines interactions. Together, these updates improve the user experience and the agent's adaptability to diverse requirements, making OllamaAgent a more versatile and powerful tool overall.


@daxian-dbw (Member) commented:

@kborowinski @cnupy After upgrading to Ollama v0.5.13, the OllamaAgent stops working, even from the main branch ... After sending a query, it just spins forever, and after cancelling the request, the server side displays the following record:

[screenshot: server-side log record]

I moved to OllamaSharp v5.1.4, but it's the same result. Any idea what could be the problem?

@kborowinski (Contributor) commented Mar 11, 2025

@daxian-dbw I'm on the latest AIShell build from main and Ollama v0.5.13 with llama3.1:8b-instruct-q5_K_M, and it works. What model are you running? Has anything changed on the computer where you run AIShell, such as drivers? What does your config look like?

[animation: AIShell working with llama3.1:8b-instruct-q5_K_M]

And this is a different computer, with a different graphics card and the model phi4-mini:3.8b-q4_K_M, and it works as well:

[animation: AIShell working with phi4-mini:3.8b-q4_K_M]

Try stopping all Ollama processes and then starting Ollama again:

Get-Process ollama* | Stop-Process -Force

@cnupy (Author) commented Mar 11, 2025

Everything's working fine for me as well. I'm using Ollama 0.5.4:

/llm/ollama# ./ollama -v
ggml_sycl_init: found 1 SYCL devices:
ollama version is 0.5.4-ipexllm-20250310

@daxian-dbw (Member) left a comment:

Sorry for the delay!
The ollama agent from the main branch works for me now (I have no idea why it didn't weeks earlier; I guess it was something specific to my system). I left some comments, but haven't finished reviewing the Command.cs file. I will finish it tomorrow.

@daxian-dbw (Member) left a comment:

Done with my 2nd pass of review :)

@daxian-dbw (Member) commented Apr 2, 2025

@cnupy Thanks for the quick updates. Now only the 4 remaining comments are relevant (I left follow-up comments). All the rest looks good.

@cnupy (Author) commented Apr 5, 2025

> @cnupy Thanks for the quick updates. Now only the 4 remaining comments are relevant (I left follow-up comments). All the rest looks good.

I've resolved the four remaining comments. Additionally, I renamed the `Configs` property to `Presets` for better clarity. I've also carried out some additional refactoring to properly implement the running-model check. Furthermore, I moved `PerformSelfcheck` from `OllamaAgent` to `Settings` and optimized it so that it can be used in both places. Let me know if there's anything else you'd like adjusted.
