Adding Ollama to Autogen / Magentic One #4333
Conversation
@afourney @ekzhu @jackgerrits @husseinmozannar Recreated the pull request as requested. Here is the original pull request for reference: #4280
```python
    top_p: float = 0.95


class OllamaChatCompletionClient(ChatCompletionClient):
    def __init__(
```
Add a docstring like the other clients.
Also, instead of using `OllamaConfig`, can we use keyword arguments like the other clients?
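A minimal sketch of what a keyword-argument constructor could look like. The parameter names (`model`, `host`, `temperature`) are assumptions for illustration, not the actual autogen-ext API:

```python
# Hypothetical sketch: keyword arguments instead of a config object.
# Parameter names below are assumptions, not the real autogen-ext API.
class OllamaChatCompletionClient:
    def __init__(
        self,
        model: str,
        *,
        host: str = "http://localhost:11434",
        temperature: float = 0.7,
        top_p: float = 0.95,
        **kwargs,
    ):
        self.model = model
        self.host = host
        self.temperature = temperature
        self.top_p = top_p
        self.kwargs = kwargs


client = OllamaChatCompletionClient("llama3", temperature=0.2)
```

This mirrors how the other model clients in the repo take plain keyword arguments, so callers do not need to construct a separate config object first.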
```python
        messages: List,
        *,
        response_format: Optional[Dict[str, str]] = None,
        stream: bool = False,
```
The `create` method in the protocol has the following arguments:

```python
async def create(
    self,
    messages: Sequence[LLMMessage],
    tools: Sequence[Tool | ToolSchema] = [],
    # None means do not override the default
    # A value means to override the client default - often specified in the constructor
    json_output: Optional[bool] = None,
    extra_create_args: Mapping[str, Any] = {},
    cancellation_token: Optional[CancellationToken] = None,
) -> CreateResult: ...
```
This method should implement the protocol method, rather than introducing its own arguments.
The return value must be `CreateResult`.
```python
            json_output=True,
        )


    def extract_role_and_content(self, msg) -> tuple[str, Union[str, List[Union[str, Image]]]]:
```
Let's make this a private method, prefixed with `_`.
```python
        else:
            return 'user', str(msg)


    def process_message_content(self, content):
```
Make this private.
```python
            text_parts.append(str(item))
        return '\n'.join(text_parts), images


    def encode_image(self, image: Image) -> Optional[str]:
```
Make this private.
```python
            return AssistantMessage(
                content=f"Error: Failed to get response from Ollama server: {str(e)}",
                source='assistant'
            )
```
Missing the following methods:

```python
def actual_usage(self) -> RequestUsage: ...
def total_usage(self) -> RequestUsage: ...
def count_tokens(self, messages: Sequence[LLMMessage], tools: Sequence[Tool | ToolSchema] = []) -> int: ...
def remaining_tokens(self, messages: Sequence[LLMMessage], tools: Sequence[Tool | ToolSchema] = []) -> int: ...

@property
def capabilities(self) -> ModelCapabilities: ...
```
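A sketch of how those missing methods could be filled in, with simplified stand-in types. The whitespace-split token count is an assumption for illustration only; a real client would use the model's tokenizer:

```python
from dataclasses import dataclass
from typing import Any, Sequence


@dataclass
class RequestUsage:  # simplified stand-in for the autogen-core type
    prompt_tokens: int = 0
    completion_tokens: int = 0


@dataclass
class ModelCapabilities:  # simplified stand-in for the autogen-core type
    vision: bool = False
    function_calling: bool = False
    json_output: bool = False


class OllamaChatCompletionClient:
    def __init__(self, token_limit: int = 4096):
        self._token_limit = token_limit
        self._actual_usage = RequestUsage()
        self._total_usage = RequestUsage()
        self._capabilities = ModelCapabilities()

    def actual_usage(self) -> RequestUsage:
        # Usage of the most recent create() call.
        return self._actual_usage

    def total_usage(self) -> RequestUsage:
        # Usage accumulated across all calls.
        return self._total_usage

    def count_tokens(self, messages: Sequence[Any], tools: Sequence[Any] = ()) -> int:
        # Naive estimate: one token per whitespace-separated word.
        return sum(len(str(m).split()) for m in messages)

    def remaining_tokens(self, messages: Sequence[Any], tools: Sequence[Any] = ()) -> int:
        return self._token_limit - self.count_tokens(messages, tools)

    @property
    def capabilities(self) -> ModelCapabilities:
        return self._capabilities
```

Having these lets downstream code (e.g. usage reporting and context-window management) treat this client the same as the existing ones.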
```python
        )
        self.kwargs = kwargs

        self.model_capabilities = ModelCapabilities(
```
Model capabilities need to be set from the input, not hardcoded.
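One way this could look, sketched with a simplified `ModelCapabilities` stand-in (the real type comes from autogen-core): the caller passes the capabilities in, and the constructor stores them instead of hardcoding values.

```python
from dataclasses import dataclass


@dataclass
class ModelCapabilities:  # simplified stand-in for the autogen-core type
    vision: bool = False
    function_calling: bool = False
    json_output: bool = False


class OllamaChatCompletionClient:
    def __init__(self, *, model_capabilities: ModelCapabilities, **kwargs):
        # Capabilities come from the caller, since they vary per Ollama model
        # (e.g. llava supports vision, llama3 does not).
        self.model_capabilities = model_capabilities
        self.kwargs = kwargs


client = OllamaChatCompletionClient(
    model_capabilities=ModelCapabilities(vision=True)
)
```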
d186a41
What is happening? Why are there no commits, and why was this PR closed?
I don't understand. I am sure I did not close it, but the logs here say otherwise.
@MervinPraison I think you might want to create a new branch on your repo, and create a PR from there. BTW can you also address the comments first?
But how did this one get merged by me? This is the one I merged: #4329
I think the issue is that @MervinPraison needs to create a branch first on their fork of autogen, then create a PR comparing that branch to microsoft:main. It must be some weird behavior from GitHub that got this closed @colombod
Why are these changes needed?
Related Issue Number
Checks