fix: remove repeated completion assignment in llm.py #5167
Conversation
@rbren Hi rb, can you have a look at this PR?
openhands/llm/llm.py (Outdated)

    if self.is_function_calling_active():
        logger.debug('LLM: model supports function calling')

    self._completion = partial(
This assignment isn't a duplicate, I don't think? If you feel it is, could you point out where the other one is?
The logging lines are duplicates, though, and it's nice to clean them out, thanks!
https://github.com/All-Hands-AI/OpenHands/blob/main/openhands/llm/llm.py#L111
https://github.com/All-Hands-AI/OpenHands/blob/main/openhands/llm/llm.py#L132
These two lines do the same assignment. I don't know what the point is; maybe I should open an issue first?
Oh, you're right, this is so weird. I have a guess... the init_model_info call is necessary for the variables assigned in the partial; some wires must have got crossed because of that.
There's no need for an issue, sorry.
What do you think about this hypothesis? Could you please check the order, e.g. that we initialize model info at the right time?
> What do you think about this hypothesis? Could you please check the order, e.g. that we initialize model info at the right time?
I checked the init phase in self.init_model_info() and saw that self.config.max_output_tokens is assigned a new value in init_model_info() after it has already been passed to the partial. We need to reorder init_model_info() and the partial assignment.
BTW, I don't think partial changes any variable outside its own scope, so the first partial assignment is actually unused.
We can ask @rbren for confirmation, he submitted this commit.
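To illustrate the behavior, here is a minimal standalone sketch (not the actual llm.py code): functools.partial captures argument values at creation time, so reassigning the config attribute afterwards has no effect on a partial that was already built.

```python
from functools import partial


def complete(prompt, max_output_tokens=None):
    # stands in for the wrapped completion call
    return f'max_output_tokens={max_output_tokens}'


class Config:
    # stands in for self.config; max_output_tokens starts unset
    max_output_tokens = None


config = Config()

# Partial created BEFORE the value is initialized: it keeps None forever.
early = partial(complete, max_output_tokens=config.max_output_tokens)

# Stands in for init_model_info() filling the value in later.
config.max_output_tokens = 4096

# Partial created AFTER initialization sees the updated value.
late = partial(complete, max_output_tokens=config.max_output_tokens)

print(early('hi'))  # max_output_tokens=None
print(late('hi'))   # max_output_tokens=4096
```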
Force-pushed from e35e430 to 399e271
@enyst I reordered init_model_info() and the partial so that self.config.max_output_tokens is assigned before it is used in the partial. Besides, I'm wondering whether we should assign a default number (e.g. 4096) to max_output_tokens in init_model_info() when it is None.
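Roughly, the new ordering behaves like this toy example (hypothetical names such as ToyLLM and _raw_completion, not a verbatim copy of llm.py):

```python
from functools import partial


def _raw_completion(prompt, max_tokens=None):
    # stands in for the underlying liteLLM completion call
    return {'prompt': prompt, 'max_tokens': max_tokens}


class ToyLLM:
    def __init__(self):
        self.max_output_tokens = None

        # Fix: run the model-info initialization BEFORE building the partial,
        # so the partial captures the final value, and build it only once.
        self.init_model_info()
        self._completion = partial(
            _raw_completion, max_tokens=self.max_output_tokens
        )

    def init_model_info(self):
        # stands in for reading the limit from liteLLM's model info
        self.max_output_tokens = 4096


llm = ToyLLM()
print(llm._completion('hello'))  # {'prompt': 'hello', 'max_tokens': 4096}
```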
Thank you! This looks to me like a funky copy-paste, thanks a lot for handling it!
That's a good question. At the time of setting the value in the dataclass (None) we don't know if liteLLM will give us something... but I do agree it's weird. I'd suggest it doesn't need to be part of this PR, but if you want to propose a PR changing that, we can look into it.
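For reference, a follow-up change along those lines might look something like this sketch (purely hypothetical: the resolve_max_output_tokens helper and the 4096 default are assumptions, and none of this is part of this PR):

```python
DEFAULT_MAX_OUTPUT_TOKENS = 4096  # assumed default, taken from the discussion


def resolve_max_output_tokens(configured, model_info):
    """Prefer the explicit config value, then liteLLM's model info, then a default."""
    if configured is not None:
        return configured
    if model_info and model_info.get('max_output_tokens'):
        return model_info['max_output_tokens']
    return DEFAULT_MAX_OUTPUT_TOKENS
```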
Thanks a lot, I will send another PR to discuss parameter settings in the LLM config.
This duplicate code was introduced here: https://github.com/All-Hands-AI/OpenHands/pull/4722/files#diff-b46ec9dd4b319d75809520e34b2559fd2ce46ca4895709eae9312aa04a8fa9a; delete it
Force-pushed from 399e271 to d1f8249
End-user friendly description of the problem this fixes or functionality that this introduces
Give a summary of what the PR does, explaining any non-trivial design decisions
This duplicate code was introduced here: https://github.com/All-Hands-AI/OpenHands/pull/4722/files#diff-b46ec9dd4b319d75809520e34b2559fd2ce46ca4895709eae9312aa04a8fa9a4L151. I don't know what its purpose is, so I'm sending a PR to delete it.
Link of any specific issues this addresses