response.text_or_raise() workaround
Closes #632
simonw committed Nov 14, 2024
1 parent 3b6e734 commit cf172cc
Showing 3 changed files with 12 additions and 2 deletions.
5 changes: 4 additions & 1 deletion docs/plugins/advanced-model-plugins.md
````diff
@@ -162,5 +162,8 @@ for prev_response in conversation.responses:
     messages.append(
         {"role": "user", "content": prev_response.prompt.prompt}
     )
-    messages.append({"role": "assistant", "content": prev_response.text()})
+    messages.append({"role": "assistant", "content": prev_response.text_or_raise()})
 ```
+The `response.text_or_raise()` method used there will return the text from the response or raise a `ValueError` exception if the response is an `AsyncResponse` instance that has not yet been fully resolved.
+
+This is a slightly weird hack to work around the common need to share logic for building up the `messages` list across both sync and async models.
````
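The sync/async sharing problem this solves can be sketched with hypothetical stand-in classes (apart from `text_or_raise()` itself, none of these names are the library's real internals):

```python
# Minimal sketch of why a shared message-building helper needs a
# synchronous text_or_raise(): the sync path always has complete text,
# but the async path may not have been awaited yet.

class SyncResponse:
    """Hypothetical stand-in for a resolved synchronous response."""

    def __init__(self, chunks):
        self._chunks = list(chunks)

    def text_or_raise(self) -> str:
        # Sync responses are always complete, so this never raises
        return "".join(self._chunks)


class AsyncResponse:
    """Hypothetical stand-in for a not-yet-resolved async response."""

    def __init__(self):
        self._chunks = []
        self._done = False

    async def _force(self):
        # Pretend to consume the streamed response
        self._chunks.append("hello")
        self._done = True
        return self

    def text_or_raise(self) -> str:
        if not self._done:
            raise ValueError("Response not yet awaited")
        return "".join(self._chunks)


def build_messages(responses):
    # Shared across sync and async models: this works because
    # text_or_raise() is a plain method that never needs awaiting.
    return [
        {"role": "assistant", "content": r.text_or_raise()}
        for r in responses
    ]


print(build_messages([SyncResponse(["hi ", "there"])]))
# [{'role': 'assistant', 'content': 'hi there'}]
```

Calling `build_messages` on an unresolved `AsyncResponse` raises the `ValueError` instead of silently returning partial text.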
4 changes: 3 additions & 1 deletion llm/default_plugins/openai_models.py
```diff
@@ -375,7 +375,9 @@ def build_messages(self, prompt, conversation):
            messages.append(
                {"role": "user", "content": prev_response.prompt.prompt}
            )
-            messages.append({"role": "assistant", "content": prev_response.text()})
+            messages.append(
+                {"role": "assistant", "content": prev_response.text_or_raise()}
+            )
        if prompt.system and prompt.system != current_system:
            messages.append({"role": "system", "content": prompt.system})
        if not prompt.attachments:
```
5 changes: 5 additions & 0 deletions llm/models.py
```diff
@@ -399,6 +399,11 @@ async def _force(self):
            pass
        return self
 
+    def text_or_raise(self) -> str:
+        if not self._done:
+            raise ValueError("Response not yet awaited")
+        return "".join(self._chunks)
+
    async def text(self) -> str:
        await self._force()
        return "".join(self._chunks)
```
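The lifecycle the new method enforces can be illustrated with a quick sketch (`FakeAsyncResponse` is a stand-in that mirrors only the `_done`/`_chunks`/`_force` shape, not the real class):

```python
import asyncio


class FakeAsyncResponse:
    """Stand-in mirroring only the _done/_chunks/_force shape."""

    def __init__(self, chunks):
        self._chunks = list(chunks)
        self._done = False

    async def _force(self):
        # The real method would consume the streamed response here
        self._done = True
        return self

    def text_or_raise(self) -> str:
        if not self._done:
            raise ValueError("Response not yet awaited")
        return "".join(self._chunks)


resp = FakeAsyncResponse(["Hello, ", "world"])

try:
    resp.text_or_raise()  # not resolved yet
except ValueError as ex:
    print(ex)  # Response not yet awaited

asyncio.run(resp._force())
print(resp.text_or_raise())  # Hello, world
```

Before the response is awaited the call raises; after `_force()` has run it returns the joined chunks, which is exactly what the sync-style `build_messages` code path relies on.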
