How to log the whole request that is going to be sent to OpenAI? #2331
Answered by jackgerrits
WebsheetPlugin asked this question in Q&A
I currently get some weird 500 error responses from OpenAI, so I wanted to see what exactly AutoGen sends to OpenAI in the request. Would anyone be able to help me with some instructions on how to do this? I use: self.user_proxy.initiate_chat(
Answered by jackgerrits on Apr 9, 2024
Replies: 1 comment 1 reply
Check this out for runtime logging: https://microsoft.github.io/autogen/docs/notebooks/agentchat_logging
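
For reference, here is a minimal sketch of what the runtime logging flow from that notebook looks like: start logging before `initiate_chat`, stop afterwards, and then read the logged request payloads back out of the SQLite database. The agents, message, and config below are placeholders, and the `chat_completions` table / `request` column names come from the SQLite logger and may differ across AutoGen versions.

```python
import json
import sqlite3

import autogen

# Placeholder LLM config; replace with your own config_list / API key.
llm_config = {"config_list": [{"model": "gpt-4", "api_key": "sk-..."}]}

# Start runtime logging; the session id lets you filter this run's rows later.
logging_session_id = autogen.runtime_logging.start(config={"dbname": "logs.db"})

assistant = autogen.AssistantAgent("assistant", llm_config=llm_config)
user_proxy = autogen.UserProxyAgent(
    "user_proxy", human_input_mode="NEVER", code_execution_config=False
)
user_proxy.initiate_chat(assistant, message="2 + 2 = ?")

autogen.runtime_logging.stop()

# Each logged chat completion row stores the full request body sent to OpenAI.
con = sqlite3.connect("logs.db")
rows = con.execute(
    "SELECT request FROM chat_completions WHERE session_id = ?",
    (logging_session_id,),
).fetchall()
for (request,) in rows:
    print(json.dumps(json.loads(request), indent=2))
con.close()
```

Printing the `request` column this way shows the exact messages, model, and parameters that were sent, which should help pin down what is triggering the 500 errors.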
Answer selected by WebsheetPlugin