[Bug] Can't use a proxy URL for OpenAI's models; DeepSeek gives the same error
#2063
Labels
bug
Something isn't working
Description
Can't use a proxy URL for OpenAI's models; DeepSeek fails with the same error.

DeepSeek code:
```python
from agno.agent import Agent
from agno.models.deepseek import DeepSeek


def main():
    # Configure the DeepSeek model
    model = DeepSeek(
        id="deepseek-chat",  # use the deepseek-chat model
        base_url="https://api.siliconflow.cn/v1/chat/completions",
        api_key="sk-ipzuesonubpoaepzb........",  # replace with your API key
    )
    # Run a simple prompt through an Agent to trigger the error
    agent = Agent(model=model)
    agent.print_response("Share a 2 sentence horror story")


if __name__ == "__main__":
    main()
```
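For context, the OpenAI-side failure comes from running `cookbook/models/openai/basic.py` with the model pointed at a proxy. A minimal sketch of that setup, assuming `base_url`/`api_key` overrides on `OpenAIChat` (the proxy URL, key, and model id below are placeholders, not taken from the original report):

```python
from agno.agent import Agent
from agno.models.openai import OpenAIChat

# Sketch only: point the OpenAI model at a proxy endpoint via base_url.
agent = Agent(
    model=OpenAIChat(
        id="gpt-4o",                             # placeholder model id
        base_url="https://my-proxy.example/v1",  # placeholder proxy URL
        api_key="sk-...",                        # placeholder key
    ),
    markdown=True,
)

agent.print_response("Share a 2 sentence horror story")
```

Running this against a proxy that does not accept the `developer` role produces the traceback below.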
Error with OpenAI's proxy URL:

```
/Users/XXX/miniforge3/envs/agno/bin/python /Users/shiyalun/project/shengshi/opensource/agno/cookbook/models/openai/basic.py
▰▰▰▰▰▱▱ Thinking...
Traceback (most recent call last):
  File "/Users/XXX/project/shengshi/opensource/agno/cookbook/models/openai/basic.py", line 11, in <module>
    agent.print_response("Share a 2 sentence horror story")
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/agno/agent/agent.py", line 3381, in print_response
    run_response = self.run(
        message=message,
        ...<5 lines>...
        **kwargs,
    )
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/agno/agent/agent.py", line 869, in run
    return next(resp)
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/agno/agent/agent.py", line 592, in _run
    model_response = self.model.response(messages=run_messages.messages)
  File "/Users/shiyalun/miniforge3/envs/agno/lib/python3.13/site-packages/agno/models/openai/chat.py", line 484, in response
    response: Union[ChatCompletion, ParsedChatCompletion] = self.invoke(messages=messages)
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/agno/models/openai/chat.py", line 295, in invoke
    return self.get_client().chat.completions.create(
        model=self.id,
        messages=[self.format_message(m) for m in messages],  # type: ignore
        **self.request_kwargs,
    )
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/openai/_utils/_utils.py", line 279, in wrapper
    return func(*args, **kwargs)
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/openai/resources/chat/completions.py", line 863, in create
    return self._post(
        "/chat/completions",
        ...<40 lines>...
        stream_cls=Stream[ChatCompletionChunk],
    )
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/openai/_base_client.py", line 1283, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/openai/_base_client.py", line 960, in request
    return self._request(
        cast_to=cast_to,
        ...<3 lines>...
        retries_taken=retries_taken,
    )
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/openai/_base_client.py", line 1064, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'developer' with this model. (request id: 20250210113713924624709jwGzahet) (request id: 20250210113713922472260p6XKvJvC) (request id: )", 'type': 'invalid_request_error'}}
```
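The 400 indicates the proxy rejects `developer` as `messages[0].role`, i.e. the role sent for the system message. A minimal sketch that checks this directly with the `openai` SDK against the same proxy (the base URL without the `/chat/completions` suffix, the model id, and the key are assumptions, not from the report):

```python
from openai import OpenAI

# Hypothetical direct check against the proxy (not part of the original report).
client = OpenAI(
    base_url="https://api.siliconflow.cn/v1",  # assumed base URL; the SDK appends /chat/completions
    api_key="sk-...",                          # placeholder key
)

# Send the same first message once with role "developer" and once with "system".
for role in ("developer", "system"):
    try:
        resp = client.chat.completions.create(
            model="deepseek-chat",  # placeholder model id
            messages=[
                {"role": role, "content": "You are a helpful assistant."},
                {"role": "user", "content": "Share a 2 sentence horror story"},
            ],
        )
        print(role, "->", resp.choices[0].message.content[:60])
    except Exception as exc:
        print(role, "->", exc)
```

If `system` succeeds where `developer` fails, the problem is the role mapping applied before the request reaches the proxy.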