

[Bug] Can't use a proxy URL for OpenAI models; DeepSeek gives the same error #2063

Open
moshilangzi opened this issue Feb 10, 2025 · 1 comment
Labels
bug Something isn't working

Comments

@moshilangzi

Description

[Bug] Can't use a proxy URL for OpenAI models; DeepSeek gives the same error

deepseek-r1 code:

```python
from agno.agent import Agent
from agno.models.deepseek import DeepSeek

def main():
    # Configure the DeepSeek model
    model = DeepSeek(
        id="deepseek-chat",  # use the deepseek-chat model
        base_url="https://api.siliconflow.cn/v1/chat/completions",
        api_key="sk-ipzuesonubpoaepzb........",  # replace with your API key
    )

    # Create the Agent
    agent = Agent(
        model=model,
        description="I am a helpful AI assistant powered by DeepSeek.",
        markdown=True,  # enable markdown-formatted output
    )

    # Test a conversation
    agent.print_response("你好!请介绍一下你自己。")  # "Hello! Please introduce yourself."

if __name__ == "__main__":
    main()
```

errors:

Traceback (most recent call last):
  File "/Users/XXX/project/shengshi/opensource/agno/examples/local_r1.py", line 23, in <module>
    main()
    ~~~~^^
  File "/Users/XXX/project/shengshi/opensource/agno/examples/local_r1.py", line 20, in main
    agent.print_response("你好!请介绍一下你自己。")
    ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/agno/agent/agent.py", line 3381, in print_response
    run_response = self.run(
        message=message,
    ...<5 lines>...
        **kwargs,
    )
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/agno/agent/agent.py", line 869, in run
    return next(resp)
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/agno/agent/agent.py", line 592, in _run
    model_response = self.model.response(messages=run_messages.messages)
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/agno/models/openai/chat.py", line 484, in response
    response: Union[ChatCompletion, ParsedChatCompletion] = self.invoke(messages=messages)
                                                            ~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/agno/models/openai/chat.py", line 295, in invoke
    return self.get_client().chat.completions.create(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        model=self.id,
        ^^^^^^^^^^^^^^
        messages=[self.format_message(m) for m in messages],  # type: ignore
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        **self.request_kwargs,
        ^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/openai/_utils/_utils.py", line 279, in wrapper
    return func(*args, **kwargs)
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/openai/resources/chat/completions.py", line 863, in create
    return self._post(
           ~~~~~~~~~~^
        "/chat/completions",
        ^^^^^^^^^^^^^^^^^^^^
    ...<40 lines>...
        stream_cls=Stream[ChatCompletionChunk],
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/openai/_base_client.py", line 1283, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/openai/_base_client.py", line 960, in request
    return self._request(
           ~~~~~~~~~~~~~^
        cast_to=cast_to,
        ^^^^^^^^^^^^^^^^
    ...<3 lines>...
        retries_taken=retries_taken,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/openai/_base_client.py", line 1064, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: 404 page not found
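The 404 is likely caused by the `base_url` above: the OpenAI SDK appends the endpoint path (e.g. `/chat/completions`) to `base_url` itself, so passing the full endpoint URL doubles the path. A minimal sketch of how the request URL ends up wrong (the `build_url` helper is illustrative, not the SDK's actual internals):

```python
# Sketch: the OpenAI client builds request URLs as base_url + endpoint path.
# Passing the full endpoint as base_url therefore doubles the path and the
# proxy returns "404 page not found".
def build_url(base_url: str, path: str) -> str:
    # Illustrative join; mirrors the SDK's behavior conceptually.
    return base_url.rstrip("/") + "/" + path.lstrip("/")

wrong = build_url("https://api.siliconflow.cn/v1/chat/completions", "/chat/completions")
right = build_url("https://api.siliconflow.cn/v1", "/chat/completions")

print(wrong)  # .../v1/chat/completions/chat/completions -> 404
print(right)  # .../v1/chat/completions
```

With this in mind, `base_url` should be the API root (ending in `/v1`), not the full completions endpoint.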

For OpenAI's proxy URL, the error is:
/Users/XXX/miniforge3/envs/agno/bin/python /Users/XXX/project/shengshi/opensource/agno/cookbook/models/openai/basic.py

Traceback (most recent call last):
  File "/Users/XXX/project/shengshi/opensource/agno/cookbook/models/openai/basic.py", line 11, in <module>
    agent.print_response("Share a 2 sentence horror story")
  ... (same call chain through agno and openai as in the traceback above) ...
  File "/Users/XXX/miniforge3/envs/agno/lib/python3.13/site-packages/openai/_base_client.py", line 1064, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'developer' with this model. (request id: 20250210113713924624709jwGzahet) (request id: 20250210113713922472260p6XKvJvC) (request id: )", 'type': 'invalid_request_error'}}
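This 400 is a separate problem: newer OpenAI-compatible stacks may send the system prompt with the role `developer`, which this proxy's model rejects. One generic workaround idea is to remap the role before the request is sent (a sketch only; `remap_roles` is a hypothetical helper, not agno's API):

```python
# Sketch: remap the unsupported "developer" role back to "system" in a
# chat-completions message list before sending it to a proxy that only
# accepts the classic roles.
def remap_roles(messages):
    return [
        {**m, "role": "system"} if m.get("role") == "developer" else m
        for m in messages
    ]

msgs = [
    {"role": "developer", "content": "You are helpful."},
    {"role": "user", "content": "Hi"},
]
print(remap_roles(msgs)[0]["role"])  # system
```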

@moshilangzi moshilangzi added the bug Something isn't working label Feb 10, 2025
@pritipsingh
Copy link
Contributor

Hello @moshilangzi,

I hope you're doing well. Sorry about that, but it seems there are two issues here:

1) It appears that the URL being passed may not be correct. Could you kindly verify it?

2) To access these models, you can use the OpenAILike model by replacing the base_url.
For more details on the OpenAILike model, please refer to this documentation:

https://docs.agno.com/models/openai-like
