<|im_start|>system
You are an agent designed to interact with a SQL database.
# Tools
You may call one or more functions to assist with the user query.
You are provided with function signatures within <tools></tools> XML tags:
<tools>
{"type": "function", "function": {"name": "sql_db_list_tables", "description": "输入为空字符串,输出为数据库中表名的逗号分隔列表。", "parameters": {"properties": {"tool_input": {"default": "", "description": "An empty string", "type": "string"}}, "type": "object"}}}</tools>
For each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:
<tool_call>
{"name": <function-name>, "arguments": <args-json-object>}
</tool_call><|im_end|>
<|im_start|>user
XXXX<|im_end|>
<|im_start|>assistant
generated tool call (extracted from response text stream)
I found that the model generated the correct tokens: {"name": "sql_db_list_tables", "arguments": {}}.
This issue is related to vLLM: vllm-project/vllm#11392
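For illustration, here is a minimal sketch of how such a tool call can be recovered from the accumulated response text, assuming the `<tool_call>...</tool_call>` format from the system prompt above (the `raw_text` value below is a placeholder standing in for the concatenated streamed text):

```python
import json
import re

# Placeholder for the text accumulated from the streamed deltas; the tool call
# follows the <tool_call>...</tool_call> format defined in the system prompt.
raw_text = '<tool_call>\n{"name": "sql_db_list_tables", "arguments": {}}\n</tool_call>'

match = re.search(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", raw_text, re.DOTALL)
if match:
    tool_call = json.loads(match.group(1))
    print(tool_call["name"], tool_call["arguments"])  # sql_db_list_tables {}
```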
This issue has been automatically marked as inactive due to lack of recent activity. Should you believe it remains unresolved and warrants attention, kindly leave a comment on this thread.
Model Series
Qwen2.5
What are the models used?
Qwen2.5-7B-Instruct
What is the scenario where the problem happened?
deployment with vllm, tool calling with stream=True
Is this a known issue?
Information about environment
OS: Ubuntu 20.04
Python: Python 3.11.5
GPUs: 2 x NVIDIA A20
NVIDIA driver: 525.85.12
CUDA compiler: 12.0
PyTorch: 2.5.1+cu124
Log output
Description
Steps to reproduce
HTTP request payload
generated prompt
generated tool call (extracted from response text stream)
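The exact payload is not reproduced above; as a rough reproduction sketch (assuming vLLM's OpenAI-compatible server at http://localhost:8000/v1, the openai Python client, and a placeholder user message), the streaming request and the collection of the response could look like this:

```python
from openai import OpenAI

# Assumptions: vLLM OpenAI-compatible server on localhost:8000 and a served
# model named "Qwen2.5-7B-Instruct"; the user message is a placeholder.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

tools = [{
    "type": "function",
    "function": {
        "name": "sql_db_list_tables",
        "description": "Input is an empty string; output is a comma-separated "
                       "list of table names in the database.",
        "parameters": {
            "type": "object",
            "properties": {
                "tool_input": {"type": "string", "default": "",
                               "description": "An empty string"}
            },
        },
    },
}]

stream = client.chat.completions.create(
    model="Qwen2.5-7B-Instruct",
    messages=[{"role": "user", "content": "List the tables in the database."}],
    tools=tools,
    stream=True,
)

text_parts, arg_parts = [], []
for chunk in stream:
    if not chunk.choices:
        continue
    delta = chunk.choices[0].delta
    if delta.content:
        text_parts.append(delta.content)                 # raw generated text
    if delta.tool_calls:
        for tc in delta.tool_calls:
            if tc.function and tc.function.arguments:
                arg_parts.append(tc.function.arguments)  # parsed argument fragments

print("raw text:", "".join(text_parts))
print("parsed tool-call arguments:", "".join(arg_parts))
```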
Expected results (tool call portion)
Per the tool definition, the result should be:
With stream=False, the model will return:
The response is usable, but it does not exactly match the tool definition.
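As a possible client-side workaround (a minimal sketch, not part of the original report), missing arguments could be filled from the defaults declared in the tool schema before the call is dispatched; the helper name and the schema table below are illustrative:

```python
# Illustrative: fill missing arguments from the defaults declared in the tool
# schema before dispatching the call.
TOOL_SCHEMAS = {
    "sql_db_list_tables": {
        "type": "object",
        "properties": {"tool_input": {"type": "string", "default": ""}},
    }
}

def normalize_tool_call(tool_call: dict) -> dict:
    schema = TOOL_SCHEMAS.get(tool_call["name"], {})
    args = dict(tool_call.get("arguments") or {})
    for param, spec in schema.get("properties", {}).items():
        if param not in args and "default" in spec:
            args[param] = spec["default"]  # e.g. fill tool_input="" when omitted
    return {"name": tool_call["name"], "arguments": args}

print(normalize_tool_call({"name": "sql_db_list_tables", "arguments": {}}))
# -> {'name': 'sql_db_list_tables', 'arguments': {'tool_input': ''}}
```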