feat: Fix VLLM tool_choice (#1001)
mattzh72 authored Feb 14, 2025
1 parent b353fee commit 745c78a
Showing 1 changed file with 2 additions and 1 deletion.
letta/llm_api/llm_api_tools.py (2 additions, 1 deletion)
@@ -151,7 +151,8 @@ def create(
     if function_call is None and functions is not None and len(functions) > 0:
         # force function calling for reliability, see https://platform.openai.com/docs/api-reference/chat/create#chat-create-tool_choice
         # TODO(matt) move into LLMConfig
-        if llm_config.model_endpoint == "https://inference.memgpt.ai":
+        # TODO: This vllm checking is very brittle and is a patch at most
+        if llm_config.model_endpoint == "https://inference.memgpt.ai" or (llm_config.handle and "vllm" in llm_config.handle):
             function_call = "auto"  # TODO change to "required" once proxy supports it
         else:
             function_call = "required"
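For readers skimming the hunk, below is a minimal, self-contained sketch of the tool_choice defaulting logic after this commit. It is not the library's actual code path: LLMConfig is reduced to the two fields the check reads, resolve_function_call is a hypothetical helper name (the real logic lives inline in letta/llm_api/llm_api_tools.py's create function), and the endpoint and handle values in the usage example are made-up illustrations.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class LLMConfig:
        # Hypothetical stand-in for letta's LLMConfig; only the two
        # fields read by this check are modeled here.
        model_endpoint: str
        handle: Optional[str] = None

    def resolve_function_call(llm_config, function_call, functions):
        """Sketch of the defaulting logic this commit patches."""
        if function_call is None and functions is not None and len(functions) > 0:
            # The memgpt.ai proxy (and, after this commit, any handle
            # containing "vllm") does not yet support
            # tool_choice="required", so fall back to "auto" there.
            if llm_config.model_endpoint == "https://inference.memgpt.ai" or (
                llm_config.handle and "vllm" in llm_config.handle
            ):
                return "auto"
            return "required"
        return function_call

    # Example (made-up values): a vLLM-backed handle now resolves to
    # "auto" instead of "required".
    cfg = LLMConfig(model_endpoint="http://localhost:8000/v1", handle="vllm/llama-3")
    print(resolve_function_call(cfg, None, [{"name": "send_message"}]))  # -> "auto"

As the new TODO notes, substring-matching the handle is a brittle stopgap; a per-endpoint capability flag on LLMConfig would be the cleaner fix.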
