Avoid calling/sending data to LLM with native plugin and function calling #6581
-
We are using SK version 1.13 with Azure OpenAI. We have two native plugins (P1 and P2), each with one function/method (M1 and M2). P1M1 gets called through function calling and fetches JSON data from an internal REST service. After P1M1 is called, we need to avoid sending its data to the LLM, without preventing a call to P2M2 if the orchestration requires it. Is there a way to avoid sending data to the LLM while still continuing successfully with function calling and other calls where necessary (given that the kernel's chat-history context is built from the tool and assistant messages prior to the LLM call)? This can apply to any number of chained function calls. We just want to avoid sending data to the LLM at our own discretion.
-
@EVENFLOW212 Could you please try to register an auto function invocation filter and terminate the invocation loop after your function runs? Basically, your plugins/functions will be invoked, but the function result won't be sent to the LLM; instead, it will be returned to the place where you started the execution. Please let us know if this helps to resolve your scenario. Thanks!
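A minimal sketch of the filter approach, assuming the .NET Semantic Kernel `IAutoFunctionInvocationFilter` API available in SK 1.x (class and registration names here are illustrative):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;

// Hypothetical filter: lets P1M1 run, then terminates the auto function
// calling loop so the result is returned to the caller instead of being
// sent back to the LLM.
public sealed class StopAfterP1M1Filter : IAutoFunctionInvocationFilter
{
    public async Task OnAutoFunctionInvocationAsync(
        AutoFunctionInvocationContext context,
        Func<AutoFunctionInvocationContext, Task> next)
    {
        // Invoke the function (e.g. P1's M1) as usual.
        await next(context);

        // Decide at your own discretion whether the result may go to the LLM.
        if (context.Function.PluginName == "P1" && context.Function.Name == "M1")
        {
            // Result is returned to where execution started; it is not
            // appended to chat history and sent to the model.
            context.Terminate = true;
        }
    }
}

// Registration on the kernel builder (sketch):
// builder.Services.AddSingleton<IAutoFunctionInvocationFilter, StopAfterP1M1Filter>();
```

The condition inside the filter is where "our discretion" lives: functions whose results are safe to share simply fall through without setting `Terminate`.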
@EVENFLOW212 In this case, you send a request to the LLM asking which functions to call, get back a result with the list of function calls, and execute them in order manually; that should be it. Without the initial request to the LLM, you won't know which functions to call, so that first request is required.
With manual function calling, after P1M1 gets called there should not be any requests to the LLM in between, unless your P1M1 function performs a request to the LLM on its own.
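The manual flow described above can be sketched roughly as follows, assuming the SK 1.x .NET OpenAI connector APIs (`ToolCallBehavior.EnableKernelFunctions`, `GetOpenAIFunctionToolCalls`, `TryGetFunctionAndArguments`) and a `kernel` already configured with Azure OpenAI and the P1/P2 plugins:

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Ask the model which functions to call, but do NOT auto-invoke them.
OpenAIPromptExecutionSettings settings = new()
{
    ToolCallBehavior = ToolCallBehavior.EnableKernelFunctions
};

var chatService = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddUserMessage(userRequest); // userRequest: your prompt text

var result = (OpenAIChatMessageContent)await chatService
    .GetChatMessageContentAsync(history, settings, kernel);

// Execute the returned tool calls yourself. No LLM round-trips happen in
// between, and nothing is sent back to the model unless you add it.
foreach (var toolCall in result.GetOpenAIFunctionToolCalls())
{
    if (kernel.Plugins.TryGetFunctionAndArguments(
            toolCall, out var function, out var arguments))
    {
        var functionResult = await function.InvokeAsync(kernel, arguments);
        // Inspect functionResult here and decide whether to append it to
        // `history` for a follow-up LLM call, or keep it local.
    }
}
```

This keeps the single required "which functions?" request to the LLM while leaving every subsequent data flow under your control.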