# Python Function Calling

In this article we'll cover how to call Python functions using `func-ai`.

## Pre-requisites

Before you begin, make sure you have the following:

- `func-ai` installed (`pip install func-ai`)
- An OpenAI API key set in the `OPENAI_API_KEY` environment variable (you can place a `.env` file in the current
  directory with `OPENAI_API_KEY=<your-api-key>` and then call `load_dotenv()` to load the environment variables from the
  file)

## Calling Python functions using OpenAI API

First, let's define a Python function we want to call using an LLM:

```python
def add_two_numbers(a: int, b: int) -> int:
    """
    Adds two numbers

    :param a: The first number
    :param b: The second number
    :return: The sum of the two numbers
    """
    return a + b
```

A few key points about how functions we want to expose to LLMs should be defined:

- The function MUST have type hints for all parameters and the return value. This helps LLMs understand what the
  function does and how to call it.
- The function MUST have a docstring. The docstring, and in particular the description, is used by the LLM to identify
  the function to call.
- The function docstring MUST document the parameters and their descriptions. This helps LLMs understand what
  parameters the function takes and what they are used for.

Now let's convert the above function so that it can be called using OpenAI's function calling capability:

```python
from func_ai.utils.py_function_parser import func_to_json

_json_fun = func_to_json(add_two_numbers)
```

In the above snippet we use `func_to_json` to convert the Python function to a dictionary that can be passed to the
OpenAI API.
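
The exact dictionary `func_to_json` produces may vary by `func-ai` version, but conceptually it maps the function's signature and docstring onto OpenAI's documented function-calling schema. The stdlib-only sketch below illustrates that idea; `simple_func_to_json` and its type table are hypothetical simplifications, not `func-ai`'s actual implementation:

```python
import inspect

# Illustrative subset of Python-to-JSON-Schema type mapping.
TYPE_MAP = {int: "integer", float: "number", str: "string", bool: "boolean"}


def simple_func_to_json(func) -> dict:
    """Naive sketch of converting a function into an OpenAI function schema."""
    sig = inspect.signature(func)
    properties = {
        name: {"type": TYPE_MAP.get(param.annotation, "string")}
        for name, param in sig.parameters.items()
    }
    return {
        "name": func.__name__,
        # First docstring line serves as the description the LLM sees.
        "description": (inspect.getdoc(func) or "").split("\n")[0],
        "parameters": {
            "type": "object",
            "properties": properties,
            # Parameters without defaults are required.
            "required": [n for n, p in sig.parameters.items()
                         if p.default is inspect.Parameter.empty],
        },
    }


def add_two_numbers(a: int, b: int) -> int:
    """
    Adds two numbers

    :param a: The first number
    :param b: The second number
    :return: The sum of the two numbers
    """
    return a + b


schema = simple_func_to_json(add_two_numbers)
print(schema["name"])                    # add_two_numbers
print(schema["parameters"]["required"])  # ['a', 'b']
```

Note how the type hints and docstring from the earlier guidelines feed directly into the schema; a function missing either would produce a schema the LLM cannot reason about.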

Now let's do some prompting to see how the function can be called:

```python
import openai
import json
from dotenv import load_dotenv

load_dotenv()


def call_openai(_messages, _functions: list = None):
    if _functions:
        _open_ai_resp = openai.ChatCompletion.create(
            model="gpt-3.5-turbo-0613",
            messages=_messages,
            functions=_functions,
            function_call="auto",
            temperature=0.0,
            top_p=1.0,
            frequency_penalty=0.0,
            presence_penalty=0.0,
            max_tokens=256,
        )
    else:
        _open_ai_resp = openai.ChatCompletion.create(
            model="gpt-3.5-turbo-0613",
            messages=_messages,
            temperature=0.5,
            top_p=1.0,
            frequency_penalty=0.0,
            presence_penalty=0.0,
            max_tokens=256,
        )
    return _open_ai_resp["choices"][0]["message"]


_messages = [{"role": "system",
              "content": "You are a helpful automation system that helps users to perform a variety of supported tasks."},
             {"role": "user", "content": "I want to add 5 and 10"}]
_functions = [_json_fun]
response = call_openai(_messages, _functions)
if "function_call" in response:
    _result = add_two_numbers(**json.loads(response["function_call"]["arguments"]))
    print(f"Result: {_result}")
    _function_call_llm_response = {
        "role": "function",
        "name": response["function_call"]["name"],
        "content": f"Result: {_result}",
    }
    _messages.append(_function_call_llm_response)
    print(call_openai(_messages))
```

The above snippet will print the following:

```text
Result: 15
{
  "role": "assistant",
  "content": "The sum of 5 and 10 is 15."
}
```

Let's break down the above snippet:

- First, we define a function `call_openai` that takes a list of messages and a list of functions. It uses the
  `openai.ChatCompletion.create` API to call OpenAI and get a response.
- Next, we define the list of messages to send to OpenAI. The first is a system message that describes what the system
  does; the second is a user message that tells the system what the user wants to do.
- Next, we define the list of functions we want to expose to OpenAI. In this case there is only one.
- Next, we call `call_openai` with the messages and functions and store the response in the `response` variable.
- Next, we check whether the response contains a `function_call` key. If it does, OpenAI has decided to call our
  function, and the `function_call` key holds the function name and arguments.
- Next, we invoke the function with the returned arguments and print the result.
- Next, we create a new message containing the result of the function call and append it to the list of messages.
- Finally, we call `call_openai` again with the updated list of messages. This time OpenAI responds with a
  natural-language message that incorporates the result of the function call.
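
The `function_call` handling in the snippet above is the part worth generalizing: once you expose more than one function, a registry keyed by function name avoids hard-coding which Python function to invoke. A minimal sketch (the `FUNCTIONS` registry and `dispatch` helper here are illustrative, not part of `func-ai`):

```python
import json


def add_two_numbers(a: int, b: int) -> int:
    """Adds two numbers"""
    return a + b


# Registry of exposed functions, keyed by the name OpenAI returns.
FUNCTIONS = {"add_two_numbers": add_two_numbers}


def dispatch(function_call: dict):
    """Route an OpenAI function_call payload to the matching Python function."""
    func = FUNCTIONS[function_call["name"]]
    # OpenAI returns the arguments as a JSON-encoded string.
    return func(**json.loads(function_call["arguments"]))


print(dispatch({"name": "add_two_numbers", "arguments": '{"a": 5, "b": 10}'}))  # 15
```

This is also a natural place to add validation: the model can hallucinate function names or malformed arguments, so production code should handle a missing registry key and a `json.JSONDecodeError` rather than crash.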

!!! note "Non-Production Example"

    The above is a naive example of how you can use the `func-ai` library to convert your Python functions and use them
    with OpenAI. `func-ai` offers much more advanced mechanisms to help you build production-ready code. Please check
    the other articles in the documentation to learn more, or get in touch [with us](mailto:[email protected]) if you need help.

## Working with `functools.partial`

Python's `functools` library offers the ability to create partial functions with some of the parameters already set.
This is particularly useful when you have a static parameter you want to configure, a sensitive parameter such as a
secret, or a state object (e.g. a DB connection), in which case you either cannot or do not want to send that info to
OpenAI. `partial` to the rescue!
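
As a quick refresher on `partial` itself, here is a plain-Python sketch, independent of `func-ai`:

```python
from functools import partial


def query_db(db_driver: str, query: str) -> str:
    """Queries the database (illustrative stand-in for a real driver call)."""
    return f"Querying {db_driver} with query {query}"


# Bind db_driver once; callers (and the LLM) only ever supply `query`.
mysql_query = partial(query_db, db_driver="MySQL")
print(mysql_query(query="SELECT 1"))  # Querying MySQL with query SELECT 1
```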

Let's create a new function called `query_db` where we want our DB driver to be a fixed parameter and not passed to the
LLM:

> Note: We assume that the `call_openai` function is already defined as per the previous example.

```python
from functools import partial
from func_ai.utils.py_function_parser import func_to_json
import json


def query_db(db_driver: object, query: str) -> str:
    """
    Queries the database

    :param db_driver: The database driver to use
    :param query: The query to execute
    :return: The result of the query
    """
    return f"Querying {db_driver} with query {query}"


_partial_fun = partial(query_db, db_driver="MySQL")
_json_fun = func_to_json(_partial_fun)
_messages = [{"role": "system",
              "content": "You are a helpful automation system that helps users to perform a variety of supported tasks."},
             {"role": "user", "content": "Query the db for quarterly sales."}]
_functions = [_json_fun]
response = call_openai(_messages, _functions)
if "function_call" in response:
    _result = _partial_fun(**json.loads(response["function_call"]["arguments"]))
    print(f"Result: {_result}")
    _function_call_llm_response = {
        "role": "function",
        "name": response["function_call"]["name"],
        "content": f"Result: {_result}",
    }
    _messages.append(_function_call_llm_response)
    print(call_openai(_messages))
```

The above snippet will print the following:

```text
Result: Querying MySQL with query SELECT * FROM sales WHERE date >= '2021-01-01' AND date <= '2021-12-31'
{
  "role": "assistant",
  "content": "Here are the quarterly sales for the year 2021:\n\n1st Quarter: $XXX\n2nd Quarter: $XXX\n3rd Quarter: $XXX\n4th Quarter: $XXX\n\nPlease let me know if there's anything else I can assist you with!"
}
```

The example above is very similar to the previous one, except that this time we have fixed the `db_driver` parameter.
This adds an important layer of security and privacy, especially when experimenting with LLMs on the open internet.
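
You can check for yourself that the bound parameter is no longer required: `inspect.signature` understands partials, which is what lets a converter like `func_to_json` leave pre-bound parameters out of the schema sent to OpenAI. A stdlib-only sketch of that check:

```python
import inspect
from functools import partial


def query_db(db_driver: object, query: str) -> str:
    """Queries the database"""
    return f"Querying {db_driver} with query {query}"


bound = partial(query_db, db_driver="MySQL")
sig = inspect.signature(bound)

# db_driver now carries a default, so only `query` remains required.
required = [name for name, p in sig.parameters.items()
            if p.default is inspect.Parameter.empty]
print(required)                               # ['query']
print(sig.parameters["db_driver"].default)    # MySQL
```

Since `db_driver` has a default, a schema generator that treats defaulted parameters as optional (or drops them entirely) never asks the LLM to supply it.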

## Function Wrapper

`func-ai` also offers a function wrapper that you can use to wrap your functions and expose them to OpenAI. The wrapper
takes care of all the heavy lifting for you. Here is a very short example of how you can use the wrapper:

```python
from dotenv import load_dotenv
from func_ai.utils import OpenAIFunctionWrapper, OpenAIInterface

load_dotenv()


def say_hello(name: str):
    """
    This is a function that says hello to the user

    :param name: Name of the person to say hello to
    :return:
    """
    print(f"Hello {name}!")


_func_wrap = OpenAIFunctionWrapper.from_python_function(say_hello, OpenAIInterface())

_func_wrap.from_prompt("Say hello to John")
```

The above snippet will print the following:

```text
Hello John!
```

Let's break down the above snippet:

- First, we import the `load_dotenv` function from the `dotenv` library. This is used to load the environment variables
  from the `.env` file.
- Next, we import the `OpenAIFunctionWrapper` and `OpenAIInterface` classes from the `func_ai.utils` module.
- Next, we define a function called `say_hello` that takes a `name` parameter and prints `Hello {name}!` to the console.
- Next, we create an instance of the `OpenAIFunctionWrapper` class by calling the `from_python_function` method and
  passing in the `say_hello` function and an instance of the `OpenAIInterface` class.
- Finally, we call the `from_prompt` method on the `OpenAIFunctionWrapper` instance and pass in the prompt that we want
  to send to OpenAI.
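
Conceptually, such a wrapper bundles the generated schema together with the callable, so a single object can both be advertised to OpenAI and execute the resulting call. The stdlib-only sketch below illustrates that idea; `SimpleFunctionWrapper` is a hypothetical stand-in, not `func-ai`'s actual `OpenAIFunctionWrapper`:

```python
import inspect
import json


class SimpleFunctionWrapper:
    """Illustrative stand-in for the idea behind a function wrapper."""

    def __init__(self, func):
        self.func = func
        self.name = func.__name__
        # First docstring line becomes the description the LLM sees.
        self.description = (inspect.getdoc(func) or "").split("\n")[0]

    def schema(self) -> dict:
        """Schema to advertise to the OpenAI API (parameters omitted for brevity)."""
        return {"name": self.name, "description": self.description,
                "parameters": {"type": "object", "properties": {}}}

    def call(self, arguments: str):
        """Execute the wrapped function from a JSON-encoded argument string."""
        return self.func(**json.loads(arguments))


def say_hello(name: str):
    """This is a function that says hello to the user"""
    return f"Hello {name}!"


wrapper = SimpleFunctionWrapper(say_hello)
print(wrapper.schema()["name"])           # say_hello
print(wrapper.call('{"name": "John"}'))   # Hello John!
```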

It is also possible to use partials with the wrapper, like so:

```python
from functools import partial

_func_wrap = OpenAIFunctionWrapper.from_python_function(partial(say_hello, name="World"), OpenAIInterface())

_func_wrap.from_prompt("Say hello")
```

The above snippet will print the following:

```text
Hello World!
```

!!! note "Further Examples"

    For more examples, check the Jupyter notebooks in the `tests/jupyter/` folder.