feat(vertex-ai): function calling: convert markdown 'samples' to python #12811

Open · wants to merge 14 commits into base: main
Conversation

@Valeriy-Burlaka (Member) commented Dec 2, 2024

Description

Convert Markdown 'samples' for Function Calling Guides & API Reference pages:

  • The entire "How to create a function calling application" section (a tutorial-style series of steps that together compile into a function-calling application)
  • The "Example syntax" section for the API Reference page.

Checklist

@Valeriy-Burlaka Valeriy-Burlaka self-assigned this Dec 2, 2024
@Valeriy-Burlaka Valeriy-Burlaka requested review from a team as code owners December 2, 2024 15:21
@Valeriy-Burlaka Valeriy-Burlaka marked this pull request as draft December 2, 2024 15:21

snippet-bot bot commented Dec 2, 2024

Here is the summary of changes.

You are about to add 9 region tags.

This comment is generated by snippet-bot.
If you find problems with this result, please file an issue at:
https://github.com/googleapis/repo-automation-bots/issues.

@product-auto-label product-auto-label bot added the samples Issues that are directly related to samples. label Dec 2, 2024
@@ -39,7 +39,7 @@ def generate_function_call() -> GenerationResponse:
     vertexai.init(project=PROJECT_ID, location="us-central1")

     # Initialize Gemini model
-    model = GenerativeModel("gemini-1.5-flash-002")
+    model = GenerativeModel("gemini-1.5-pro-002")
@Valeriy-Burlaka (Member Author) commented Dec 2, 2024:

Gemini Flash no longer works with this code sample (see the log):

  File "/workspace/generative_ai/function_calling/test_function_calling.py", line 31, in test_function_calling
    response = basic_example.generate_function_call()
  File "/workspace/generative_ai/function_calling/basic_example.py", line 75, in generate_function_call
    function_call = response.candidates[0].function_calls[0]
IndexError: list index out of range

The actual response from the model is now:

candidates {
  content {
    role: "model"
    parts {
      text: "I cannot answer this question. The available function `get_current_weather` has no implementation."
    }
  }
  avg_logprobs: -0.17872051000595093
  finish_reason: STOP
}
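As an aside, a minimal sketch (not part of this PR) of a guard that would turn this IndexError into a clearer failure when the model answers in text instead of calling a function; it only uses the GenerationResponse type the sample already imports:

from vertexai.generative_models import GenerationResponse


def first_function_call(response: GenerationResponse):
    """Return the first function call, or fail loudly if the model replied with text only."""
    candidate = response.candidates[0]
    if not candidate.function_calls:
        raise RuntimeError(f"Model returned text instead of a function call: {candidate.text!r}")
    return candidate.function_calls[0]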

Valeriy-Burlaka (Member Author):

Created a general issue to initiate a discussion about the current CI process.


Thanks for reporting this, Valeriy-Burlaka. The Gemini Flash model appears to have been deprecated. The change to gemini-1.5-pro-002 in the pull request reflects the current, supported model. Your created issue is the appropriate place to track the broader CI implications.

def function_declaration_from_func():
    # [START generativeaionvertexai_gemini_function_calling_declare_from_function]
    # Define a function. Could be a local function or you can import the requests library to call an API
    def multiply_numbers(numbers: List[int]) -> int:
@Valeriy-Burlaka (Member Author) commented Dec 2, 2024:

This is almost a copy of the original function declaration.
I added type annotations to fix the 400 error that was lurking in this doc sample:

...
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
        status = StatusCode.INVALID_ARGUMENT
        details = "Unable to submit request because one or more function parameters didn't specify the schema type field. Learn more: https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling"
        debug_error_string = "UNKNOWN:Error received from peer ipv4:142.250.75.10:443 {created_time:"2024-12-02T15:20:19.676923+01:00", grpc_status:3, grpc_message:"Unable to submit request because one or more function parameters didn\'t specify the schema type field. Learn more: https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling"}"
>
...
google.api_core.exceptions.InvalidArgument: 400 Unable to submit request because one or more function parameters didn't specify the schema type field. Learn more: https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling

The unit test that sends a chat message is necessary to verify the correctness of the function declaration.
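For reference, a minimal sketch of how the typed function turns into a declaration, assuming the vertexai SDK's FunctionDeclaration.from_func helper (which infers the parameter schema from the signature and docstring):

from typing import List

from vertexai.generative_models import FunctionDeclaration


def multiply_numbers(numbers: List[int]) -> int:
    """Calculates the product of all numbers in an array.

    Args:
        numbers: An array of numbers to be multiplied.

    Returns:
        The product of all the numbers.
    """
    result = 1
    for number in numbers:
        result *= number
    return result


# Without the List[int] annotation above, the generated schema is missing the
# parameter's type field and the API rejects the request with the 400 error quoted in the log.
multiply_function = FunctionDeclaration.from_func(multiply_numbers)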

@@ -52,12 +55,14 @@ def test_function_calling_advanced_function_selection() -> None:
)


@pytest.mark.skip(reason="Blocked on b/... ")
@Valeriy-Burlaka (Member Author) commented Dec 2, 2024:

Both samples that use the OpenAI client no longer work (log):

  File "/workspace/generative_ai/function_calling/test_function_calling.py", line 59, in test_function_calling_basic
    response = chat_function_calling_basic.generate_text()
  File "/workspace/generative_ai/function_calling/chat_function_calling_basic.py", line 39, in generate_text
    client = openai.OpenAI(
  File "/workspace/generative_ai/function_calling/.nox/py-3-9/lib/python3.9/site-packages/openai/_client.py", line 122, in __init__
    super().__init__(
  File "/workspace/generative_ai/function_calling/.nox/py-3-9/lib/python3.9/site-packages/openai/_base_client.py", line 825, in __init__
    self._client = http_client or SyncHttpxClientWrapper(
  File "/workspace/generative_ai/function_calling/.nox/py-3-9/lib/python3.9/site-packages/openai/_base_client.py", line 723, in __init__
    super().__init__(**kwargs)
TypeError: __init__() got an unexpected keyword argument 'proxies'

- generated xml file: /workspace/generative_ai/function_calling/sponge_log.xml -
=========================== short test summary info ============================
FAILED test_function_calling.py::test_function_calling_basic - TypeError: __i...
FAILED test_function_calling.py::test_function_calling_config - TypeError: __...
========================= 2 failed, 5 passed in 14.85s =========================

Link to the doc section with OpenAI samples: https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/function-calling#python-openai

Valeriy-Burlaka (Member Author):

Created a general issue to initiate a discussion about the current CI process.


The error TypeError: __init__() got an unexpected keyword argument 'proxies' in the OpenAI client indicates a version incompatibility between the openai library and its httpx dependency: the installed openai version still forwards a proxies argument when it constructs its internal httpx client, but newer httpx releases have removed that argument. To resolve this, you should:

  1. Check the pinned versions: look at requirements.txt (and pyproject.toml, if used) for the openai and httpx versions. Either upgrade openai to a release that no longer passes proxies, or pin httpx to an older release that still accepts it.
  2. Review proxy configuration: if you actually use proxies, double-check the proxy settings in your environment variables or configuration files; if you don't, remove any proxy settings from your code or environment.
  3. Virtual environment consistency: make sure the testing environment matches the development environment (same virtual environment or container) to avoid version drift.
  4. Dependency resolution: run pip install --upgrade pip and then pip install -r requirements.txt (or your equivalent package-manager command) so all dependencies are resolved against the updated pins.

The issue you've created is a good first step to track this problem. Consider adding more details about your environment (Python version, openai and httpx versions, proxy configuration if applicable) to help diagnose the root cause.
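A minimal sketch of the dependency-side fix, under the assumption that a newer httpx dropped the deprecated proxies keyword while the pinned openai version still forwards it (version bounds are illustrative, not verified against this repo's pins):

# In requirements.txt, pick one direction (illustrative pins):
#   openai>=1.55.3   # assumed: releases that no longer pass `proxies` to httpx
#   httpx<0.28       # assumed: releases that still accept `proxies`
#
# Or sidestep the failing constructor by supplying an explicit HTTP client,
# so openai never builds its internal wrapper with `proxies`:
import httpx
import openai

client = openai.OpenAI(
    api_key="placeholder",        # the real samples pass Vertex AI credentials and a base_url here
    http_client=httpx.Client(),   # skips the internal wrapper construction that forwards `proxies`
)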



def function_declaration_from_dict() -> object:
    # [START generativeaionvertexai_gemini_function_calling_declare_from_dict1]
@Valeriy-Burlaka (Member Author) commented Dec 2, 2024:

@msampathkumar , I know we want only one pair of region tags per sample file, but just look at this doc section — we will have to add 8 new micro-files to convert all the markdown used there 🤯 . Given that all markdown samples for the "Step 1-5" sub-sections could basically compose into 1 coherent sample with different region tags, maybe we could make an exception to the "1 region tag" rule here? 🤔

Member:
Sure! We can do that. Let us catch up on this today.

Valeriy-Burlaka (Member Author):

@msampathkumar , please check the updated sample.
Following the idea of the "How to ..." section (creating an example app block by block), I compiled it into one code sample and added tests. As it runs together, it now lives together, in one file!

@Valeriy-Burlaka Valeriy-Burlaka changed the title feat: convert markdown sample for function calling to python feat(gen-ai): function calling: convert markdown declarations to python Dec 2, 2024
@Valeriy-Burlaka Valeriy-Burlaka added the api: vertex-ai Issues related to the Vertex AI API. label Dec 2, 2024
@Valeriy-Burlaka Valeriy-Burlaka changed the title feat(gen-ai): function calling: convert markdown declarations to python feat(vertex-ai): function calling: convert markdown declarations to python Dec 2, 2024
@Valeriy-Burlaka Valeriy-Burlaka changed the title feat(vertex-ai): function calling: convert markdown declarations to python feat(vertex-ai): function calling: convert markdown func declarations to python Dec 2, 2024
@Valeriy-Burlaka Valeriy-Burlaka marked this pull request as ready for review December 2, 2024 19:43
@Valeriy-Burlaka Valeriy-Burlaka marked this pull request as draft December 2, 2024 19:43
@Valeriy-Burlaka Valeriy-Burlaka marked this pull request as ready for review December 2, 2024 22:15
@Valeriy-Burlaka Valeriy-Burlaka marked this pull request as draft December 2, 2024 22:16
@Valeriy-Burlaka Valeriy-Burlaka marked this pull request as ready for review December 2, 2024 22:17
@Valeriy-Burlaka Valeriy-Burlaka marked this pull request as draft December 5, 2024 14:35
@backoff.on_exception(backoff.expo, ResourceExhausted, max_time=10)
def test_function_calling_basic() -> None:
    response = chat_function_calling_basic.generate_text()
    assert "get_current_weather" in response.choices[0].message.tool_calls[0].id


@pytest.mark.skip(reason="Blocked on b/... ")
Valeriy-Burlaka (Member Author):

I will need help with creating a real ticket in Buganizer.

response = chat_session.send_message("What will be 1 multiplied by 2?")
assert response is not None

extract_sales_prompt = """
@Valeriy-Burlaka (Member Author) commented Dec 6, 2024:

I am adding a test for each toolbox function created by this sample for the following reason: the toolbox is lazily evaluated, so simply creating a function declaration is not enough to check that it works.
Only when a targeted prompt triggers a "function call" response from the model is the declaration validated and able to throw an error.
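A minimal sketch of one such test (module name and prompt wording are illustrative; the real tests live in test_function_calling.py): only a prompt that should produce a function call actually exercises the declaration.

def test_extract_sales_info() -> None:
    # create_app() is the consolidated sample's entry point (see the create_app diff below).
    chat_session = function_calling_application.create_app()  # hypothetical module name
    # If the matching function declaration is malformed, send_message raises here
    # (e.g. the 400 "schema type field" error) before any assertion runs.
    response = chat_session.send_message("Extract the sales figures from this report: ...")
    assert response is not None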

@Valeriy-Burlaka Valeriy-Burlaka marked this pull request as ready for review December 6, 2024 15:22
PROJECT_ID = os.getenv("GOOGLE_CLOUD_PROJECT")


def create_app() -> object:
@Valeriy-Burlaka (Member Author) commented Dec 6, 2024:

Consolidates the code samples for the https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling#how-works doc section into a single, testable script.

Each sub-step of that section presents a discrete code snippet (block) of the final application, demonstrating one function calling concept at a time. Compiling them yields a complete, maintainable implementation for DevRel purposes and reduces the number of separate files we'd need if each snippet lived in its own file.
The original snippets and their documentation order are preserved through separate region tags, with minor fixes applied.
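A minimal sketch (function bodies and region-tag names are hypothetical placeholders) of what this consolidated layout looks like: one runnable file, with a separate region tag around the block that each documentation sub-step embeds.

import os

import vertexai
from vertexai.generative_models import FunctionDeclaration, GenerativeModel, Tool

PROJECT_ID = os.getenv("GOOGLE_CLOUD_PROJECT")


def create_app() -> object:
    vertexai.init(project=PROJECT_ID, location="us-central1")

    # NOTE: the region tag names below are hypothetical placeholders.
    # [START generativeaionvertexai_function_calling_app_step_1]
    # Step 1: declare the function(s) the model may call.
    get_current_weather_func = FunctionDeclaration(
        name="get_current_weather",
        description="Get the current weather in a given location",
        parameters={
            "type": "object",
            "properties": {"location": {"type": "string"}},
        },
    )
    # [END generativeaionvertexai_function_calling_app_step_1]

    # [START generativeaionvertexai_function_calling_app_step_2]
    # Step 2: attach the tool to the model and start a chat session.
    model = GenerativeModel(
        "gemini-1.5-pro-002",
        tools=[Tool(function_declarations=[get_current_weather_func])],
    )
    chat_session = model.start_chat()
    # [END generativeaionvertexai_function_calling_app_step_2]

    return chat_session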

@Valeriy-Burlaka Valeriy-Burlaka changed the title feat(vertex-ai): function calling: convert markdown func declarations to python feat(vertex-ai): function calling: convert markdown 'samples' to python Dec 10, 2024
# [START generativeaionvertexai_function_calling_example_syntax]
from vertexai.generative_models import FunctionDeclaration, GenerationConfig, GenerativeModel, Tool

gemini_model = GenerativeModel(
@Valeriy-Burlaka (Member Author) commented Dec 10, 2024:

I tried to preserve the "spirit" of the original .md sample (i.e., a very short snippet that only shows how to declare a model with a toolbox) while making it actually copyable, testable, and working. That's why I moved the init call outside of the sample's region tag but added all the imports the snippet needs to run.
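For context, a minimal sketch of what the filled-in "example syntax" snippet can look like (the declared function and config values are illustrative; the PR's actual snippet may differ in detail):

import vertexai
from vertexai.generative_models import FunctionDeclaration, GenerationConfig, GenerativeModel, Tool

# Kept outside the region tag, as described above.
vertexai.init(project="your-project-id", location="us-central1")

get_current_weather_func = FunctionDeclaration(
    name="get_current_weather",
    description="Get the current weather in a given location",
    parameters={
        "type": "object",
        "properties": {"location": {"type": "string", "description": "The city name"}},
    },
)

gemini_model = GenerativeModel(
    "gemini-1.5-pro-002",
    generation_config=GenerationConfig(temperature=0),
    tools=[Tool(function_declarations=[get_current_weather_func])],
)

response = gemini_model.generate_content("What is the weather like in Boston?")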

Labels: api: vertex-ai (Issues related to the Vertex AI API.), samples (Issues that are directly related to samples.)

2 participants