From cb8078bf6c35eae9ca92da4f4e7614c2f89ead72 Mon Sep 17 00:00:00 2001
From: lpm0073
Date: Sun, 28 Jan 2024 16:25:51 -0600
Subject: [PATCH 1/2] style: rename 'custom configuration' references to 'plugin'

---
 CHANGELOG.md                                              |  5 ++---
 api/terraform/python/openai_api/README.md                 |  2 +-
 .../python/openai_api/lambda_openai_function/README.md    | 10 +++++-----
 .../config/example-configuration.yaml                     |  4 ++--
 .../lambda_openai_function/lambda_handler.py              |  6 ++----
 .../openai_api/lambda_openai_function/plugin_loader.py    |  2 +-
 .../lambda_openai_function/plugin_manager.py              |  2 +-
 7 files changed, 14 insertions(+), 17 deletions(-)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index e907e283..6b07b0ac 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,9 +1,8 @@
 ## [0.10.5](https://github.com/FullStackWithLawrence/aws-openai/compare/v0.10.4...v0.10.5) (2024-01-28)
 
-
 ### Bug Fixes
 
-* force a new release ([5b1ee95](https://github.com/FullStackWithLawrence/aws-openai/commit/5b1ee95dc155b93f04ad79d66ec01f3a6852d60c))
+- force a new release ([5b1ee95](https://github.com/FullStackWithLawrence/aws-openai/commit/5b1ee95dc155b93f04ad79d66ec01f3a6852d60c))
 
 # Change Log
 
@@ -18,7 +17,7 @@ OpenAI 'Function Calling' Lambda.
 
 ### Refactor
 
 - Pydantic refactor ([ad39079](https://github.com/FullStackWithLawrence/aws-openai/commit/ad39079e2142368d7ab2d19360da2dcd2a034120)). [custom_config.py](./api/terraform/python/openai_api/lambda_openai_function/custom_config.py) now inherits from Pydantic BaseModel.
-- Incremental development of the yaml file standard for custom configurations. This now has three well-defined for meta_data, prompting, function_calling.
+- Incremental development of the yaml file standard for plugins. This now has three well-defined sections: meta_data, prompting and function_calling.
 - Added remote AWS S3 bucket support for custom config yaml file storage.
 - Liam has replaced Marv as the default chatbot.
diff --git a/api/terraform/python/openai_api/README.md b/api/terraform/python/openai_api/README.md
index 1b39f8ea..4efdd4fc 100644
--- a/api/terraform/python/openai_api/README.md
+++ b/api/terraform/python/openai_api/README.md
@@ -20,7 +20,7 @@ A general purpose handler for the OpenAI API via Langchain. This is the primary
 
 ## lambda_openai_function
 
-An adaptive ChatGPT interface that uses a combination of dynamic prompting and [Function Calling](https://platform.openai.com/docs/guides/function-calling) to create highly customized ChatGPT responses to user prompts. See these [example custom configurations](../openai_api/lambda_openai_function/config/) demonstrating some of the exciting things you can implement with this feature. This module leverages [Pydantic](https://docs.pydantic.dev/latest/) to validate the yaml custom configuration files that drive the behavior of this function.
+An adaptive ChatGPT interface that uses a combination of dynamic prompting and [Function Calling](https://platform.openai.com/docs/guides/function-calling) to create highly customized ChatGPT responses to user prompts. See these [example plugins](../openai_api/lambda_openai_function/config/) demonstrating some of the exciting things you can implement with this feature. This module leverages [Pydantic](https://docs.pydantic.dev/latest/) to validate the yaml plugin files that drive the behavior of this function.
 
 ## lambda_openai_v2
 
diff --git a/api/terraform/python/openai_api/lambda_openai_function/README.md b/api/terraform/python/openai_api/lambda_openai_function/README.md
index 65b8790a..f0053a34 100644
--- a/api/terraform/python/openai_api/lambda_openai_function/README.md
+++ b/api/terraform/python/openai_api/lambda_openai_function/README.md
@@ -14,11 +14,11 @@ Fully implements the "[get_current_weather()](https://platform.openai.com/docs/g
 
 ## plugin.py
 
-This module demonstrates an alternative implementation of prompt behavior modification involving both Function Calling, plus, dynamic modifications to the system prompt. The module passes a customized configuration object to `function_calling_plugin()` based on a configurable set of search terms that it looks for in the user prompt. The function works with multiple customized configurations. That is, it maintains a list of custom configurations, and user prompts including search terms associated with multiple custom configurations will result in prompt configuration multiple "Function Calling" apis. The custom configurations are persisted both inside this repository in the [config](./config/) folder as well as via a remote AWS S3 bucket that Terraform creates and configures for you automatically. Custom configurations are data-driven via a standardized yaml format. Use [example-configuration.yaml](./config/example-configuration.yaml) as a template to create your own custom configurations. Storing these in the AWS S3 bucket is preferable to keeping these inside your repo.
+This module demonstrates an alternative implementation of prompt behavior modification involving both Function Calling and dynamic modifications to the system prompt. The module passes a plugin object to `function_calling_plugin()` based on a configurable set of search terms that it looks for in the user prompt. The function works with multiple plugins. That is, it maintains a list of plugins, and a user prompt containing search terms associated with multiple plugins will result in multiple "Function Calling" tools being added to the request. The plugins are persisted both inside this repository in the [config](./config/) folder as well as via a remote AWS S3 bucket that Terraform creates and configures for you automatically. Plugins are data-driven via a standardized yaml format. Use [example-configuration.yaml](./config/example-configuration.yaml) as a template to create your own plugins. Storing these in the AWS S3 bucket is preferable to keeping them inside your repo.
 
-### Example custom configurations
+### Example plugins
 
-The following two sample custom configurations are included in this project:
+The following two sample plugins are included in this project:
 
 1. [Everlasting Gobstopper](./config/everlasting-gobstopper.yaml): An example of a consumer product, complete with pricing information and coupon codes.
 2. [Lawrence McDaniel](./config/lawrence-mcdaniel.yaml): Similar in functionality to a personal web site, this configuration demonstrates how you can get ChatGPT to showcase your professional profile, including your job and project history, your project portfolio, skill set and context-sensitive contact information.
@@ -29,7 +29,7 @@ The following two sample custom configurations are included in this project:
 meta_data:
   config_path: aws_openai/lambda_openai_function/custom_configs/example-configuration.yaml
   name: ExampleConfiguration
-  description: an example custom configuration.
+  description: an example plugin.
   version: 0.1.0
   author: Lawrence McDaniel
 prompting:
@@ -45,7 +45,7 @@ prompting:
     Your job is to provide helpful technical information about the OpenAI API Function Calling feature. You should include the following information in your response:
     "Congratulations!!! OpenAI API Function Calling chose to call this function. Here is the additional information that you requested:"
 function_calling:
-  function_description: an example custom configuration to integrate with OpenAI API Function Calling additional information function, in this module.
+  function_description: an example plugin to integrate with OpenAI API Function Calling additional information function, in this module.
   additional_information:
     about: >
       This is some sample text that will be returned ChatGPT if it opts to invoke the function_calling_plugin() function.
diff --git a/api/terraform/python/openai_api/lambda_openai_function/config/example-configuration.yaml b/api/terraform/python/openai_api/lambda_openai_function/config/example-configuration.yaml
index d1076081..a05ee232 100644
--- a/api/terraform/python/openai_api/lambda_openai_function/config/example-configuration.yaml
+++ b/api/terraform/python/openai_api/lambda_openai_function/config/example-configuration.yaml
@@ -6,7 +6,7 @@
 meta_data:
   config_path: aws_openai/lambda_openai_function/custom_configs/example-configuration.yaml
   name: ExampleConfiguration
-  description: A 'hello world' style custom configuration. This is an example custom configuration to integrate with OpenAI API Function Calling additional information function, in this module.
+  description: A 'hello world' style plugin. This is an example plugin to integrate with OpenAI API Function Calling additional information function, in this module.
   version: 0.1.0
   author: Lawrence McDaniel
 
@@ -44,7 +44,7 @@ prompting:
 # own discretion.
 # ------------------------------------------------------------
 function_calling:
-  function_description: an example custom configuration to integrate with OpenAI API Function Calling additional information function, in this module.
+  function_description: an example plugin to integrate with OpenAI API Function Calling additional information function, in this module.
   #------------------------------------------------------------
   # if a.) this module is able to locate any of the search terms in the user prompt
   # b.) OpenAI API Function Calling opts to call this function
diff --git a/api/terraform/python/openai_api/lambda_openai_function/lambda_handler.py b/api/terraform/python/openai_api/lambda_openai_function/lambda_handler.py
index 7f73ebf0..c534f4ec 100644
--- a/api/terraform/python/openai_api/lambda_openai_function/lambda_handler.py
+++ b/api/terraform/python/openai_api/lambda_openai_function/lambda_handler.py
@@ -76,7 +76,7 @@ def handler(event, context):
         object_type, model, messages, input_text, temperature, max_tokens = parse_request(request_body)
         request_meta_data = request_meta_data_factory(model, object_type, temperature, max_tokens, input_text)
 
-        # does the prompt have anything to do with any of the search terms defined in a custom configuration?
+        # does the prompt have anything to do with any of the search terms defined in a plugin?
         for config in plugins:
             if search_terms_are_in_messages(
                 messages=messages,
@@ -87,9 +87,7 @@ def handler(event, context):
                 messages = customized_prompt(config=config, messages=messages)
                 custom_tool = plugin_tool_factory(config=config)
                 tools.append(custom_tool)
-                print(
-                    f"Adding custom configuration: {config.name} {config.meta_data.version} created by {config.meta_data.author}"
-                )
+                print(f"Adding plugin: {config.name} {config.meta_data.version} created by {config.meta_data.author}")
 
         # https://platform.openai.com/docs/guides/gpt/chat-completions-api
         validate_item(
diff --git a/api/terraform/python/openai_api/lambda_openai_function/plugin_loader.py b/api/terraform/python/openai_api/lambda_openai_function/plugin_loader.py
index e1f62023..10e81699 100644
--- a/api/terraform/python/openai_api/lambda_openai_function/plugin_loader.py
+++ b/api/terraform/python/openai_api/lambda_openai_function/plugin_loader.py
@@ -363,7 +363,7 @@ def __init__(self, config_path: str = None, aws_s3_bucket_name: str = None):
                 plugin = Plugin(config_json=config_json, index=i)
                 self._custom_configs.append(plugin)
                 # print(
-                #     f"Loaded custom configuration from AWS S3 bucket: {plugin.name} {plugin.meta_data.version} created by {plugin.meta_data.author}"
+                #     f"Loaded plugin from AWS S3 bucket: {plugin.name} {plugin.meta_data.version} created by {plugin.meta_data.author}"
                 # )
 
     @property
diff --git a/api/terraform/python/openai_api/lambda_openai_function/plugin_manager.py b/api/terraform/python/openai_api/lambda_openai_function/plugin_manager.py
index 677c49a9..3285bbce 100644
--- a/api/terraform/python/openai_api/lambda_openai_function/plugin_manager.py
+++ b/api/terraform/python/openai_api/lambda_openai_function/plugin_manager.py
@@ -34,7 +34,7 @@ def search_terms_are_in_messages(messages: list, search_terms: list, search_pair
 
 
 def customized_prompt(config: Plugin, messages: list) -> list:
-    """Modify the system prompt based on the custom configuration object"""
+    """Modify the system prompt based on the plugin object"""
 
     for i, message in enumerate(messages):
         if message.get("role") == "system":

From cb2a8a8bf353c44673e6661c87bb70b2b87d63eb Mon Sep 17 00:00:00 2001
From: lpm0073
Date: Sun, 28 Jan 2024 16:26:41 -0600
Subject: [PATCH 2/2] style: rename 'custom configuration' references to 'plugin'

---
 doc/GOOD_CODING_PRACTICE.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/doc/GOOD_CODING_PRACTICE.md b/doc/GOOD_CODING_PRACTICE.md
index 928b90be..734dceb7 100644
--- a/doc/GOOD_CODING_PRACTICE.md
+++ b/doc/GOOD_CODING_PRACTICE.md
@@ -8,7 +8,7 @@ This project includes an extensive collection of Python unit tests for verifying
 
 ## Pydantic
 
-Originally created in 2017, [Pydantic](https://docs.pydantic.dev/latest/) has become the most widely used data validation library for Python. It is especially useful for data driven applications like this one, involving frequent integrations with a variety of cloud infrastructure services in a variety of environments, configured by a variety of different possible sources of data including environment variables, .env file, terraform.tfvars and system constants. We use it for the [Settings](../api/terraform/python/openai_api/common/conf.py) class in this project, and also for validating yaml [custom configurations](.api/terraform/python/openai_api/lambda_openai_function/custom_config.py) for the OpenAI Function Calling feature. It's an important addition because it enforces strong type and business rule validation checking of all of the configuration parameters for the AWS Lambdas, and it ensures that nothing ever changes these values at run-time once they've been set. And this in turn is important because erroneous automation code could otherwise lead to some wildly disastrous results. 😳
+Originally created in 2017, [Pydantic](https://docs.pydantic.dev/latest/) has become the most widely used data validation library for Python. It is especially useful for data-driven applications like this one, involving frequent integrations with a variety of cloud infrastructure services across multiple environments, configured from several possible sources of data including environment variables, a .env file, terraform.tfvars and system constants. We use it for the [Settings](../api/terraform/python/openai_api/common/conf.py) class in this project, and also for validating the yaml [plugins](../api/terraform/python/openai_api/lambda_openai_function/custom_config.py) for the OpenAI Function Calling feature. It's an important addition because it enforces strong type checking and business rule validation of all of the configuration parameters for the AWS Lambdas, and it ensures that nothing ever changes these values at run-time once they've been set. This in turn is important because erroneous automation code could otherwise lead to some wildly disastrous results. 😳
 
 ## Automations
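For reference, the plugin yaml format touched by these patches has three top-level sections — meta_data, prompting and function_calling — and custom_config.py validates each file with Pydantic before the Lambda uses it. The sketch below illustrates that idea only, under stated assumptions: it is not the repository's actual custom_config.py code, the MetaData/Prompting/AdditionalInformation/FunctionCalling/PluginSketch class names are invented for the example, the yaml field names are taken from the example-configuration.yaml fragments quoted above, and the search-terms block consumed by search_terms_are_in_messages() is omitted because its exact yaml layout is not shown in these patches.

```python
# Illustrative sketch only -- not the repository's custom_config.py classes.
# Loads a minimal three-section plugin yaml and validates it with Pydantic,
# mirroring the meta_data / prompting / function_calling fields quoted above.
import yaml  # pip install pyyaml
from pydantic import BaseModel  # pip install pydantic

PLUGIN_YAML = """
meta_data:
  config_path: aws_openai/lambda_openai_function/custom_configs/example-configuration.yaml
  name: ExampleConfiguration
  description: an example plugin.
  version: 0.1.0
  author: Lawrence McDaniel
prompting:
  system_prompt: >
    You are a helpful assistant.
function_calling:
  function_description: an example plugin for OpenAI API Function Calling.
  additional_information:
    about: >
      Sample text returned to ChatGPT if it opts to invoke the function.
"""


class MetaData(BaseModel):
    config_path: str
    name: str
    description: str
    version: str
    author: str


class Prompting(BaseModel):
    system_prompt: str


class AdditionalInformation(BaseModel):
    about: str


class FunctionCalling(BaseModel):
    function_description: str
    additional_information: AdditionalInformation


class PluginSketch(BaseModel):
    """Hypothetical stand-in for the plugin model defined in custom_config.py."""

    meta_data: MetaData
    prompting: Prompting
    function_calling: FunctionCalling


# Parse the yaml and validate it; a missing or mistyped field raises a
# pydantic.ValidationError here, before the configuration is ever used.
plugin = PluginSketch(**yaml.safe_load(PLUGIN_YAML))
print(f"Loaded plugin: {plugin.meta_data.name} {plugin.meta_data.version} by {plugin.meta_data.author}")
```

In a sketch like this, a malformed file (for example, one missing meta_data.version) fails at load time with a Pydantic ValidationError rather than at request time, which is the kind of up-front configuration checking the GOOD_CODING_PRACTICE.md paragraph above describes.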