
Commit cb8078b
style: rename 'custom configuration' references to 'plugin'
lpm0073 committed Jan 28, 2024
1 parent 7139930 commit cb8078b
Showing 7 changed files with 14 additions and 17 deletions.
5 changes: 2 additions & 3 deletions CHANGELOG.md
@@ -1,9 +1,8 @@
## [0.10.5](https://github.com/FullStackWithLawrence/aws-openai/compare/v0.10.4...v0.10.5) (2024-01-28)


### Bug Fixes

* force a new release ([5b1ee95](https://github.com/FullStackWithLawrence/aws-openai/commit/5b1ee95dc155b93f04ad79d66ec01f3a6852d60c))
- force a new release ([5b1ee95](https://github.com/FullStackWithLawrence/aws-openai/commit/5b1ee95dc155b93f04ad79d66ec01f3a6852d60c))

# Change Log

@@ -18,7 +17,7 @@ OpenAI 'Function Calling' Lambda.
### Refactor

- Pydantic refactor ([ad39079](https://github.com/FullStackWithLawrence/aws-openai/commit/ad39079e2142368d7ab2d19360da2dcd2a034120)). [custom_config.py](./api/terraform/python/openai_api/lambda_openai_function/custom_config.py) now inherits from Pydantic BaseModel.
- Incremental development of the yaml file standard for custom configurations. This now has three well-defined sections: meta_data, prompting, function_calling.
- Incremental development of the yaml file standard for plugins. This now has three well-defined sections: meta_data, prompting, function_calling.
- Added remote AWS S3 bucket support for custom config yaml file storage.
- Liam has replaced Marv as the default chatbot.

2 changes: 1 addition & 1 deletion api/terraform/python/openai_api/README.md
@@ -20,7 +20,7 @@ A general purpose handler for the OpenAI API via Langchain. This is the primary

## lambda_openai_function

An adaptive ChatGPT interface that uses a combination of dynamic prompting and [Function Calling](https://platform.openai.com/docs/guides/function-calling) to create highly customized ChatGPT responses to user prompts. See these [example custom configurations](../openai_api/lambda_openai_function/config/) demonstrating some of the exciting things you can implement with this feature. This module leverages [Pydantic](https://docs.pydantic.dev/latest/) to validate the yaml custom configuration files that drive the behavior of this function.
An adaptive ChatGPT interface that uses a combination of dynamic prompting and [Function Calling](https://platform.openai.com/docs/guides/function-calling) to create highly customized ChatGPT responses to user prompts. See these [example plugins](../openai_api/lambda_openai_function/config/) demonstrating some of the exciting things you can implement with this feature. This module leverages [Pydantic](https://docs.pydantic.dev/latest/) to validate the yaml plugin files that drive the behavior of this function.
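
The yaml format is small enough that its validation layer is easy to picture. Below is a minimal, hypothetical sketch of how a Pydantic model for such a plugin file could look, based only on the three sections named in the change log (meta_data, prompting, function_calling); the class and field names are assumptions for illustration, not the repository's actual models.

```python
# Hypothetical sketch only: validates a plugin yaml file against the three
# sections described in the change log. Class and field names are assumptions,
# not the repository's actual Pydantic models.
from typing import Dict, List, Optional

import yaml  # PyYAML
from pydantic import BaseModel


class MetaData(BaseModel):
    config_path: str
    name: str
    description: str
    version: str
    author: str


class Prompting(BaseModel):
    search_terms: Optional[List[str]] = None
    system_prompt: Optional[str] = None


class FunctionCalling(BaseModel):
    function_description: str
    additional_information: Dict[str, str] = {}


class PluginConfig(BaseModel):
    meta_data: MetaData
    prompting: Prompting
    function_calling: FunctionCalling


def load_plugin(path: str) -> PluginConfig:
    """Parse a yaml plugin file; raises pydantic.ValidationError if it is malformed."""
    with open(path, encoding="utf-8") as file:
        return PluginConfig(**yaml.safe_load(file))
```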

## lambda_openai_v2

10 changes: 5 additions & 5 deletions api/terraform/python/openai_api/lambda_openai_function/README.md
@@ -14,11 +14,11 @@ Fully implements the "[get_current_weather()](https://platform.openai.com/docs/g

## plugin.py

This module demonstrates an alternative approach to prompt behavior modification that combines Function Calling with dynamic modifications to the system prompt. The module passes a customized configuration object to `function_calling_plugin()` based on a configurable set of search terms that it looks for in the user prompt. The function works with multiple custom configurations. That is, it maintains a list of custom configurations, and a user prompt containing search terms associated with several of them results in the prompt being configured with multiple "Function Calling" APIs. The custom configurations are persisted both inside this repository in the [config](./config/) folder and in a remote AWS S3 bucket that Terraform creates and configures for you automatically. Custom configurations are data-driven via a standardized yaml format. Use [example-configuration.yaml](./config/example-configuration.yaml) as a template to create your own custom configurations. Storing these in the AWS S3 bucket is preferable to keeping them inside your repo.
This module demonstrates an alternative approach to prompt behavior modification that combines Function Calling with dynamic modifications to the system prompt. The module passes a customized configuration object to `function_calling_plugin()` based on a configurable set of search terms that it looks for in the user prompt. The function works with multiple plugins. That is, it maintains a list of plugins, and a user prompt containing search terms associated with several of them results in the prompt being configured with multiple "Function Calling" APIs. The plugins are persisted both inside this repository in the [config](./config/) folder and in a remote AWS S3 bucket that Terraform creates and configures for you automatically. Plugins are data-driven via a standardized yaml format. Use [example-configuration.yaml](./config/example-configuration.yaml) as a template to create your own plugins. Storing these in the AWS S3 bucket is preferable to keeping them inside your repo.
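
As a rough illustration of the matching behavior described above, the sketch below shows one way the search-term scan over the incoming messages might work; the helper names and the plugin attribute are assumptions for illustration, and the repository's actual helpers (`search_terms_are_in_messages`, for example) may differ.

```python
# Illustrative sketch only: select every plugin whose search terms appear in
# the user's messages. Helper names and plugin attributes are assumptions.
from typing import List


def prompt_mentions(messages: List[dict], search_terms: List[str]) -> bool:
    """True if any user message contains any of the given search terms."""
    user_text = " ".join(
        str(message.get("content", "")).lower()
        for message in messages
        if message.get("role") == "user"
    )
    return any(term.lower() in user_text for term in search_terms)


def matching_plugins(messages: List[dict], plugins: list) -> list:
    """Every matching plugin contributes one Function Calling tool to the request."""
    return [plugin for plugin in plugins if prompt_mentions(messages, plugin.search_terms)]
```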

### Example custom configurations
### Example plugins

The following two sample custom configurations are included in this project:
The following two sample plugins are included in this project:

1. [Everlasting Gobstopper](./config/everlasting-gobstopper.yaml): An example of a consumer product, complete with pricing information and coupon codes.
2. [Lawrence McDaniel](./config/lawrence-mcdaniel.yaml): Similar in functionality to a personal web site, this configuration demonstrates how you can get ChatGPT to showcase your professional profile, including your job and project history, your project portfolio, skill set and context-sensitive contact information.
@@ -29,7 +29,7 @@ The following two sample custom configurations are included in this project:
meta_data:
config_path: aws_openai/lambda_openai_function/custom_configs/example-configuration.yaml
name: ExampleConfiguration
description: an example custom configuration.
description: an example plugin.
version: 0.1.0
author: Lawrence McDaniel
prompting:
@@ -45,7 +45,7 @@ prompting:
Your job is to provide helpful technical information about the OpenAI API Function Calling feature. You should include the following information in your response:
"Congratulations!!! OpenAI API Function Calling chose to call this function. Here is the additional information that you requested:"
function_calling:
function_description: an example custom configuration to integrate with OpenAI API Function Calling additional information function, in this module.
function_description: an example plugin to integrate with OpenAI API Function Calling additional information function, in this module.
additional_information:
about: >
    This is some sample text that will be returned to ChatGPT if it opts to invoke the function_calling_plugin() function.
@@ -6,7 +6,7 @@
meta_data:
config_path: aws_openai/lambda_openai_function/custom_configs/example-configuration.yaml
name: ExampleConfiguration
description: A 'hello world' style custom configuration. This is an example custom configuration to integrate with OpenAI API Function Calling additional information function, in this module.
description: A 'hello world' style plugin. This is an example plugin to integrate with OpenAI API Function Calling additional information function, in this module.
version: 0.1.0
author: Lawrence McDaniel

@@ -44,7 +44,7 @@ prompting:
# own discretion.
# ------------------------------------------------------------
function_calling:
function_description: an example custom configuration to integrate with OpenAI API Function Calling additional information function, in this module.
function_description: an example plugin to integrate with OpenAI API Function Calling additional information function, in this module.
#------------------------------------------------------------
# if a.) this module is able to locate any of the search terms in the user prompt
# b.) OpenAI API Function Calling opts to call this function
@@ -76,7 +76,7 @@ def handler(event, context):
object_type, model, messages, input_text, temperature, max_tokens = parse_request(request_body)
request_meta_data = request_meta_data_factory(model, object_type, temperature, max_tokens, input_text)

# does the prompt have anything to do with any of the search terms defined in a custom configuration?
# does the prompt have anything to do with any of the search terms defined in a plugin?
for config in plugins:
if search_terms_are_in_messages(
messages=messages,
@@ -87,9 +87,7 @@ def handler(event, context):
messages = customized_prompt(config=config, messages=messages)
custom_tool = plugin_tool_factory(config=config)
tools.append(custom_tool)
print(
f"Adding custom configuration: {config.name} {config.meta_data.version} created by {config.meta_data.author}"
)
print(f"Adding plugin: {config.name} {config.meta_data.version} created by {config.meta_data.author}")

# https://platform.openai.com/docs/guides/gpt/chat-completions-api
validate_item(
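
For context on the `tools` list being assembled in this handler, the sketch below shows what a single Function Calling tool entry built from a plugin's `function_calling` section could look like, following the public OpenAI Chat Completions schema; the attribute names on `config` and the single illustrative parameter are assumptions rather than the repository's actual `plugin_tool_factory` output.

```python
# Hedged sketch: one OpenAI Chat Completions "tools" entry derived from a
# plugin. The config attributes and the illustrative parameter are assumptions;
# only the outer schema follows the documented OpenAI format.
def plugin_tool_entry(config) -> dict:
    return {
        "type": "function",
        "function": {
            "name": "function_calling_plugin",
            "description": config.function_calling.function_description,
            "parameters": {
                "type": "object",
                "properties": {
                    "inquiry_type": {
                        "type": "string",
                        "description": "Which additional_information key the model wants returned.",
                    },
                },
                "required": ["inquiry_type"],
            },
        },
    }
```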
@@ -363,7 +363,7 @@ def __init__(self, config_path: str = None, aws_s3_bucket_name: str = None):
plugin = Plugin(config_json=config_json, index=i)
self._custom_configs.append(plugin)
# print(
# f"Loaded custom configuration from AWS S3 bucket: {plugin.name} {plugin.meta_data.version} created by {plugin.meta_data.author}"
# f"Loaded plugin from AWS S3 bucket: {plugin.name} {plugin.meta_data.version} created by {plugin.meta_data.author}"
# )

@property
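
The loop above reads each plugin file out of the Terraform-managed S3 bucket. A minimal boto3 sketch of that read-and-parse step might look like the following; the bucket handling, key filtering, and return shape are assumptions made for illustration, with validation left to the `Plugin` class shown above.

```python
# Illustrative sketch: read yaml plugin files from an S3 bucket and parse them.
# Bucket/key handling and the downstream Plugin(...) validation are assumptions.
import boto3
import yaml


def load_plugins_from_s3(bucket_name: str) -> list:
    s3 = boto3.client("s3")
    configs = []
    for item in s3.list_objects_v2(Bucket=bucket_name).get("Contents", []):
        key = item["Key"]
        if not key.endswith((".yaml", ".yml")):
            continue  # skip anything that is not a yaml plugin file
        body = s3.get_object(Bucket=bucket_name, Key=key)["Body"].read()
        configs.append(yaml.safe_load(body))  # validated elsewhere, e.g. by Plugin(...)
    return configs
```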
@@ -34,7 +34,7 @@ def search_terms_are_in_messages(messages: list, search_terms: list, search_pair


def customized_prompt(config: Plugin, messages: list) -> list:
"""Modify the system prompt based on the custom configuration object"""
"""Modify the system prompt based on the plugin object"""

for i, message in enumerate(messages):
if message.get("role") == "system":
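
A compact sketch of the system-prompt customization performed by `customized_prompt()` follows; it assumes the plugin exposes a `prompting.system_prompt` string, which is an assumption about the config layout rather than the repository's confirmed attribute path.

```python
# Hedged sketch of customized_prompt(): append the plugin's prompt text to the
# existing system message. Attribute names on `config` are assumptions.
def customized_prompt_sketch(config, messages: list) -> list:
    messages = [dict(message) for message in messages]  # avoid mutating the caller's list
    for message in messages:
        if message.get("role") == "system":
            message["content"] = f"{message['content']}\n\n{config.prompting.system_prompt}"
            break
    return messages
```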
