
feat: support bedrock api in job and intention #525

Merged: 6 commits into dev on Jan 23, 2025
Conversation

@NingLu (Collaborator) commented on Jan 23, 2025

Fixes #

🤖 AI-Generated PR Description (Powered by Amazon Bedrock)

Description

This pull request includes updates to various components of the project's infrastructure, lambda functions, and dependencies. The main changes are:

  • Updates to the configuration and knowledge base management in the infrastructure layer.
  • Modifications to the ETL and intention lambda functions, including updates to their dependencies.
  • A known issue with the llm_bot_dep package that still needs to be resolved.

The motivation behind these changes is to enhance the functionality, fix potential bugs, and keep the project up-to-date with the latest dependencies.

Type of change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

File Stats Summary

Number of files involved in this PR: 11. The file changes summary is as follows:

| Files | Changes | Change Summary |
| --- | --- | --- |
| source/infrastructure/bin/config.ts | 0 added, 6 removed | Disables the API inference configuration and removes the related properties, while enabling the use of an open-source language model for chat functionality. |
| source/infrastructure/lib/shared/types.ts | 0 added, 6 removed | Removes the apiInference object from the SystemConfig interface, which disables the use of an external API for inference tasks in the application. |
| source/lambda/job/dep/setup.py | 1 added, 0 removed | Adds the "tiktoken==0.8.0" package to the list of required packages for the project. |
| source/lambda/etl/main.py | 3 added, 0 removed | Introduces a new groupName parameter to the Lambda function, which is retrieved from the event object and passed along with other parameters to subsequent processing steps (see the sketch below the table). |
| source/lambda/intention/requirements.txt | 3 added, 1 removed | Upgrades the openpyxl library and adds two new dependencies: openai and tiktoken. |
| source/infrastructure/lib/api/intention-management.ts | 0 added, 4 removed | Removes environment variables related to API inference configuration, such as enabling/disabling API inference, provider, endpoint, and API key ARN, from the Lambda function. |
| source/infrastructure/lib/knowledge-base/knowledge-base-stack.ts | 6 added, 5 removed | Adds DynamoDB permissions to the Glue role, updates the list of Python modules with tiktoken, adds a groupName parameter for Glue job input, and removes API inference configuration in favor of a model table. |
| source/lambda/job/glue-job-script.py | 7 added, 12 removed | Removes variables related to API inference (API_INFERENCE_ENABLED, API_INFERENCE_PROVIDER, API_ENDPOINT, API_KEY_ARN) and adds new MODEL_TABLE and GROUP_NAME variables for managing models and groups. |
| source/lambda/job/dep/llm_bot_dep/sm_utils.py | 52 added, 9 removed | Adds a new function get_model_details to retrieve model details from a DynamoDB table using the group name, chatbot ID, and table name. Also modifies the getCustomEmbeddings function to fetch the model provider, base URL, and API key ARN from the retrieved model details instead of using hardcoded values (see the sketch below the table). |
| source/lambda/intention/intention.py | 12 added, 9 removed | Removes environment variables related to API inference (API_INFERENCE_ENABLED, API_INFERENCE_PROVIDER, API_ENDPOINT, API_KEY_ARN) and adds group_name and chatbotId parameters to the __save_2_aos function call. |
| source/infrastructure/cli/magic-config.ts | 0 added, 96 removed | Removes the CLI options for API-based inference and embedding generation via the OpenAI API, focusing instead on local models and embeddings. |
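Below is a minimal, hypothetical sketch of the get_model_details helper described in the sm_utils.py row above. The DynamoDB key names (groupName, chatbotId) and the error handling are illustrative assumptions, not the PR's actual implementation.

```python
# Hypothetical sketch only: key names and return shape are assumed, not taken from the PR.
import boto3


def get_model_details(group_name: str, chatbot_id: str, table_name: str) -> dict:
    """Fetch a model record (provider, base URL, API key ARN) from the model table."""
    table = boto3.resource("dynamodb").Table(table_name)
    response = table.get_item(
        Key={
            "groupName": group_name,  # assumed partition key
            "chatbotId": chatbot_id,  # assumed sort key
        }
    )
    item = response.get("Item")
    if item is None:
        raise ValueError(
            f"No model registered for group '{group_name}' and chatbot '{chatbot_id}'"
        )
    return item
```

A caller such as getCustomEmbeddings could then read the model provider, base URL, and API key ARN from the returned item instead of relying on hardcoded values, which matches the behavior change summarized in the table.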
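The groupName pass-through added to source/lambda/etl/main.py might look roughly like the sketch below; the handler shape, field names, and default value are illustrative assumptions rather than the actual code.

```python
# Illustrative sketch of forwarding groupName from the incoming event to later steps;
# the event structure and parameter names are assumptions.
import json


def handler(event, context):
    body = json.loads(event.get("body") or "{}")

    # New parameter introduced by this PR: the caller's group name is read from the
    # event and forwarded so downstream steps can resolve the right model record.
    group_name = body.get("groupName", "Admin")  # default value is an assumption

    job_parameters = {
        "groupName": group_name,
        # ...existing parameters (bucket, prefix, chatbotId, etc.) would be included here
    }
    return {"statusCode": 200, "body": json.dumps(job_parameters)}
```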

@NingLu merged commit e5513bb into dev on Jan 23, 2025
6 checks passed