The Observability Integration Fields Insight project bridges gaps in Observability integration field details. The amount of detail provided in field descriptions is often limited for several reasons, including:
- the number of fields added as part of an integration
- developers or contributors may assume that users of the integration are already familiar with the observed fields, leading to less emphasis on documentation
- developers may rely on the documentation of the services or products for which the integration is built
- with a large number of fields, writing detailed descriptions for each one can become a repetitive and low-priority task
This project provides extended descriptions for each field, highlighting its relevance and importance in determining service or product availability. Using the field name and its existing description, it pulls insights from a variety of online sources and stores them in databases for reuse. The data is organized into a report schema that supports both Q&A and decision-making needs.
Warning: The report content generated by running this project (also used to generate alert thresholds, SLO settings, alert baselining, etc.) is sourced from arbitrary websites. Users must validate the sources cited in the report and the correctness of the data before using it for Q&A, for decision-making, or in other applications.
It provides advanced insights into observability fields, supporting diverse use cases such as generating asset presets for Alerts and SLOs and extending the knowledge base of Elastic's Observability AI Assistant. With a modular architecture, it operates independently or integrates seamlessly with Elastic Observability to enrich its capabilities.
By generating recommendations for thresholds, critical metrics, and field relationships, the project simplifies alert creation and validation. It also extends the Observability AI Assistant's question-answering capabilities, empowering users to streamline workflows and enhance the observability ecosystem.
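As an illustration only, the snippet below sketches the kind of per-field insight record described above; the actual report schema produced by the framework may differ, and every name and value shown here is hypothetical.

```python
# Hypothetical example of a per-field insight record; field names, thresholds,
# and sources are illustrative and not the project's actual report schema.
field_insight = {
    "field_name": "system.cpu.total.norm.pct",
    "field_description": "Normalized total CPU usage as a percentage.",
    "extended_description": (
        "Sustained values above ~90% usually indicate CPU saturation "
        "and degraded service availability."
    ),
    "recommended_threshold": {"warning": 0.8, "critical": 0.9},
    "related_fields": ["system.load.1", "system.cpu.user.pct"],
    "sources": [
        "https://example.com/cpu-utilization-guidance",  # validate before use (see warning above)
    ],
}
```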
- Python 3.11+ (the project uses Poetry for package dependencies)
- Elasticsearch cluster deployed on Elasticsearch Service, or download and run Elasticsearch. Create an API key in Kibana or using the Elasticsearch API.
- Visit the OpenAI API platform, sign in, and navigate to the "API Keys" section under your account settings to create a new API key.
- Visit the Tavily API platform, sign in or create an account, then go to the "API Keys" section in your account settings to generate a new API key.
- MySQL 8.1
- LangSmith account (recommended)
- LangGraph Studio (recommended). LangGraph Studio requires docker-compose version 2.22.0 or higher.
# Ensure pip is available, then install Poetry:
python3 -m ensurepip --default-pip
pip install poetry
# Pull the MySQL 8.1 Docker image:
docker pull mysql:8.1
# Run a MySQL container:
docker run --name mysql-container -e MYSQL_ROOT_PASSWORD=rootpassword -d mysql:8.1
# Connect to the MySQL client inside the container, e.g.:
docker exec -it mysql-container mysql -uroot -p
# Create a new database:
CREATE DATABASE your_database_name;
# Execute the SQL script from the install/mysql.sql file to create the required table(s) by running:
source /path/to/install/mysql.sql
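To confirm the database is reachable before running the framework, you can run a quick connectivity check from Python. This is a minimal sketch assuming the `pymysql` package and the container settings shown above, and that MySQL's port is reachable from where you run it (for example, published with `-p 3306:3306` on `docker run`); adjust host, credentials, and database name to match your environment.

```python
# Minimal connectivity check for the MySQL instance created above.
# Assumes `pip install pymysql`; host and credentials are illustrative.
import pymysql

conn = pymysql.connect(
    host="127.0.0.1",
    user="root",
    password="rootpassword",        # matches MYSQL_ROOT_PASSWORD above
    database="your_database_name",  # the database created in the previous step
)
try:
    with conn.cursor() as cur:
        cur.execute("SHOW TABLES;")
        print(cur.fetchall())  # should list the table(s) created by install/mysql.sql
finally:
    conn.close()
```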
# This is a private repo. Contact [email protected] for GITHUB TOKEN
gh repo clone agithomas/obs-integrations-fields-insights
[ec2-user@ip-172-31-43-26 obs-integrations-fields-insights]$ cd obs-integrations-fields-insights/
[ec2-user@ip-172-31-43-26 obs-integrations-fields-insights]$ poetry shell
(asset-preset-config-generator-py3.11) [ec2-user@ip-172-31-43-26 obs-integrations-fields-insights]$ poetry install
Updating dependencies
Resolving dependencies... (10.7s)
Package operations: 89 installs, 0 updates, 0 removals
- Installing certifi (2024.12.14)
........
- Installing neo4j (5.27.0)
- Installing pytest (8.3.4)
- Installing tavily-python (0.5.0)
Writing lock file
Installing the current project: asset-preset-config-generator (0.1.0)
cp example.env .env
Field Name | Description |
---|---|
ES_INDEX | The vector store index name. |
ES_INDEX_RELEVANCE | The name of the index for storing the output of the framework run. |
LANGCHAIN_TRACING_V2 | Set to "true" to enable LangChain tracing. |
langgraph up
asset-preset-config-generator-py3.11(base) agikthomas@Agis-MBP asset-preset-config-generator % langgraph up
Starting LangGraph API server...
For local dev, requires env var LANGSMITH_API_KEY with access to LangGraph Cloud closed beta.
For production use, requires a license key in env var LANGGRAPH_CLOUD_LICENSE_KEY.
Ready!
- API: http://localhost:8123
- Docs: http://localhost:8123/docs
- LangGraph Studio: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:8123
Use https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:8123 to access the LangGraph Studio web UI for testing.
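Once `langgraph up` reports `Ready!`, the API server can also be checked from Python. This is only a reachability sketch against the documented http://localhost:8123/docs URL, using the `requests` package.

```python
# Quick reachability check against the LangGraph API server started by `langgraph up`.
# Assumes `pip install requests`; the /docs URL is the one printed in the startup output.
import requests

resp = requests.get("http://localhost:8123/docs", timeout=10)
print(resp.status_code)  # expect 200 when the server is up
resp.raise_for_status()
```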
Elastic connectors can sync data from a variety of sources into Elasticsearch. The data stored in the MySQL instance created as part of the framework run can be synced to Elasticsearch using the Elastic connector for MySQL.
- Clone the repo: https://github.com/elastic/connectors
- Log in to Kibana and go to Search -> Content -> Connectors.
- Configure a new connector.
- Mention the name of an index. This is the index that will hold the data synced from the MySQL instance.
  - Example: obs_assistant_asset_preset_mysql
- Provide the connectivity and authentication details.
- Update the config.yml as follows:

  connectors:
    - connector_id: "<connector-id>"
      service_type: "mysql"
  elasticsearch:
    host: "<elasticsearch endpoint>"
    api_key: "<elasticsearch api key>"

- Run the connector service:

  cd connectors && make run
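After the sync runs, the target index can be spot-checked from Python. This is a sketch using the Elasticsearch Python client; the endpoint, API key, and the example index name obs_assistant_asset_preset_mysql are placeholders to adjust for your deployment.

```python
# Spot-check that documents synced from MySQL have landed in the connector's index.
# Assumes `pip install elasticsearch`; endpoint, API key, and index name are placeholders.
from elasticsearch import Elasticsearch

es = Elasticsearch(
    "https://<elasticsearch endpoint>",
    api_key="<elasticsearch api key>",
)

index = "obs_assistant_asset_preset_mysql"
count = es.count(index=index)["count"]
print(f"{index}: {count} documents")
```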
- Configure AI Assistant Settings (Optional)
  - Go to Kibana -> Observability -> Overview
  - Select AI Assistant Settings
  - Enter the name of the Search connector index pattern.
    - Example: obs_assistant_asset_preset_mysql
  - Follow the steps to connect the AI Assistant with the search connector and perform the sync operation.

'NewConnectionError('<pip._vendor.urllib3.connection.HTTPSConnection object at 0x7f2c51c0a610>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /simple/poetry-core/
Resolution
sudo nano /etc/docker/daemon.json
Add the following entries
{
"dns": ["8.8.8.8", "8.8.4.4"]
}
sudo systemctl restart docker
If you're using Docker Desktop (on macOS or Windows) and the container is running in host networking mode, you can use host.docker.internal. However, this doesn't work in Linux or EC2 environments. Make sure that the MySQL instance is accessible from the LangGraph Docker image created when running langgraph up.
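As a quick test, the sketch below attempts a raw TCP connection to the MySQL host and port from inside the container; the host value shown is a placeholder for whatever address the LangGraph container should reach.

```python
# Minimal TCP reachability check for the MySQL instance, runnable inside the
# LangGraph container. The host below is a placeholder for your MySQL address.
import socket

host, port = "host.docker.internal", 3306  # or the MySQL host/IP reachable from the container
try:
    with socket.create_connection((host, port), timeout=5):
        print(f"MySQL is reachable at {host}:{port}")
except OSError as err:
    print(f"Cannot reach MySQL at {host}:{port}: {err}")
```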
Use the verbose option to debug the issue
langgraph up --verbose