content/es/observability_pipelines/set_up_pipelines/archive_logs/google_pubsub.md
---
aliases:
- /es/observability_pipelines/set_up_pipelines/archive_logs/google_pub_sub/
disable_toc: false
title: Archive logs for Google Pub/Sub
---

## Overview

Set up Google Pub/Sub so that the Observability Pipelines Worker formats the collected logs into a Datadog-rehydratable format before routing them to your Datadog log archives.

{{% observability_pipelines/use_case_images/archive_logs %}}

This document walks you through the following steps:
1. The [prerequisites](#prerequisites) you need to set up Observability Pipelines
1. [Configuring a log archive](#configure-a-log-archive)
1. [Setting up Observability Pipelines](#set-up-observability-pipelines)

## Prerequisites

{{% observability_pipelines/prerequisites/google_pubsub %}}

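The include above lists the exact prerequisites. In short, the Google Pub/Sub source needs a topic and a subscription for the Worker to pull logs from. As a minimal illustrative sketch (not part of the official steps), both could be created with the `google-cloud-pubsub` Python client; the project, topic, and subscription names below are hypothetical placeholders:

```python
# A minimal sketch, assuming: pip install google-cloud-pubsub, and
# GOOGLE_APPLICATION_CREDENTIALS pointing at a service account that can
# administer Pub/Sub. All names are hypothetical placeholders.
from google.cloud import pubsub_v1

PROJECT_ID = "my-gcp-project"           # hypothetical
TOPIC_ID = "op-worker-logs"             # hypothetical
SUBSCRIPTION_ID = "op-worker-logs-sub"  # hypothetical

publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()

topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

# Create the topic that log producers publish to.
publisher.create_topic(name=topic_path)

# Create the pull subscription the Worker consumes from.
subscriber.create_subscription(name=subscription_path, topic=topic_path)
```

The same hypothetical names are reused in the later sketches on this page.
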
## Configure a log archive

If you already have a Datadog log archive configured for Observability Pipelines, skip to [Set up Observability Pipelines](#set-up-observability-pipelines).

To configure a Datadog log archive, you need the Datadog integration for your cloud provider installed. See the [AWS integration][1], [Google Cloud Platform][2], and [Azure integration][3] documentation for more information.

Select the cloud provider you use to archive your logs.

{{% collapse-content title="Amazon S3" level="h4" %}}
{{% observability_pipelines/configure_log_archive/amazon_s3/instructions %}}

{{< tabs >}}
{{% tab "Docker" %}}

{{% observability_pipelines/configure_log_archive/amazon_s3/docker %}}

{{% /tab %}}
{{% tab "Amazon EKS" %}}

{{% observability_pipelines/configure_log_archive/amazon_s3/amazon_eks %}}

{{% /tab %}}
{{% tab "Linux (APT)" %}}

{{% observability_pipelines/configure_log_archive/amazon_s3/linux_apt %}}

{{% /tab %}}
{{% tab "Linux (RPM)" %}}

{{% observability_pipelines/configure_log_archive/amazon_s3/linux_rpm %}}

{{% /tab %}}
{{< /tabs >}}

{{% observability_pipelines/configure_log_archive/amazon_s3/connect_s3_to_datadog_log_archives %}}

{{% /collapse-content %}}

{{% collapse-content title="Google Cloud Storage" level="h4" %}}

{{% observability_pipelines/configure_log_archive/google_cloud_storage/instructions %}}

{{% /collapse-content %}}
{{% collapse-content title="Azure Storage" level="h4" %}}

{{% observability_pipelines/configure_log_archive/azure_storage/instructions %}}

{{% /collapse-content %}}

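The collapsed sections above contain the official provider-specific steps. Purely as an illustration, a Google Cloud Storage bucket for the archive could also be created programmatically; the project, bucket name, and location below are hypothetical, so match them to whatever your Datadog log archive actually points at:

```python
# A minimal sketch, assuming: pip install google-cloud-storage, and
# credentials that can create buckets. The project, bucket name, and
# location are hypothetical placeholders.
from google.cloud import storage

client = storage.Client(project="my-gcp-project")  # hypothetical project

# Create the bucket that the Datadog log archive writes to.
bucket = client.bucket("my-op-log-archive")        # hypothetical bucket name
bucket.storage_class = "STANDARD"
client.create_bucket(bucket, location="us-central1")

print(f"Created bucket {bucket.name}")
```
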
## Set up Observability Pipelines

1. Navigate to [Observability Pipelines][4].
1. Select the **Archive Logs** template to create a new pipeline.
1. Select **Google Pub/Sub** as the source.

### Set up the source

{{% observability_pipelines/source_settings/google_pubsub %}}

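With the source settings saved, one quick way to exercise the pipeline is to publish a test log to the topic and confirm it arrives at your destination. A minimal sketch, reusing the hypothetical topic from the prerequisites sketch:

```python
# A minimal sketch for publishing a single test log event. Assumes
# google-cloud-pubsub is installed and the hypothetical topic exists.
import json

from google.cloud import pubsub_v1

PROJECT_ID = "my-gcp-project"  # hypothetical
TOPIC_ID = "op-worker-logs"    # hypothetical

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

# The exact fields your pipeline expects depend on your processors;
# treat this payload as an example only.
event = {"message": "test log from pub/sub", "service": "demo", "status": "info"}

future = publisher.publish(topic_path, data=json.dumps(event).encode("utf-8"))
print(f"Published message ID: {future.result()}")  # blocks until publish completes
```
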
### Set up the destinations

Enter the following information based on your selected logs destination.

{{< tabs >}}
{{% tab "Datadog Archives" %}}

{{% observability_pipelines/destination_settings/datadog_archives_note %}}

Follow the instructions for the cloud provider you use to archive your logs.

{{% collapse-content title="Amazon S3" level="h5" %}}

{{% observability_pipelines/destination_settings/datadog_archives_amazon_s3 %}}

{{% /collapse-content %}}
{{% collapse-content title="Google Cloud Storage" level="h5" %}}

{{% observability_pipelines/destination_settings/datadog_archives_google_cloud_storage %}}

{{% /collapse-content %}}
{{% collapse-content title="Azure Storage" level="h5" %}}

{{% observability_pipelines/destination_settings/datadog_archives_azure_storage %}}

{{% /collapse-content %}}

{{% /tab %}}
{{% tab "Datadog" %}}

{{% observability_pipelines/destination_settings/datadog %}}

{{% /tab %}}
{{% tab "Splunk HEC" %}}

{{% observability_pipelines/destination_settings/splunk_hec %}}

{{% /tab %}}
{{% tab "Sumo Logic" %}}

{{% observability_pipelines/destination_settings/sumo_logic %}}

{{% /tab %}}
{{% tab "Syslog" %}}

{{% observability_pipelines/destination_settings/syslog %}}

{{% /tab %}}
{{% tab "Chronicle" %}}

{{% observability_pipelines/destination_settings/chronicle %}}

{{% /tab %}}
{{% tab "Elasticsearch" %}}

{{% observability_pipelines/destination_settings/elasticsearch %}}

{{% /tab %}}
{{% tab "OpenSearch" %}}

{{% observability_pipelines/destination_settings/opensearch %}}

{{% /tab %}}
{{% tab "Amazon OpenSearch" %}}

{{% observability_pipelines/destination_settings/amazon_opensearch %}}

{{% /tab %}}
{{% tab "New Relic" %}}

{{% observability_pipelines/destination_settings/new_relic %}}

{{% /tab %}}
{{< /tabs >}}

### Set up processors

{{% observability_pipelines/processors/intro %}}

{{% observability_pipelines/processors/filter_syntax %}}

{{% observability_pipelines/processors/add_processors %}}

{{< tabs >}}
{{% tab "Filter" %}}

{{% observability_pipelines/processors/filter %}}

{{% /tab %}}
{{% tab "Edit fields" %}}

{{% observability_pipelines/processors/remap %}}

{{% /tab %}}
{{% tab "Sample" %}}

{{% observability_pipelines/processors/sample %}}

{{% /tab %}}
{{% tab "Grok Parser" %}}

{{% observability_pipelines/processors/grok_parser %}}

{{% /tab %}}
{{% tab "Quota" %}}

{{% observability_pipelines/processors/quota %}}

{{% /tab %}}
{{% tab "Reduce" %}}

{{% observability_pipelines/processors/reduce %}}

{{% /tab %}}
{{% tab "Dedupe" %}}

{{% observability_pipelines/processors/dedupe %}}

{{% /tab %}}
{{% tab "Sensitive Data Scanner" %}}

{{% observability_pipelines/processors/sensitive_data_scanner %}}

{{% /tab %}}
{{% tab "Add hostname" %}}

{{% observability_pipelines/processors/add_hostname %}}

{{% /tab %}}
{{% tab "Parse JSON" %}}

{{% observability_pipelines/processors/parse_json %}}

{{% /tab %}}
{{% tab "Enrichment Table" %}}

{{% observability_pipelines/processors/enrichment_table %}}

{{% /tab %}}
{{% tab "Generate Metrics" %}}

{{% observability_pipelines/processors/generate_metrics %}}

{{% /tab %}}
{{% tab "Add env vars" %}}

{{% observability_pipelines/processors/add_env_vars %}}

{{% /tab %}}
{{< /tabs >}}

### Install the Observability Pipelines Worker
1. Select your platform in the **Choose your installation platform** dropdown menu.
1. Provide the environment variables for each of your selected destinations. See the [prerequisites](#prerequisites) for more information.
{{< tabs >}}
{{% tab "Datadog Archives" %}}

{{% collapse-content title="Amazon S3" level="h5" %}}

{{% observability_pipelines/destination_env_vars/datadog_archives_amazon_s3 %}}

{{% /collapse-content %}}
{{% collapse-content title="Google Cloud Storage" level="h5" %}}

{{% observability_pipelines/destination_env_vars/datadog_archives_google_cloud_storage %}}

{{% /collapse-content %}}
{{% collapse-content title="Azure Storage" level="h5" %}}

{{% observability_pipelines/destination_env_vars/datadog_archives_azure_storage %}}

{{% /collapse-content %}}

{{% /tab %}}
{{% tab "Datadog" %}}

{{% observability_pipelines/destination_env_vars/datadog %}}

{{% /tab %}}
{{% tab "Splunk HEC" %}}

{{% observability_pipelines/destination_env_vars/splunk_hec %}}

{{% /tab %}}
{{% tab "Sumo Logic" %}}

{{% observability_pipelines/destination_env_vars/sumo_logic %}}

{{% /tab %}}
{{% tab "Syslog" %}}

{{% observability_pipelines/destination_env_vars/syslog %}}

{{% /tab %}}
{{% tab "Chronicle" %}}

{{% observability_pipelines/destination_env_vars/chronicle %}}

{{% /tab %}}
{{% tab "Elasticsearch" %}}

{{% observability_pipelines/destination_env_vars/elasticsearch %}}

{{% /tab %}}
{{% tab "OpenSearch" %}}

{{% observability_pipelines/destination_env_vars/opensearch %}}

{{% /tab %}}
{{% tab "Amazon OpenSearch" %}}

{{% observability_pipelines/destination_env_vars/amazon_opensearch %}}

{{% /tab %}}
{{% tab "New Relic" %}}

{{% observability_pipelines/destination_env_vars/new_relic %}}

{{% /tab %}}
{{< /tabs >}}
1. Follow the instructions for your environment to install the Worker.
{{< tabs >}}
{{% tab "Docker" %}}

{{% observability_pipelines/install_worker/docker %}}

{{% /tab %}}
{{% tab "Kubernetes" %}}

{{% observability_pipelines/install_worker/kubernetes %}}

{{% /tab %}}
{{% tab "Linux (APT)" %}}

{{% observability_pipelines/install_worker/linux_apt %}}

{{% /tab %}}
{{% tab "Linux (RPM)" %}}

{{% observability_pipelines/install_worker/linux_rpm %}}

{{% /tab %}}
{{% tab "CloudFormation" %}}

{{% observability_pipelines/install_worker/cloudformation %}}

{{% /tab %}}
{{< /tabs >}}
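
Once the Worker is installed and running, one way to sanity-check that it is draining the Pub/Sub subscription is to watch the subscription's undelivered-message backlog in Cloud Monitoring. This is a minimal sketch, not part of the official instructions; it assumes the `google-cloud-monitoring` package and the same hypothetical project and subscription names used earlier. A backlog that keeps growing suggests the Worker is not consuming.

```python
# A minimal verification sketch (an assumption, not part of the official setup).
# Requires: pip install google-cloud-monitoring, plus credentials with
# monitoring read access. PROJECT_ID and SUBSCRIPTION_ID are hypothetical.
import time

from google.cloud import monitoring_v3

PROJECT_ID = "my-gcp-project"           # hypothetical
SUBSCRIPTION_ID = "op-worker-logs-sub"  # hypothetical

client = monitoring_v3.MetricServiceClient()
now = int(time.time())
interval = monitoring_v3.TimeInterval(
    {
        "end_time": {"seconds": now},
        "start_time": {"seconds": now - 600},  # look at the last 10 minutes
    }
)

# num_undelivered_messages is the count of messages waiting on the
# subscription; it should stay near zero while the Worker keeps up.
results = client.list_time_series(
    request={
        "name": f"projects/{PROJECT_ID}",
        "filter": (
            'metric.type = "pubsub.googleapis.com/subscription/num_undelivered_messages" '
            f'AND resource.labels.subscription_id = "{SUBSCRIPTION_ID}"'
        ),
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)

for series in results:
    for point in series.points:
        print(point.interval.end_time, point.value.int64_value)
```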

[1]: /es/integrations/amazon_web_services/#setup
[2]: /es/integrations/google_cloud_platform/#setup
[3]: /es/integrations/azure/#setup
[4]: https://app.datadoghq.com/observability-pipelines