Translated file updates
guacbot committed Jan 14, 2025
1 parent f82a84f commit 5a36835
Showing 11 changed files with 1,603 additions and 11 deletions.
14 changes: 7 additions & 7 deletions content/es/database_monitoring/setup_mongodb/mongodbatlas.md
@@ -212,24 +212,24 @@ You can use one of these host names to configure the Agent.

{{< tabs >}}
{{% tab "Replica Set" %}}
{{% dbm-mongodb-agent-config-replica-set %}}
{{% dbm-mongodb-agent-config-replica-set %}}
{{% /tab %}}
{{% tab "Sharded Cluster" %}}
{{% dbm-mongodb-agent-config-sharded-cluster %}}
{{% dbm-mongodb-agent-config-sharded-cluster %}}
{{% /tab %}}
{{< /tabs >}}

#### Configure the Agent

{{< tabs >}}
{{% tab "Linux host" %}}
{{% dbm-mongodb-agent-setup-linux %}}
{{% dbm-mongodb-agent-setup-linux %}}
{{% /tab %}}
{{% tab "Docker" %}}
{{% dbm-mongodb-agent-setup-docker %}}
{{% dbm-mongodb-agent-setup-docker %}}
{{% /tab %}}
{{% tab "Kubernetes" %}}
{{% dbm-mongodb-agent-setup-kubernetes %}}
{{% dbm-mongodb-agent-setup-kubernetes %}}
{{% /tab %}}
{{< /tabs >}}
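
The shared snippets above are the authoritative configuration. As a rough illustration only, a replica set instance in `mongo.d/conf.yaml` with Database Monitoring enabled looks something like the sketch below; the host, credential placeholders, and exact key placement are assumptions to verify against the snippet for your topology:

```yaml
init_config:

instances:
    # One entry per replica set member host from MongoDB Atlas.
  - hosts:
      - <HOST>.mongodb.net:27017
    username: datadog          # read-only monitoring user
    password: <PASSWORD>
    options:
      authSource: admin
    tls: true                  # Atlas requires TLS connections
    dbm: true                  # enables Database Monitoring collection
```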

@@ -245,9 +245,9 @@ For a complete list of the metrics collected by the MongoDB integration,

{{% dbm-mongodb-agent-data-collected %}}

## Configure API tests and multistep API tests
## Further reading

Record mobile application tests
{{< partial name="whats-next/whats-next.html" >}}

[1]: /es/database_monitoring/architecture/#cloud-managed-databases
[2]: /es/account_management/api-app-keys/
4 changes: 2 additions & 2 deletions content/es/getting_started/cloud_siem/_index.md
@@ -114,9 +114,9 @@ After moving the index to the first position, review the configuration and
1. Configure [suppression rules][23] to reduce noise.
2. Create [custom detection rules][24]. Read the [best practices for creating detection rules][25].

## Configure API tests and multistep API tests
## Further reading

Record mobile application tests
{{< partial name="whats-next/whats-next.html" >}}

[1]: /es/security/cloud_siem/
[2]: https://app.datadoghq.com/security/configuration/siem/log-sources
@@ -0,0 +1,334 @@
---
aliases:
- /es/observability_pipelines/set_up_pipelines/archive_logs/google_pub_sub/
disable_toc: false
title: Archive Logs for Google Pub/Sub
---

## Overview

Set up Google Pub/Sub so that the Observability Pipelines Worker formats collected logs into a format that Datadog can rehydrate before routing them to Datadog log archives.

{{% observability_pipelines/use_case_images/archive_logs %}}
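
Datadog log archives store events as gzip-compressed, newline-delimited JSON so they can be rehydrated later. As a rough illustration of that shape (the fields shown are a simplified subset, not the exact archive schema):

```python
import gzip
import json

# Simplified log events; real archive events carry reserved Datadog
# attributes (date, host, service, source, status, message) plus custom ones.
events = [
    {"date": "2025-01-14T00:00:00Z", "host": "web-1", "service": "api",
     "source": "nginx", "status": "info", "message": "GET /health 200"},
]

# Archives hold one JSON object per line, gzip-compressed.
with gzip.open("archive_chunk.json.gz", "wt", encoding="utf-8") as f:
    for event in events:
        f.write(json.dumps(event) + "\n")

# Rehydration reads the lines back into structured events.
with gzip.open("archive_chunk.json.gz", "rt", encoding="utf-8") as f:
    restored = [json.loads(line) for line in f]

print(restored[0]["service"])  # → api
```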

This document walks you through the following steps:
1. The [prerequisites](#prerequisites) needed to set up Observability Pipelines
1. [Configuring a log archive](#configure-a-log-archive)
1. [Setting up Observability Pipelines](#set-up-observability-pipelines)

## Prerequisites

{{% observability_pipelines/prerequisites/google_pubsub %}}

## Configure a log archive

If you already have a Datadog log archive configured for Observability Pipelines, skip to [Set up Observability Pipelines](#set-up-observability-pipelines).

To configure a Datadog log archive, you need to have the Datadog integration for your cloud provider installed. See the [AWS integration][1], [Google Cloud Platform][2], and [Azure integration][3] documentation for more information.

Select the cloud provider you use to archive your logs.

{{% collapse-content title="Amazon S3" level="h4" %}}
{{% observability_pipelines/configure_log_archive/amazon_s3/instructions %}}

{{< tabs >}}
{{% tab "Docker" %}}

{{% observability_pipelines/configure_log_archive/amazon_s3/docker %}}

{{% /tab %}}
{{% tab "Amazon EKS" %}}

{{% observability_pipelines/configure_log_archive/amazon_s3/amazon_eks %}}

{{% /tab %}}
{{% tab "Linux (APT)" %}}

{{% observability_pipelines/configure_log_archive/amazon_s3/linux_apt %}}

{{% /tab %}}
{{% tab "Linux (RPM)" %}}

{{% observability_pipelines/configure_log_archive/amazon_s3/linux_rpm %}}

{{% /tab %}}
{{< /tabs >}}

{{% observability_pipelines/configure_log_archive/amazon_s3/connect_s3_to_datadog_log_archives %}}

{{% /collapse-content %}}

{{% collapse-content title="Google Cloud Storage" level="h4" %}}

{{% observability_pipelines/configure_log_archive/google_cloud_storage/instructions %}}

{{% /collapse-content %}}
{{% collapse-content title="Azure Storage" level="h4" %}}

{{% observability_pipelines/configure_log_archive/azure_storage/instructions %}}

{{% /collapse-content %}}

## Set up Observability Pipelines

1. Navigate to [Observability Pipelines][4].
1. Select the **Archive Logs** template to create a new pipeline.
1. Select **Google Pub/Sub** as the source.

### Set up the source

{{% observability_pipelines/source_settings/google_pubsub %}}
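
The source settings above cover what the Worker needs. For context on the data it consumes, the sketch below decodes the standard Pub/Sub JSON push envelope, in which the log payload arrives base64-encoded in `message.data` (the project, subscription, and payload values here are made up):

```python
import base64
import json

# A Pub/Sub push delivery wraps each message in a JSON envelope like this;
# the log payload itself is base64-encoded in message.data.
envelope = {
    "message": {
        "data": base64.b64encode(b'{"message": "GET /health 200"}').decode(),
        "attributes": {"origin": "nginx"},
        "messageId": "1234567890",
    },
    "subscription": "projects/my-project/subscriptions/datadog-logs-sub",
}

# Decode the payload back into a structured log event.
payload = json.loads(base64.b64decode(envelope["message"]["data"]))
print(payload["message"])  # → GET /health 200
```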

### Set up the destinations

Enter the following information based on your selected logs destination.

{{< tabs >}}
{{% tab "Datadog Archives" %}}

{{% observability_pipelines/destination_settings/datadog_archives_note %}}

Follow the instructions for the cloud provider you use to archive your logs.

{{% collapse-content title="Amazon S3" level="h5" %}}

{{% observability_pipelines/destination_settings/datadog_archives_amazon_s3 %}}

{{% /collapse-content %}}
{{% collapse-content title="Google Cloud Storage" level="h5" %}}

{{% observability_pipelines/destination_settings/datadog_archives_google_cloud_storage %}}

{{% /collapse-content %}}
{{% collapse-content title="Azure Storage" level="h5" %}}

{{% observability_pipelines/destination_settings/datadog_archives_azure_storage %}}

{{% /collapse-content %}}

{{% /tab %}}
{{% tab "Datadog" %}}

{{% observability_pipelines/destination_settings/datadog %}}

{{% /tab %}}
{{% tab "Splunk HEC" %}}

{{% observability_pipelines/destination_settings/splunk_hec %}}

{{% /tab %}}
{{% tab "Sumo Logic" %}}

{{% observability_pipelines/destination_settings/sumo_logic %}}

{{% /tab %}}
{{% tab "Syslog" %}}

{{% observability_pipelines/destination_settings/syslog %}}

{{% /tab %}}
{{% tab "Chronicle" %}}

{{% observability_pipelines/destination_settings/chronicle %}}

{{% /tab %}}
{{% tab "Elasticsearch" %}}

{{% observability_pipelines/destination_settings/elasticsearch %}}

{{% /tab %}}
{{% tab "OpenSearch" %}}

{{% observability_pipelines/destination_settings/opensearch %}}

{{% /tab %}}
{{% tab "Amazon OpenSearch" %}}

{{% observability_pipelines/destination_settings/amazon_opensearch %}}

{{% /tab %}}
{{% tab "New Relic" %}}

{{% observability_pipelines/destination_settings/new_relic %}}

{{% /tab %}}
{{< /tabs >}}

### Set up processors

{{% observability_pipelines/processors/intro %}}

{{% observability_pipelines/processors/filter_syntax %}}

{{% observability_pipelines/processors/add_processors %}}

{{< tabs >}}
{{% tab "Filter" %}}

{{% observability_pipelines/processors/filter %}}

{{% /tab %}}
{{% tab "Edit fields" %}}

{{% observability_pipelines/processors/remap %}}

{{% /tab %}}
{{% tab "Sample" %}}

{{% observability_pipelines/processors/sample %}}

{{% /tab %}}
{{% tab "Grok Parser" %}}

{{% observability_pipelines/processors/grok_parser %}}

{{% /tab %}}
{{% tab "Quota" %}}

{{% observability_pipelines/processors/quota %}}

{{% /tab %}}
{{% tab "Reduce" %}}

{{% observability_pipelines/processors/reduce %}}

{{% /tab %}}
{{% tab "Deduplicate" %}}

{{% observability_pipelines/processors/dedupe %}}

{{% /tab %}}
{{% tab "Sensitive Data Scanner" %}}

{{% observability_pipelines/processors/sensitive_data_scanner %}}

{{% /tab %}}
{{% tab "Add hostname" %}}

{{% observability_pipelines/processors/add_hostname %}}

{{% /tab %}}
{{% tab "Parse JSON" %}}

{{% observability_pipelines/processors/parse_json %}}

{{% /tab %}}
{{% tab "Enrichment Table" %}}

{{% observability_pipelines/processors/enrichment_table %}}

{{% /tab %}}
{{% tab "Generate metrics" %}}

{{% observability_pipelines/processors/generate_metrics %}}

{{% /tab %}}
{{% tab "Add env vars" %}}

{{% observability_pipelines/processors/add_env_vars %}}

{{% /tab %}}
{{< /tabs >}}
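
The shared snippets above define each processor's actual options. As a language-agnostic illustration of what chaining a filter and a deduplicate stage does to a stream of events, here is a toy Python sketch (the field names and matching logic are simplified stand-ins, not the Worker's implementation):

```python
def filter_events(events, source):
    """Keep only events from a given source (toy stand-in for a filter query)."""
    return [e for e in events if e.get("source") == source]

def dedupe(events, fields=("message",)):
    """Drop events whose selected fields repeat an earlier event's values."""
    seen, out = set(), []
    for e in events:
        key = tuple(e.get(f) for f in fields)
        if key not in seen:
            seen.add(key)
            out.append(e)
    return out

events = [
    {"source": "nginx", "message": "GET / 200"},
    {"source": "nginx", "message": "GET / 200"},  # duplicate, dropped by dedupe
    {"source": "app", "message": "boot"},         # dropped by the filter
]
result = dedupe(filter_events(events, "nginx"))
print(len(result))  # → 1
```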

### Install the Observability Pipelines Worker
1. Select your platform in the **Choose your installation platform** dropdown menu.
1. Provide the environment variables for each of your selected destinations. See the [prerequisites](#prerequisites) for more information.
{{< tabs >}}
{{% tab "Datadog Archives" %}}

{{% collapse-content title="Amazon S3" level="h5" %}}

{{% observability_pipelines/destination_env_vars/datadog_archives_amazon_s3 %}}

{{% /collapse-content %}}
{{% collapse-content title="Google Cloud Storage" level="h5" %}}

{{% observability_pipelines/destination_env_vars/datadog_archives_google_cloud_storage %}}

{{% /collapse-content %}}
{{% collapse-content title="Azure Storage" level="h5" %}}

{{% observability_pipelines/destination_env_vars/datadog_archives_azure_storage %}}

{{% /collapse-content %}}

{{% /tab %}}
{{% tab "Datadog" %}}

{{% observability_pipelines/destination_env_vars/datadog %}}

{{% /tab %}}
{{% tab "Splunk HEC" %}}

{{% observability_pipelines/destination_env_vars/splunk_hec %}}

{{% /tab %}}
{{% tab "Sumo Logic" %}}

{{% observability_pipelines/destination_env_vars/sumo_logic %}}

{{% /tab %}}
{{% tab "Syslog" %}}

{{% observability_pipelines/destination_env_vars/syslog %}}

{{% /tab %}}
{{% tab "Chronicle" %}}

{{% observability_pipelines/destination_env_vars/chronicle %}}

{{% /tab %}}
{{% tab "Elasticsearch" %}}

{{% observability_pipelines/destination_env_vars/elasticsearch %}}

{{% /tab %}}
{{% tab "OpenSearch" %}}

{{% observability_pipelines/destination_env_vars/opensearch %}}

{{% /tab %}}
{{% tab "Amazon OpenSearch" %}}

{{% observability_pipelines/destination_env_vars/amazon_opensearch %}}

{{% /tab %}}
{{% tab "New Relic" %}}

{{% observability_pipelines/destination_env_vars/new_relic %}}

{{% /tab %}}
{{< /tabs >}}
1. Follow the instructions for your environment to install the Worker.
{{< tabs >}}
{{% tab "Docker" %}}

{{% observability_pipelines/install_worker/docker %}}

{{% /tab %}}
{{% tab "Kubernetes" %}}

{{% observability_pipelines/install_worker/kubernetes %}}

{{% /tab %}}
{{% tab "Linux (APT)" %}}

{{% observability_pipelines/install_worker/linux_apt %}}

{{% /tab %}}
{{% tab "Linux (RPM)" %}}

{{% observability_pipelines/install_worker/linux_rpm %}}

{{% /tab %}}
{{% tab "CloudFormation" %}}

{{% observability_pipelines/install_worker/cloudformation %}}

{{% /tab %}}
{{< /tabs >}}
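
For orientation, the Docker path typically reduces to a single `docker run` that passes the pipeline's credentials as environment variables. The sketch below assumes the `datadog/observability-pipelines-worker` image and these variable names; confirm both against the install instructions above:

```shell
docker run -i \
  -e DD_API_KEY=<DATADOG_API_KEY> \
  -e DD_OP_PIPELINE_ID=<PIPELINE_ID> \
  -e DD_SITE=<DATADOG_SITE> \
  -e <DESTINATION_ENV_VARS> \
  -p 8282:8282 \
  datadog/observability-pipelines-worker run
```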

[1]: /es/integrations/amazon_web_services/#setup
[2]: /es/integrations/google_cloud_platform/#setup
[3]: /es/integrations/azure/#setup
[4]: https://app.datadoghq.com/observability-pipelines
