
Commit

add migrated files (#12905)
colleenmcginnis authored Feb 27, 2025
1 parent fc127eb commit 10a6d56
Showing 57 changed files with 5,679 additions and 0 deletions.
508 changes: 508 additions & 0 deletions docs/docset.yml

Large diffs are not rendered by default.

37 changes: 37 additions & 0 deletions docs/extend/_publish_an_integration.md
---
mapped_pages:
- https://www.elastic.co/guide/en/integrations-developer/current/_publish_an_integration.html
---

# Publish an integration [_publish_an_integration]

When your integration is done, it’s time to open a PR to include it in the integrations repository. Before opening your PR, run:

```bash
elastic-package check
```

The `check` command ensures the package is built correctly, formatted properly, and aligned with the spec. Passing the `check` command is required before adding your integration to the repository.

When CI is happy, merge your PR into the integrations repository.

CI will kick off a build job for the main branch, which can release your integration to the package-storage. This job opens a PR to the Package Storage/snapshot with the built integration, but only if the package version doesn't already exist in the storage (that is, it hasn't been released yet).


## Promote [_promote]

Now that you’ve tested your integration with {{kib}}, it’s time to promote it to staging or production. Run:

```bash
elastic-package promote
```

The tool will open two pull requests against the package-storage: one to promote the package to the target branch and one to delete it from the source branch.

Review both pull requests yourself, check that CI is happy, and merge them: first target, then source. Once either PR is merged, CI will kick off a job to bake a new Docker image of package-storage (tracking). Ideally, the "delete" PR should be merged only once the CI job for the "promote" PR is done, as the Docker image of the previous stage depends on the latter one.

::::{tip}
When you are ready for your changes in the integration to be released, remember to bump up the package version. It is up to you, as the package developer, to decide how many changes you want to release in a single version. For example, you could implement a change in a PR and bump up the package version in the same PR. Or you could implement several changes across multiple pull requests and then bump up the package version in the last of these pull requests or in a separate follow up PR.
::::
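For reference, the package version lives in `manifest.yml` and each bump is recorded in the package's `changelog.yml`. A minimal, illustrative entry is sketched below; the description and PR link are placeholders, and the `version` must match the one declared in `manifest.yml`:

```yaml
- version: "1.1.0"
  changes:
    - description: Collect TLS handshake metrics
      type: enhancement
      link: https://github.com/elastic/integrations/pull/1
```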


43 changes: 43 additions & 0 deletions docs/extend/add-data-stream.md
---
mapped_pages:
- https://www.elastic.co/guide/en/integrations-developer/current/add-a-data-stream.html
---

# Add a data stream [add-a-data-stream]

A data stream is a logical sub-division of an integration package, dealing with a specific observable aspect of the service or product being observed. For example, the [Apache integration](https://github.com/elastic/integrations/tree/main/packages/apache) has three data streams, each represented by a separate folder of assets in the `data_stream` directory:

```text
apache
└───data_stream
│ └───access
│ └───error
│ └───status
```

::::{admonition}
**Data streams** allow you to store time series data across multiple indices while giving you a single named resource for requests.

A data stream defines multiple {{es}} assets, like index templates, ingest pipelines, and field definitions. These assets are loaded into {{es}} when a user installs an integration using the {{fleet}} UI in {{kib}}.

A data stream also defines a policy template. Policy templates include variables that allow users to configure the data stream using the {{fleet}} UI in {{kib}}. Then, the {{agent}} interprets the resulting policy to collect relevant information from the product or service being observed. Policy templates can also define an integration’s supported [`deployment_modes`](/extend/define-deployment-modes.md#deployment_modes).

See [data streams](docs-content://reference/ingestion-tools/fleet/data-streams.md) for more information.

::::


Bootstrap a new data stream using the TUI wizard. In the directory of your package, run:

```bash
elastic-package create data-stream
```

Follow the prompts to name, title, and select your data stream type. Then, run this command each time you add a new data stream to your integration.

Next, manually adjust the data stream:

* define required variables
* define used fields
* define ingest pipeline definitions (if necessary)
* update the {{agent}}'s stream configuration
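As a sketch of the first of these steps, required variables are declared in the data stream's `manifest.yml`. The stream, variable names, and defaults below are illustrative, not taken from a real package:

```yaml
title: Access logs
type: logs
streams:
  - input: logfile
    title: Access logs
    description: Collect access logs.
    vars:
      - name: paths
        type: text
        title: Paths
        multi: true
        required: true
        show_user: true
        default:
          - /var/log/access.log*
```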
127 changes: 127 additions & 0 deletions docs/extend/add-mapping.md
---
mapped_pages:
- https://www.elastic.co/guide/en/integrations-developer/current/add-a-mapping.html
---

# Edit field mappings [add-a-mapping]

Ingest pipelines create fields in an {{es}} index, but don’t define the fields themselves. Instead, each field requires a defined data type or mapping.

::::{admonition}
**Mapping** is the process of defining how a document, and the fields it contains, are stored and indexed. Each document is a collection of fields, each having its own data type. When mapping your data, create a mapping definition containing a list of fields pertinent to the document. A mapping definition also includes metadata fields, like the `_source` field, which customize how the associated metadata of a document is handled.

To learn more, see [mapping](docs-content://manage-data/data-store/mapping.md).

::::


In the integration, the `fields` directory serves as the blueprint used to create component templates for the integration. The content from all files in this directory will be unified when the integration is built, so the mappings need to be unique per data stream dataset.

Like ingest pipelines, mappings apply only to the data stream dataset, which in our example is the `apache.access` dataset.

::::{note}
The names of these files are conventions; any file name with a `.yml` extension will work.
::::

Integrations have had significant enhancements in how ECS fields are defined. Below is a guide on which approach to use, based on the version of Elastic your integration will support.

1. **ECS mappings component template (>=8.13.0)**: Integrations **only** supporting version 8.13.0 and later can use the [ecs@mappings](https://github.com/elastic/elasticsearch/blob/c2a3ec42632b0339387121efdef13f52c6c66848/x-pack/plugin/core/template-resources/src/main/resources/ecs%40mappings.json) component template installed by {{fleet}}. This makes explicitly declaring ECS fields unnecessary; the `ecs@mappings` component template in {{es}} will automatically detect and configure them. However, should ECS fields be explicitly defined, they will overwrite the dynamic mapping provided by the `ecs@mappings` component template. They can also be imported with an `external` declaration, as seen in the example below.

2. **Dynamic mappings imports (<8.13.0 and >=8.13.0)**: Integrations supporting versions of the {{stack}} below 8.13.0 can still dynamically import ECS field mappings by defining `import_mappings: true` in the ECS section of the `_dev/build/build.yml` file in the root of the package directory. This introduces a [dynamic mapping](https://github.com/elastic/elastic-package/blob/f439b96a74c27c5adfc3e7810ad584204bfaf85d/internal/builder/_static/ecs_mappings.yaml) with most of the ECS definitions. With this method, just like the previous approach, ECS fields don't need to be defined in your integration; they are dynamically integrated into the package at build time. Explicitly defined ECS fields can still be used and will also overwrite this mechanism.

An example of the aforementioned `build.yml` file for this method:

```yaml
dependencies:
  ecs:
    reference: git@v8.6.0
    import_mappings: true
```
3. **Explicit ECS mappings**: As mentioned in the previous two approaches, ECS mappings can still be set explicitly and will overwrite the dynamic mappings. This can be done in two ways:

    * Using an `external: ecs` reference to import the definition of a specific field.
    * Literally defining the ECS field.

The `external: ecs` definition instructs the `elastic-package` command line tool to resolve specific fields against an external ECS reference. By default, it looks at the [ECS reference](https://raw.githubusercontent.com/elastic/ecs/v8.6.0/generated/ecs/ecs_nested.yml) file hosted on GitHub. This external reference file is determined by a Git reference in the `_dev/build/build.yml` file, in the root of the package directory. A `build.yml` file set up for external references:


```yaml
dependencies:
  ecs:
    reference: git@v8.6.0
```

Literal definition of an ECS field:

```yaml
- name: cloud.account.id
  level: extended
  type: keyword
  ignore_above: 1024
  description: 'The cloud account or organ....'
  example: 43434343
```

4. **Local ECS reference file (air-gapped setup)**: By changing the Git reference in `_dev/build/build.yml` to the path of the downloaded [ECS reference](https://raw.githubusercontent.com/elastic/ecs/v8.6.0/generated/ecs/ecs_nested.yml) file, the `elastic-package` command line tool can look for this file locally. Note that the path should be the full path to the reference file. With this change, the `build.yml` file looks like:

```yaml
dependencies:
  ecs:
    reference: file:///home/user/integrations/packages/apache/ecs_nested.yml
```


The `access` data stream dataset of the Apache integration has four different field definitions:

::::{note}
The `apache` integration below has not yet been updated to use the dynamic ECS field definition and uses `external` references to define ECS fields in `ecs.yml`.
::::

```text
apache
└───data_stream
│ └───access
│ │ └───elasticsearch/ingest_pipeline
│ │ │ default.yml
│ │ └───fields
│ │ agent.yml
│ │ base-fields.yml
│ │ ecs.yml
│ │ fields.yml
│ └───error
│ │ └───elasticsearch/ingest_pipeline
│ │ │ default.yml
│ │ └───fields
│ │ agent.yml
│ │ base-fields.yml
│ │ ecs.yml
│ │ fields.yml
│ └───status
```

## agent.yml [_agent_yml]

The `agent.yml` file defines fields used by default processors, for example `cloud.account.id`, `container.id`, and `input.type`.
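As an illustrative sketch, such a file commonly mixes `external: ecs` references for ECS fields with literal definitions for non-ECS fields; the exact field list and the `input.type` description below are assumptions, not taken from the real Apache package:

```yaml
- name: cloud.account.id
  external: ecs
- name: container.id
  external: ecs
- name: input.type
  type: keyword
  description: Type of Elastic Agent input.
```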


## base-fields.yml [_base_fields_yml]

In this file, the `data_stream` subfields `type`, `dataset`, and `namespace` are defined as type `constant_keyword`; the values for these fields are added by the integration. The `event.module` and `event.dataset` fields are defined with a fixed value specific to this integration:

* `event.module: apache`
* `event.dataset: apache.access`

The `@timestamp` field is defined here as type `date`.
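Put together, a `base-fields.yml` for the `apache.access` dataset could look like the following sketch (descriptions are illustrative):

```yaml
- name: data_stream.type
  type: constant_keyword
  description: Data stream type.
- name: data_stream.dataset
  type: constant_keyword
  description: Data stream dataset.
- name: data_stream.namespace
  type: constant_keyword
  description: Data stream namespace.
- name: event.module
  type: constant_keyword
  description: Event module.
  value: apache
- name: event.dataset
  type: constant_keyword
  description: Event dataset.
  value: apache.access
- name: '@timestamp'
  type: date
  description: Event timestamp.
```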


## fields.yml [_fields_yml]

Here we define the fields needed in our integration that are not found in ECS. The example below defines the field `apache.access.ssl.protocol` in the Apache integration.

```yaml
- name: apache.access
type: group
fields:
- name: ssl.protocol
type: keyword
description: |
SSL protocol version.
```

Learn more about fields in the [general guidelines](/extend/general-guidelines.md#_document_all_fields).
64 changes: 64 additions & 0 deletions docs/extend/asset-testing.md
---
mapped_pages:
- https://www.elastic.co/guide/en/integrations-developer/current/asset-testing.html
---

# Asset testing [asset-testing]

Elastic Packages define assets to be loaded into {{es}} and {{kib}}. Asset loading tests install a package to ensure that its assets are loaded into {{es}} and {{kib}} as expected.


## Conceptual process [asset-testing-concepts]

Conceptually, running an asset load test involves the following steps:

1. Build the package.
2. Deploy {{es}}, {{kib}}, and the {{package-registry}} (all of which are part of the {{stack}}). This step takes time, so you should typically do it once as a prerequisite to running asset loading tests on multiple packages.
3. Install the package.
4. Use various {{kib}} and {{es}} APIs to confirm that the package assets were loaded into {{kib}} and {{es}} as expected.
5. Remove the package.


## Define an asset loading test [define-asset-test]

As a package developer, there is no work required to define an asset loading test for your package. All the necessary information is contained in the package files.


## Run an asset loading test [running-asset-test]

First, you must build your package. This step corresponds to step 1 in the [Conceptual process](#asset-testing-concepts) section.

Navigate to the root folder of the package, or any sub-folder under it, and run the following command.

```bash
elastic-package build
```

Next, deploy {{es}}, {{kib}}, and the {{package-registry}}. This step corresponds to step 2 in the [Conceptual process](#asset-testing-concepts) section.

```bash
elastic-package stack up -d
```

To view a list of the available options for this command, run `elastic-package stack up -h` or `elastic-package help stack up`.

Next, set the environment variables that are required for additional `elastic-package` commands.

```bash
$(elastic-package stack shellinit)
```

Next, invoke the asset loading test runner. This step corresponds to steps 3 to 5 in the [Conceptual process](#asset-testing-concepts) section.

Navigate to the root folder of the package, or any sub-folder under it, and run the following command.

```bash
elastic-package test asset
```

Finally, when all the asset loading tests have completed, bring down the {{stack}}. This step tears down the environment deployed in step 2 of the [Conceptual process](#asset-testing-concepts) section.

```bash
elastic-package stack down
```

23 changes: 23 additions & 0 deletions docs/extend/build-create-package.md
---
mapped_pages:
- https://www.elastic.co/guide/en/integrations-developer/current/build-create-package.html
---

# Create a new package [build-create-package]

Rather than copying the source of an existing package, we recommend using the `elastic-package create` command to build a new package. Running this command ensures that your integration follows the latest recommendations for the package format.

Use the `elastic-package` TUI wizard to bootstrap a new package:

```bash
elastic-package create package
```

The wizard walks you through the creation of the package, including setting a package name, version, category, etc. When the wizard completes, you’ll have a basic package complete with a sample manifest, changelog, documentation, and screenshot.
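For orientation, the generated layout typically resembles the sketch below (drawn in the same style as the trees elsewhere in these docs); exact contents depend on your wizard answers:

```text
my_package
└───changelog.yml
└───manifest.yml
└───docs
│       README.md
└───img
```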

::::{note}
It may not do anything yet, but your integration can be built and loaded into your locally running package registry from this step forward. Jump to [Build](/extend/build-it.md) at any point in this documentation to take your integration for a test run.

::::


25 changes: 25 additions & 0 deletions docs/extend/build-it.md
---
mapped_pages:
- https://www.elastic.co/guide/en/integrations-developer/current/build-it.html
---

# Build [build-it]

To format, lint, and build your integration, in that order, run:

```bash
elastic-package check
```

Problems and potential solutions will display in the console. Fix them and rerun the command. Alternatively, skip formatting and linting with the `build` command:

```bash
elastic-package build
```

With the package built, run the following command from inside the integration directory to recycle the package-registry Docker container. This refreshes the {{fleet}} UI, allowing it to pick up the new integration in {{kib}}.

```bash
elastic-package stack up --services package-registry
```

38 changes: 38 additions & 0 deletions docs/extend/build-new-integration.md
---
mapped_pages:
- https://www.elastic.co/guide/en/integrations-developer/current/build-a-new-integration.html
---

# Build an integration [build-a-new-integration]

Ready to monitor, ingest, and visualize something? Let’s get started.

* [Overview and prerequisites](/extend/build-overview.md)
* [Spin up the {{stack}}](/extend/build-spin-stack.md)
* [Create a new package](/extend/build-create-package.md)
* [Add a data stream](/extend/add-data-stream.md)
* [Define deployment modes](/extend/define-deployment-modes.md)
* [Edit ingest pipelines](/extend/edit-ingest-pipeline.md)
* [Edit field mappings](/extend/add-mapping.md)
* [Create and export dashboards](/extend/create-dashboards.md)
* [Testing and validation](/extend/testing-validation.md)
* [Finishing touches](/extend/finishing-touches.md)
* [Tips for building integrations](/extend/tips-for-building.md)

::::{tip}
Familiar with the {{stack}} and just want a quick way to get started? See [*Quick start: Sample integration*](/extend/quick-start.md).
::::


14 changes: 14 additions & 0 deletions docs/extend/build-overview.md
---
mapped_pages:
- https://www.elastic.co/guide/en/integrations-developer/current/build-overview.html
---

# Overview and prerequisites [build-overview]

Before building an integration, you should have an understanding of the following:

* {{stack}} concepts, like data streams, ingest pipelines, and mappings
* The [*Package specification*](/extend/package-spec.md)

In addition, you must have [`elastic-package`](/extend/elastic-package.md) installed on your machine. Using `elastic-package` is recommended for integration maintainers as it provides crucial utilities and scripts for building out integrations.
