`.github/PULL_REQUEST_TEMPLATE.md` (+1 -1)

```diff
@@ -11,7 +11,7 @@ What inspired you to submit this pull request?
 -[ ] PR has a [meaningful title](https://github.com/DataDog/integrations-core/blob/master/CONTRIBUTING.md#pull-request-title) or PR has the `no-changelog` label attached
 -[ ] Feature or bugfix has tests
 -[ ] Git history is clean
--[] If PR impacts documentation, docs team has been notified or an issue has been opened on the [documentation repo](https://github.com/DataDog/documentation/issues/new)
+-[] If PR impacts documentation, docs team has been notified or an issue has been opened on the [documentation repo](https://github.com/DataDog/documentation/issues/new)
```
`docs/dev/README.md` (+6 -6)

```diff
@@ -5,24 +5,24 @@ kind: documentation
 
 ## Why create an Integration?
 
-[Custom Checks][11] are great for occasional reporting, or in cases where the data source is either unique or very limited. For more general use-cases — such as application frameworks, open source projects, or commonly-used software — it makes sense to write an Integration.
+[Custom Checks][11] are great for occasional reporting, or in cases where the data source is either unique or very limited. For more general use-cases - such as application frameworks, open source projects, or commonly-used software - it makes sense to write an Integration.
 
-Metrics reported from accepted Integrations are not counted as custom metrics, and therefore don’t impact your custom metric allocation. (Integrations that emit potentially unlimited metrics may still be considered custom.) Ensuring native support for Datadog reduces friction to adoption, and incentivizes people to use your product, service, or project. Also, being featured within the Datadog ecosystem is a great avenue for added visibility.
+Metrics reported from accepted Integrations are not counted as custom metrics, and therefore don't impact your custom metric allocation. (Integrations that emit potentially unlimited metrics may still be considered custom.) Ensuring native support for Datadog reduces friction to adoption, and incentivizes people to use your product, service, or project. Also, being featured within the Datadog ecosystem is a great avenue for added visibility.
 
-### What’s the process?
+### What's the process?
 
 The initial goal is to generate some code that collects the desired metrics in a reliable way, and to ensure that the general Integration framework is in place. Start by writing the basic functionality as a custom Check, then fill in the framework details from the [Create an Integration documentation][10].
 
-Next, open a pull request against the [integrations-extras repository][6]. This signals to Datadog that you’re ready to start reviewing code together. Don’t worry if you have questions about tests, Datadog internals, or other topics — the Integrations team is ready to help, and the pull request is a good place to go over those concerns. Be sure to take advantage of the [Community Office Hours][12] as well!
+Next, open a pull request against the [integrations-extras repository][6]. This signals to Datadog that you're ready to start reviewing code together. Don't worry if you have questions about tests, Datadog internals, or other topics - the Integrations team is ready to help, and the pull request is a good place to go over those concerns. Be sure to take advantage of the [Community Office Hours][12] as well!
 
 Once the Integration has been validated (functionality, framework compliance, and general code quality) it will be merged into Extras. Once there, it becomes part of the Datadog ecosystem. Congratulations!
 
 ### What are your responsibilities?
 
-Going forward, you — as the author of the code — are now the active maintainer of the Integration. You’re responsible for maintaining the code and ensuring the Integration’s functionality. There is no specific time commitment, but we do ask that you only agree to become a maintainer if you feel that you can take care of the code for the foreseeable future. Datadog extends support on a best-effort basis for Extras, so you won’t be on your own!
+Going forward, you - as the author of the code - are now the active maintainer of the Integration. You're responsible for maintaining the code and ensuring the Integration's functionality. There is no specific time commitment, but we do ask that you only agree to become a maintainer if you feel that you can take care of the code for the foreseeable future. Datadog extends support on a best-effort basis for Extras, so you won't be on your own!
 
 ## Let's get started!
 
-All of the details—including prerequisites, code examples, and more—are in the [Create a new Integration][10] documentation.
+All of the details-including prerequisites, code examples, and more-are in the [Create a new Integration][10] documentation.
```
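The "custom Check first" step this file describes is concrete enough to sketch. Below is a minimal example of the kind of starting point it suggests, assuming the `datadog_checks` base package; the class name, metric name, and tag are hypothetical placeholders:

```python
# A minimal custom Check as a starting point, assuming the
# datadog_checks base package; the class, metric, and tag names
# below are hypothetical placeholders.
from datadog_checks.base import AgentCheck


class MyProductCheck(AgentCheck):
    def check(self, instance):
        # Query your data source here, then submit the values.
        # gauge() reports a point-in-time value under the given name.
        self.gauge('myproduct.connections.active', 42, tags=['env:dev'])
```

Once something like this runs under the Agent, the framework details from the linked documentation (manifest, metadata, config templates) round it out into a full Integration.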
`docs/dev/python.md` (+2 -2)

```diff
@@ -19,7 +19,7 @@ Any recent version of macOS comes with Python pre-installed; however, it might b
 
 #### Option 1: Install Python with Homebrew
 
-[`Homebrew`][3] is a package manager for macOS that makes a lot easier installing software on macOS, specially from the command line. Follow the "Doing it Right" instructions in [the Hitchhiker’s Guide to Python][4].
+[`Homebrew`][3] is a package manager for macOS that makes a lot easier installing software on macOS, specially from the command line. Follow the "Doing it Right" instructions in [the Hitchhiker's Guide to Python][4].
 
 #### Option 2: Install Python with miniconda
 
@@ -41,7 +41,7 @@ Each integration has its own set of dependencies that must be added to Python in
 
 ### Virtualenv and Virtualenvwrapper
 
-We recommend using [Virtualenv][8] to manage Python virtual environments, and [virtualenvwrapper][9] to make the process smoother. There's a [comprehensive guide][10] in the Hitchhiker’s Guide to Python describing how to set up these two tools.
+We recommend using [Virtualenv][8] to manage Python virtual environments, and [virtualenvwrapper][9] to make the process smoother. There's a [comprehensive guide][10] in the Hitchhiker's Guide to Python describing how to set up these two tools.
```
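For reference, the setup both hunks point to usually boils down to a few commands. A sketch, assuming Homebrew is already installed and the virtualenvwrapper script lives in its default location (paths and environment names vary per machine):

```sh
# Install Python via Homebrew (the "Doing it Right" route).
brew install python

# Install virtualenv and virtualenvwrapper, then load the wrapper.
pip install virtualenv virtualenvwrapper
export WORKON_HOME=~/.virtualenvs
source /usr/local/bin/virtualenvwrapper.sh   # path may vary

# Create and activate an isolated environment for integration work.
mkvirtualenv my-integration
workon my-integration
```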
`docs/proposals/checks_as_wheels.md` (+18 -18)

```diff
@@ -12,13 +12,13 @@ coincide with the name of the module) and the path to the folder containing the
 source file. See https://github.com/DataDog/dd-agent/blob/5.14.x/config.py#L830.
 
 When checks were moved in a separate repo to implement the concept of
-“Integrations SDK”, to minimize the work on the Agent while still support this
+"Integrations SDK", to minimize the work on the Agent while still support this
 new way of distributing checks, the same logic was kept and now such Python
 modules are allowed to live in a different directory.
 
 ## Problem
 
-At the moment, the Agent package contains all the checks marked as “core” and
+At the moment, the Agent package contains all the checks marked as "core" and
 officially maintained by Datadog, available at https://github.com/DataDog/integrations-core.
 After being decoupled from the Agent, a check can now be installed as a separate
 package and picked up by the Agent at runtime, taking precedence over the one
@@ -33,15 +33,15 @@ installer).
 The current implementation of Integrations SDK exposes the Agent to a set of
 different problems, part of them are legacy and were already present in the
 Agent before moving out the checks but part of them were introduced in the
-process of implementing the SDK, let’s see few examples.
+process of implementing the SDK, let's see few examples.
 
 
 ### Building
 
 At this moment, if contributors want to build a patched version of a check to
 use within their infrastructure, they have to either replicate our own build
 pipeline or create their own. The latter is way simpler than the former but when
-choosing this path you’re basically alone, being able to reuse very little code
+choosing this path you're basically alone, being able to reuse very little code
 and tools from the agent codebase which is strongly focused on supporting our
 own build pipeline. This also affects the development process: since a check is
 not meant to live without an agent, the only way you have to test a modified
@@ -51,10 +51,10 @@ agent - both strategies carry on a long list of issues.
 ### Versioning
 
 Despite having separated the checks from the agent, the two codebases are still
-strongly coupled when it comes to versioning. Checks are supposed to be released in standalone mode, without a corresponding agent release, and in a manual fashion: we decide when a new release for a check is needed and we trigger the release process. This means we might have changes piling up on `master` between one release and another which is fine but doesn’t play well when an agent
+strongly coupled when it comes to versioning. Checks are supposed to be released in standalone mode, without a corresponding agent release, and in a manual fashion: we decide when a new release for a check is needed and we trigger the release process. This means we might have changes piling up on `master` between one release and another which is fine but doesn't play well when an agent
 release falls in the middle: when we release the agent, we embed all the checks
 we find on `master` in `integrations-core`, leaving the check in an inconsistent
-state: “not released as standalone but released with an agent”. A workaround to
+state: "not released as standalone but released with an agent". A workaround to
 this would be forcing a standalone release for all the checks in `integrations-core`
 when we release an agent, (a fix is in place starting with 5.15) but the
 process is time consuming and not straightforward.
@@ -70,7 +70,7 @@ dependencies but we implement the logic elsewhere.
 ### User experience: final user
 
 At the moment we are exposed to weird corner cases when it comes to the point of
-installing checks in standalone mode, let’s see an example:
+installing checks in standalone mode, let's see an example:
 
 * User installs agent 5.0.0 shipping ntp check 1.0.0
```
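The module-lookup convention this proposal builds on (check name matches module name, resolved against an ordered list of directories, with a standalone install winning over the bundled copy) can be sketched roughly as follows. This is an illustration, not the actual dd-agent code from the linked `config.py`, and the paths in the usage line are hypothetical:

```python
import os

# Rough illustration of the lookup described in the proposal, not the
# actual dd-agent code: the check name must match the module file name,
# and earlier directories win, which is how a separately installed
# check can take precedence over the one bundled with the Agent.
def find_check_module(check_name, search_paths):
    for directory in search_paths:
        candidate = os.path.join(directory, check_name + '.py')
        if os.path.isfile(candidate):
            return candidate
    return None

# Hypothetical search order: user-installed checks first, bundled second.
print(find_check_module('ntp', ['/etc/dd-agent/checks.d',
                                '/opt/datadog-agent/agent/checks.d']))
```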
`elastic/README.md` (+2 -2)

```diff
@@ -37,7 +37,7 @@ instances:
 
 **Note**:
 
-* If you're collecting Elasticsearch metrics from just one Datadog Agent running outside the cluster — e.g. if you use a hosted Elasticsearch — set `cluster_stats` to true.
+* If you're collecting Elasticsearch metrics from just one Datadog Agent running outside the cluster - e.g. if you use a hosted Elasticsearch - set `cluster_stats` to true.
 
 * To use the Agent's Elasticsearch integration for the AWS Elasticsearch services, set the `url` parameter to point to your AWS Elasticsearch stats URL.
 
@@ -89,7 +89,7 @@ See [metadata.csv][6] for a list of metrics provided by this integration.
 
 ### Events
 
-The Elasticsearch check emits an event to Datadog each time the overall status of your Elasticsearch cluster changes — red, yellow, or green.
+The Elasticsearch check emits an event to Datadog each time the overall status of your Elasticsearch cluster changes - red, yellow, or green.
```
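Putting the two notes in the first hunk together, a minimal `instances` entry for a hosted/AWS cluster monitored by a single Agent running outside it might look like the sketch below; the endpoint is a hypothetical placeholder:

```yaml
instances:
  # Hypothetical AWS Elasticsearch stats endpoint; with a single Agent
  # running outside the cluster, enable cluster_stats per the note above.
  - url: https://search-my-domain.us-east-1.es.amazonaws.com
    cluster_stats: true
```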