Commit ad822fc

Author: Gus
Removing non-ASCII characters (DataDog#2015)
1 parent 6330850 commit ad822fc
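A sweep like this is easier to make with tooling than by eye. Below is a minimal sketch (hypothetical, not part of this PR) of a script that locates non-ASCII characters in text files so they can be reviewed and replaced:

```python
# Hypothetical helper (not part of this commit): report the position and
# code point of every non-ASCII character in the given files.
import sys
from pathlib import Path

def find_non_ascii(text):
    """Yield (line_number, column, char) for every non-ASCII character."""
    for lineno, line in enumerate(text.splitlines(), start=1):
        for col, ch in enumerate(line, start=1):
            if ord(ch) > 127:
                yield lineno, col, ch

if __name__ == "__main__":
    for path in sys.argv[1:]:
        text = Path(path).read_text(encoding="utf-8")
        for lineno, col, ch in find_non_ascii(text):
            print(f"{path}:{lineno}:{col}: U+{ord(ch):04X} {ch!r}")
```

Run against a checkout, the output points at exactly the kind of characters this diff replaces (curly apostrophes, em-dashes, micro signs, non-breaking spaces).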

File tree

28 files changed, +106 -106 lines changed


.github/PULL_REQUEST_TEMPLATE.md (+1 -1)

@@ -11,7 +11,7 @@ What inspired you to submit this pull request?
 - [ ] PR has a [meaningful title](https://github.com/DataDog/integrations-core/blob/master/CONTRIBUTING.md#pull-request-title) or PR has the `no-changelog` label attached
 - [ ] Feature or bugfix has tests
 - [ ] Git history is clean
-- [ ] If PR impacts documentation, docs team has been notified or an issue has been opened on the [documentation repo](https://github.com/DataDog/documentation/issues/new)
+- [ ] If PR impacts documentation, docs team has been notified or an issue has been opened on the [documentation repo](https://github.com/DataDog/documentation/issues/new)
 
 ### Additional Notes

activemq/README.md (+1 -1)

@@ -80,7 +80,7 @@ The check collects metrics via JMX, so you need a JVM on each node so the Agent
         metric_type: gauge
     MemoryPercentUsage:
       alias: activemq.broker.memory_pct
-        metric_type: gauge
+        metric_type: gauge
 ```
 
 3. [Restart the agent][109]
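The replacements in this commit follow one pattern: each non-ASCII character gets an ASCII stand-in (curly apostrophes become straight quotes, µ becomes u, non-breaking spaces become plain spaces). A sketch of such a substitution table using Python's `str.translate`; the character set shown is illustrative, not necessarily the exact set this commit handled:

```python
# Illustrative ASCII-substitution table; the exact characters handled by
# this commit may differ.
ASCII_SUBSTITUTIONS = str.maketrans({
    "\u2018": "'",   # left single curly quote
    "\u2019": "'",   # right single curly quote (apostrophe)
    "\u201c": '"',   # left double curly quote
    "\u201d": '"',   # right double curly quote
    "\u2013": "-",   # en dash
    "\u2014": "-",   # em dash
    "\u00b5": "u",   # micro sign, e.g. 59.344µs -> 59.344us
    "\u00a0": " ",   # non-breaking space
})

def to_ascii(text):
    """Replace known non-ASCII characters with ASCII equivalents."""
    return text.translate(ASCII_SUBSTITUTIONS)
```

For example, `to_ascii("don\u2019t")` yields `"don't"`, the same rewrite this diff applies to the docs below.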

consul/README.md (+7 -7)

@@ -107,13 +107,13 @@ Reload the Consul Agent to start sending more Consul metrics to DogStatsD.
 **Note**: If your Consul nodes have debug logging enabled, you'll see the Datadog Agent's regular polling in the Consul log:
 
 ```
-2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/status/leader (59.344µs) from=127.0.0.1:53768
-2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/status/peers (62.678µs) from=127.0.0.1:53770
-2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/health/state/any (106.725µs) from=127.0.0.1:53772
-2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/catalog/services (79.657µs) from=127.0.0.1:53774
-2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/health/service/consul (153.917µs) from=127.0.0.1:53776
-2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/coordinate/datacenters (71.778µs) from=127.0.0.1:53778
-2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/coordinate/nodes (84.95µs) from=127.0.0.1:53780
+2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/status/leader (59.344us) from=127.0.0.1:53768
+2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/status/peers (62.678us) from=127.0.0.1:53770
+2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/health/state/any (106.725us) from=127.0.0.1:53772
+2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/catalog/services (79.657us) from=127.0.0.1:53774
+2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/health/service/consul (153.917us) from=127.0.0.1:53776
+2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/coordinate/datacenters (71.778us) from=127.0.0.1:53778
+2017/03/27 21:38:12 [DEBUG] http: Request GET /v1/coordinate/nodes (84.95us) from=127.0.0.1:53780
 ```
 
 #### Consul Agent to DogStatsD

docker_daemon/README.md (+2 -2)

@@ -70,7 +70,7 @@ docker run -d --name dd-agent \
   -v /proc/:/host/proc/:ro \
   -v /cgroup/:/host/sys/fs/cgroup:ro \
   -e API_KEY={YOUR API KEY} \
-  datadog/docker-dd-agent:latest
+  datadog/docker-dd-agent:latest
 ```
 
 #### Alpine Linux based container
@@ -85,7 +85,7 @@ docker run -d --name dd-agent \
   -v /proc/:/host/proc/:ro \
   -v /sys/fs/cgroup/:/host/sys/fs/cgroup:ro \
   -e API_KEY={YOUR API KEY} \
-  datadog/docker-dd-agent:latest-alpine
+  datadog/docker-dd-agent:latest-alpine
 ```
 
 #### Image versioning

docs/dev/README.md (+6 -6)

@@ -5,24 +5,24 @@ kind: documentation
 
 ## Why create an Integration?
 
-[Custom Checks][11] are great for occasional reporting, or in cases where the data source is either unique or very limited. For more general use-cases—such as application frameworks, open source projects, or commonly-used software—it makes sense to write an Integration.
+[Custom Checks][11] are great for occasional reporting, or in cases where the data source is either unique or very limited. For more general use-cases - such as application frameworks, open source projects, or commonly-used software - it makes sense to write an Integration.
 
-Metrics reported from accepted Integrations are not counted as custom metrics, and therefore don’t impact your custom metric allocation. (Integrations that emit potentially unlimited metrics may still be considered custom.) Ensuring native support for Datadog reduces friction to adoption, and incentivizes people to use your product, service, or project. Also, being featured within the Datadog ecosystem is a great avenue for added visibility.
+Metrics reported from accepted Integrations are not counted as custom metrics, and therefore don't impact your custom metric allocation. (Integrations that emit potentially unlimited metrics may still be considered custom.) Ensuring native support for Datadog reduces friction to adoption, and incentivizes people to use your product, service, or project. Also, being featured within the Datadog ecosystem is a great avenue for added visibility.
 
-### What’s the process?
+### What's the process?
 The initial goal is to generate some code that collects the desired metrics in a reliable way, and to ensure that the general Integration framework is in place. Start by writing the basic functionality as a custom Check, then fill in the framework details from the [Create an Integration documentation][10].
 
-Next, open a pull request against the [integrations-extras repository][6]. This signals to Datadog that you’re ready to start reviewing code together. Don’t worry if you have questions about tests, Datadog internals, or other topics—the Integrations team is ready to help, and the pull request is a good place to go over those concerns. Be sure to take advantage of the [Community Office Hours][12] as well!
+Next, open a pull request against the [integrations-extras repository][6]. This signals to Datadog that you're ready to start reviewing code together. Don't worry if you have questions about tests, Datadog internals, or other topics - the Integrations team is ready to help, and the pull request is a good place to go over those concerns. Be sure to take advantage of the [Community Office Hours][12] as well!
 
 Once the Integration has been validated (functionality, framework compliance, and general code quality) it will be merged into Extras. Once there, it becomes part of the Datadog ecosystem. Congratulations!
 
 ### What are your responsibilities?
 
-Going forward, you—as the author of the code—are now the active maintainer of the Integration. You’re responsible for maintaining the code and ensuring the Integration’s functionality. There is no specific time commitment, but we do ask that you only agree to become a maintainer if you feel that you can take care of the code for the foreseeable future. Datadog extends support on a best-effort basis for Extras, so you won’t be on your own!
+Going forward, you - as the author of the code - are now the active maintainer of the Integration. You're responsible for maintaining the code and ensuring the Integration's functionality. There is no specific time commitment, but we do ask that you only agree to become a maintainer if you feel that you can take care of the code for the foreseeable future. Datadog extends support on a best-effort basis for Extras, so you won't be on your own!
 
 ## Let's get started!
 
-All of the details—including prerequisites, code examples, and more—are in the [Create a new Integration][10] documentation.
+All of the details-including prerequisites, code examples, and more-are in the [Create a new Integration][10] documentation.
 
 [1]: https://docs.datadoghq.com/developers/metrics/
 [6]: https://github.com/DataDog/integrations-extras

docs/dev/new_check_howto.md (+15 -15)

@@ -51,19 +51,19 @@ Answer the questions when prompted. Once done, you should end up with something
 ├── MANIFEST.in
 ├── README.md
 ├── datadog_checks
-    ├── __init__.py
-    └── foo_check
-        └── data
-            └── conf.yaml.example
-        ├── __about__.py
-        ├── __init__.py
-        └── foo_check.py
+    ├── __init__.py
+    └── foo_check
+        └── data
+            └── conf.yaml.example
+        ├── __about__.py
+        ├── __init__.py
+        └── foo_check.py
 ├── images
-    └── snapshot.png
+    └── snapshot.png
 ├── logos
-    ├── avatars-bot.png
-    ├── saas_logos-bot.png
-    └── saas_logos-small.png
+    ├── avatars-bot.png
+    ├── saas_logos-bot.png
+    └── saas_logos-small.png
 ├── manifest.json
 ├── metadata.csv
 ├── requirements-dev.txt
@@ -72,9 +72,9 @@ Answer the questions when prompted. Once done, you should end up with something
 ├── service_checks.json
 ├── setup.py
 ├── tests
-    ├── __init__.py
-    ├── conftest.py
-    └── test_check.py
+    ├── __init__.py
+    ├── conftest.py
+    └── test_check.py
 └── tox.ini
 ```
 
@@ -361,7 +361,7 @@ Our check sends a Service Check, so we need to add it to the `service_checks.jso
 ]
 ```
 
-Find below the description for each attributes—each one of them is mandatory—of your `service_checks.json` file:
+Find below the description for each attributes-each one of them is mandatory-of your `service_checks.json` file:
 
 | Attribute | Description |
 | ---- | ---- |

docs/dev/python.md (+2 -2)

@@ -19,7 +19,7 @@ Any recent version of macOS comes with Python pre-installed; however, it might b
 
 #### Option 1: Install Python with Homebrew
 
-[`Homebrew`][3] is a package manager for macOS that makes a lot easier installing software on macOS, specially from the command line. Follow the "Doing it Right" instructions in [the Hitchhiker’s Guide to Python][4].
+[`Homebrew`][3] is a package manager for macOS that makes a lot easier installing software on macOS, specially from the command line. Follow the "Doing it Right" instructions in [the Hitchhiker's Guide to Python][4].
 
 #### Option 2: Install Python with miniconda
 
@@ -41,7 +41,7 @@ Each integration has its own set of dependencies that must be added to Python in
 
 ### Virtualenv and Virtualenvwrapper
 
-We recommend using [Virtualenv][8] to manage Python virtual environments, and [virtualenvwrapper][9] to make the process smoother. There's a [comprehensive guide][10] in the Hitchhiker’s Guide to Python describing how to set up these two tools.
+We recommend using [Virtualenv][8] to manage Python virtual environments, and [virtualenvwrapper][9] to make the process smoother. There's a [comprehensive guide][10] in the Hitchhiker's Guide to Python describing how to set up these two tools.
 
 ### Miniconda

docs/proposals/checks_as_wheels.md (+18 -18)

@@ -12,13 +12,13 @@ coincide with the name of the module) and the path to the folder containing the
 source file. See https://github.com/DataDog/dd-agent/blob/5.14.x/config.py#L830.
 
 When checks were moved in a separate repo to implement the concept of
-“Integrations SDK”, to minimize the work on the Agent while still support this
+"Integrations SDK", to minimize the work on the Agent while still support this
 new way of distributing checks, the same logic was kept and now such Python
 modules are allowed to live in a different directory.
 
 ## Problem
 
-At the moment, the Agent package contains all the checks marked as “core” and
+At the moment, the Agent package contains all the checks marked as "core" and
 officially maintained by Datadog, available at https://github.com/DataDog/integrations-core.
 After being decoupled from the Agent, a check can now be installed as a separate
 package and picked up by the Agent at runtime, taking precedence over the one
@@ -33,15 +33,15 @@ installer).
 The current implementation of Integrations SDK exposes the Agent to a set of
 different problems, part of them are legacy and were already present in the
 Agent before moving out the checks but part of them were introduced in the
-process of implementing the SDK, let’s see few examples.
+process of implementing the SDK, let's see few examples.
 
 
 ### Building
 
 At this moment, if contributors want to build a patched version of a check to
 use within their infrastructure, they have to either replicate our own build
 pipeline or create their own. The latter is way simpler than the former but when
-choosing this path you’re basically alone, being able to reuse very little code
+choosing this path you're basically alone, being able to reuse very little code
 and tools from the agent codebase which is strongly focused on supporting our
 own build pipeline. This also affects the development process: since a check is
 not meant to live without an agent, the only way you have to test a modified
@@ -51,10 +51,10 @@ agent - both strategies carry on a long list of issues.
 ### Versioning
 
 Despite having separated the checks from the agent, the two codebases are still
-strongly coupled when it comes to versioning. Checks are supposed to be released in standalone mode, without a corresponding agent release, and in a manual fashion: we decide when a new release for a check is needed and we trigger the release process. This means we might have changes piling up on `master` between one release and another which is fine but doesn’t play well when an agent
+strongly coupled when it comes to versioning. Checks are supposed to be released in standalone mode, without a corresponding agent release, and in a manual fashion: we decide when a new release for a check is needed and we trigger the release process. This means we might have changes piling up on `master` between one release and another which is fine but doesn't play well when an agent
 release falls in the middle: when we release the agent, we embed all the checks
 we find on `master` in `integrations-core`, leaving the check in an inconsistent
-state: “not released as standalone but released with an agent”. A workaround to
+state: "not released as standalone but released with an agent". A workaround to
 this would be forcing a standalone release for all the checks in `integrations-core`
 when we release an agent, (a fix is in place starting with 5.15) but the
 process is time consuming and not straightforward.
@@ -70,7 +70,7 @@ dependencies but we implement the logic elsewhere.
 ### User experience: final user
 
 At the moment we are exposed to weird corner cases when it comes to the point of
-installing checks in standalone mode, let’s see an example:
+installing checks in standalone mode, let's see an example:
 
 * User installs agent 5.0.0 shipping ntp check 1.0.0
 * Agent 5.1.0 gets released, shipping ntp check 1.1.0
@@ -93,7 +93,7 @@ Most if not all the checks are not supposed to run outside the agent lifecycle,
 working on a brand new check or trying to change an existing one heavily rely on
 this, making things more complicated than they could be.
 
-A couple of related, minor issues it would be nice to fix: at the moment it’s not
+A couple of related, minor issues it would be nice to fix: at the moment it's not
 possible to split the code of a check across multiple Python modules, same
 happens for the tests.
 
@@ -125,9 +125,9 @@ The ability of the agent to progammatically import and run a single Python
 module from a well known path in the filesystem will be preserved, so that
 we don't break any custom check in the wild.
 
-Moving to wheels wouldn’t solve any possible problem we’re facing in building
+Moving to wheels wouldn't solve any possible problem we're facing in building
 and shipping `integrations-core` and `integrations-extra` but overall it might
-work better, let’s see few examples.
+work better, let's see few examples.
 
 #### Building
 
@@ -148,7 +148,7 @@ each agent release we would pick and include in the package the desired version
 of each check in integrations-core by invoking `pip install -r` on a special
 requirements.txt file that would be part of the agent repository. That file
 would likely contain the most recent version of each check wheel package but
-this wouldn’t be enforced, what it counts is that the requirements file would be
+this wouldn't be enforced, what it counts is that the requirements file would be
 the unique source of truth stating which checks are shipped with which agent.
 
 #### Dependencies
@@ -188,7 +188,7 @@ from pypi.org directly, without forcing contributors to install the full
 agent package on their laptop.
 
 #### Upgrade path
-_Note: this will affect the “end user” experience._
+_Note: this will affect the "end user" experience._
 
 ##### Recommended solution: nuke core dependencies, keep custom checks
 
@@ -202,15 +202,15 @@ Pros:
 pip.
 
 Cons:
-* Users wouldn’t be able to pin specific check versions (this could be a non
+* Users wouldn't be able to pin specific check versions (this could be a non
 requirement)
 
 Errors might still happen at runtime if a custom check relies on a dependency
-that was updated along with the agent upgrade (not sure there’s a fix for this).
+that was updated along with the agent upgrade (not sure there's a fix for this).
 
 ##### Alternative solution 1: nuke allthethings
 
-We consider the core checks as “part of the embedded Python standard library”.
+We consider the core checks as "part of the embedded Python standard library".
 At every Agent upgrade, the list of Python packages installed in the embedded
 interpreter would be reset.
 
@@ -220,7 +220,7 @@ Pros:
 
 Cons:
 * Custom checks implemented as wheels would be wiped and users should add them
-back (this doesn’t affect custom checks installed with the old method)
+back (this doesn't affect custom checks installed with the old method)
 * Pinned versions of core checks would be overwritten by newer ones (again, this
 might be a non requirement)
 
@@ -258,7 +258,7 @@ or even:
 #### User experience: developer
 
 Eventually the developer user experience would be really Pythonic, in the sense
-that working on a check wouldn’t be that different from working on any other
+that working on a check wouldn't be that different from working on any other
 python project: same tools, same concepts. Along with the new packaging it would
 come the opportunity to split the code across multiple files in the same package,
 easily run unit tests locally, the ability to build and install a custom version
@@ -278,7 +278,7 @@ package installed, then tests will work as the agent was there.
 
 This is not applicable to Agent6, since the `aggregator` Python module only
 exists in memory when the Go agent is running the embedded Python. We can
-provide a mocked `aggregator` but tests wouldn’t be reliable with the current
+provide a mocked `aggregator` but tests wouldn't be reliable with the current
 testing approach. The solution to this would be to slightly change our tests so
 that instead of simulating a complete collection cycle and see what arrives
 to the forwarder, we invoke the `check` method and look what arrives to the the

elastic/README.md (+2 -2)

@@ -37,7 +37,7 @@ instances:
 
 **Note**:
 
-* If you're collecting Elasticsearch metrics from just one Datadog Agent running outside the cluster—e.g. if you use a hosted Elasticsearch—set `cluster_stats` to true.
+* If you're collecting Elasticsearch metrics from just one Datadog Agent running outside the cluster - e.g. if you use a hosted Elasticsearch - set `cluster_stats` to true.
 
 * To use the Agent's Elasticsearch integration for the AWS Elasticsearch services, set the `url` parameter to point to your AWS Elasticsearch stats URL.
 
@@ -89,7 +89,7 @@ See [metadata.csv][6] for a list of metrics provided by this integration.
 
 ### Events
 
-The Elasticsearch check emits an event to Datadog each time the overall status of your Elasticsearch cluster changes—red, yellow, or green.
+The Elasticsearch check emits an event to Datadog each time the overall status of your Elasticsearch cluster changes - red, yellow, or green.
 
 ### Service checks
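A cleanup like this is easy to regress the next time the documentation is edited. One option (a hypothetical sketch, not tooling that exists in this repository) is a small guard that CI could run to assert that documentation files stay ASCII-only:

```python
# Hypothetical CI guard: report any file whose contents include bytes
# outside the ASCII range (> 127).
from pathlib import Path

def non_ascii_files(paths):
    """Return the subset of paths whose contents are not pure ASCII."""
    bad = []
    for path in paths:
        data = Path(path).read_bytes()
        if any(b > 127 for b in data):
            bad.append(str(path))
    return bad
```

A CI job could feed this the tracked `*.md` files and fail the build when the returned list is non-empty.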
