PR #3977
6 errors, 16 fail, 1 skipped, 335 pass in 28m 21s
Annotations
Check failure on line 0 in storm.tests.test_storm
github-actions / Test Results
test_integration_with_ci_cluster (storm.tests.test_storm) with error
test-results/Storm/test-unit-py3.12.xml [took 42s]
Raw output
failed on setup with "tenacity.RetryError: RetryError[<Future at 0x7f3739b72c60 state=finished raised SubprocessError>]"
../../../../.local/share/hatch/env/virtual/datadog-storm/LlOCFcZl/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:478: in __call__
    result = fn(*args, **kwargs)
../../../../.local/share/hatch/env/virtual/datadog-storm/LlOCFcZl/py3.12/lib/python3.12/site-packages/datadog_checks/dev/env.py:98: in set_up_with_retry
    return set_up()
../../../../.local/share/hatch/env/virtual/datadog-storm/LlOCFcZl/py3.12/lib/python3.12/site-packages/datadog_checks/dev/env.py:79: in set_up
    set_up_result = up()
../../../../.local/share/hatch/env/virtual/datadog-storm/LlOCFcZl/py3.12/lib/python3.12/site-packages/datadog_checks/dev/docker.py:253: in __call__
    return run_command(self.command, **args)
../../../../.local/share/hatch/env/virtual/datadog-storm/LlOCFcZl/py3.12/lib/python3.12/site-packages/datadog_checks/dev/subprocess.py:75: in run_command
    raise SubprocessError(
E   datadog_checks.dev.errors.SubprocessError: Command: ['docker', 'compose', '-f', '/home/runner/work/integrations-extras/integrations-extras/storm/tests/compose/docker-compose.yaml', 'up', '-d', '--force-recreate', '--build', 'topology-maker']
E   Exit code: 17
E   Captured Output:

The above exception was the direct cause of the following exception:

../../../../.local/share/hatch/env/virtual/datadog-storm/LlOCFcZl/py3.12/lib/python3.12/site-packages/flaky/flaky_pytest_plugin.py:146: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
tests/conftest.py:26: in dd_environment
    with docker_run(compose_file, build=True, service_name='topology-maker', sleep=15):
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/contextlib.py:137: in __enter__
    return next(self.gen)
../../../../.local/share/hatch/env/virtual/datadog-storm/LlOCFcZl/py3.12/lib/python3.12/site-packages/datadog_checks/dev/docker.py:220: in docker_run
    with environment_run(
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/contextlib.py:137: in __enter__
    return next(self.gen)
../../../../.local/share/hatch/env/virtual/datadog-storm/LlOCFcZl/py3.12/lib/python3.12/site-packages/datadog_checks/dev/env.py:110: in environment_run
    result = set_up_func()
../../../../.local/share/hatch/env/virtual/datadog-storm/LlOCFcZl/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:336: in wrapped_f
    return copy(f, *args, **kw)
../../../../.local/share/hatch/env/virtual/datadog-storm/LlOCFcZl/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:475: in __call__
    do = self.iter(retry_state=retry_state)
../../../../.local/share/hatch/env/virtual/datadog-storm/LlOCFcZl/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:376: in iter
    result = action(retry_state)
../../../../.local/share/hatch/env/virtual/datadog-storm/LlOCFcZl/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:419: in exc_check
    raise retry_exc from fut.exception()
E   tenacity.RetryError: RetryError[<Future at 0x7f3739b72c60 state=finished raised SubprocessError>]
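
The setup failure above boils down to docker compose exiting with code 17 while bringing up the topology-maker service; tenacity retried the setup and then surfaced the last SubprocessError as a RetryError. The captured output in the error is empty, so rerunning the same docker compose command locally is the quickest way to see why it exits 17. The frame at tests/conftest.py:26 shows the fixture driving this; a minimal sketch of that fixture, reconstructed from the traceback (the constant names, fixture scope, and the yielded instance config are assumptions, not the actual storm conftest):

import os

import pytest
from datadog_checks.dev import docker_run

HERE = os.path.dirname(os.path.abspath(__file__))
# Path assumed from the compose command in the SubprocessError above.
COMPOSE_FILE = os.path.join(HERE, 'compose', 'docker-compose.yaml')


@pytest.fixture(scope='session')
def dd_environment():
    # docker_run shells out to `docker compose ... up -d --force-recreate --build`
    # for the named service and raises SubprocessError on a non-zero exit code,
    # which is what tenacity wraps into the RetryError reported on this test.
    with docker_run(COMPOSE_FILE, build=True, service_name='topology-maker', sleep=15):
        yield {'url': 'http://localhost:9005'}  # hypothetical instance config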
Check warning on line 0 in zabbix.tests.test_integration
github-actions / Test Results
1 out of 2 runs failed: test_e2e (zabbix.tests.test_integration)
test-results/Zabbix (Community Version)/test-e2e-py3.12.xml [took 1s]
Raw output
ValueError: [s6-init] making user provided files available at /var/run/s6/etc...exited 0.
[s6-init] ensuring user provided files have correct perms...exited 0.
[fix-attrs.d] applying ownership & permissions fixes...
[fix-attrs.d] done.
[cont-init.d] executing container initialization scripts...
[cont-init.d] 01-check-apikey.sh: executing...
[cont-init.d] 01-check-apikey.sh: exited 0.
[cont-init.d] 50-ci.sh: executing...
[cont-init.d] 50-ci.sh: exited 0.
[cont-init.d] 50-ecs.sh: executing...
[cont-init.d] 50-ecs.sh: exited 0.
[cont-init.d] 50-eks.sh: executing...
[cont-init.d] 50-eks.sh: exited 0.
[cont-init.d] 50-kubernetes.sh: executing...
[cont-init.d] 50-kubernetes.sh: exited 0.
[cont-init.d] 50-mesos.sh: executing...
[cont-init.d] 50-mesos.sh: exited 0.
[cont-init.d] 51-docker.sh: executing...
[cont-init.d] 51-docker.sh: exited 0.
[cont-init.d] 59-defaults.sh: executing...
[cont-init.d] 59-defaults.sh: exited 0.
[cont-init.d] 60-network-check.sh: executing...
[cont-init.d] 60-network-check.sh: exited 0.
[cont-init.d] 60-sysprobe-check.sh: executing...
[cont-init.d] 60-sysprobe-check.sh: exited 0.
[cont-init.d] 89-copy-customfiles.sh: executing...
[cont-init.d] 89-copy-customfiles.sh: exited 0.
[cont-init.d] done.
[services.d] starting services
starting trace-agent
starting agent
starting security-agent
starting system-probe
starting process-agent
2024-12-20 00:03:03 UTC | TRACE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-20 00:03:03 UTC | TRACE | INFO | (pkg/util/log/log.go:840 in func1) | Loading proxy settings
2024-12-20 00:03:03 UTC | TRACE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to resolve secrets
2024-12-20 00:03:03 UTC | TRACE | WARN | (pkg/util/log/log.go:885 in func1) | Agent configuration relax permissions constraint on the secret backend cmd, Group can read and exec
2024-12-20 00:03:03 UTC | TRACE | INFO | (pkg/util/log/log.go:840 in func1) | Finished resolving secrets
2024-12-20 00:03:03 UTC | TRACE | INFO | (pkg/util/log/log.go:845 in func1) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-20 00:03:03 UTC | TRACE | INFO | (pkg/util/log/log.go:845 in func1) | 0 Features detected from environment:
2024-12-20 00:03:03 UTC | TRACE | INFO | (comp/trace/config/setup.go:75 in LoadConfigFile) | Loaded configuration: /etc/datadog-agent/datadog.yaml
2024-12-20 00:03:03 UTC | TRACE | INFO | (comp/trace/config/setup.go:353 in applyDatadogConfig) | Activating non-local traffic automatically in containerized environment, trace-agent will listen on 0.0.0.0
2024-12-20 00:03:03 UTC | TRACE | INFO | (comp/trace/agent/impl/agent.go:107 in NewAgent) | trace-agent not enabled. Set the environment variable
DD_APM_ENABLED=true or add "apm_config.enabled: true" entry
to your datadog.yaml. Exiting...
[services.d] done.
2024-12-20 00:03:03 UTC | SECURITY | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-20 00:03:03 UTC | SECURITY | INFO | (pkg/util/log/log.go:840 in func1) | Loading proxy settings
2024-12-20 00:03:03 UTC | SECURITY | INFO | (pkg/util/log/log.go:840 in func1) | Starting to resolve secrets
2024-12-20 00:03:03 UTC | SECURITY | WARN | (pkg/util/log/log.go:885 in func1) | Agent configuration relax permissions constraint on the secret backend cmd, Group can read and exec
2024-12-20 00:03:03 UTC | SECURITY | INFO | (pkg/util/log/log.go:840 in func1) | Finished resolving secrets
2024-12-20 00:03:03 UTC | SECURITY | INFO | (pkg/util/log/log.go:845 in func1) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-20 00:03:03 UTC | SECURITY | INFO | (pkg/util/log/log.go:845 in func1) | 0 Features detected from environment:
2024-12-20 00:03:03 UTC | SECURITY | INFO | (pkg/config/setup/config.go:2051 in LoadCustom) | Starting to load the configuration
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (pkg/config/setup/config.go:2051 in LoadCustom) | Starting to load the configuration
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (pkg/config/setup/config.go:1709 in LoadProxyFromEnv) | Loading proxy settings
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (pkg/config/env/environment_containers.go:242 in detectPodResources) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (pkg/config/env/environment_detection.go:124 in DetectFeatures) | 0 Features detected from environment:
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/impl/store.go:100 in start) | workloadmeta store initialized successfully
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (pkg/config/remote/client/client.go:400 in pollLoop) | retrying the first update of remote-config state (could not acquire agent auth token: unable to read authentication token file: open /etc/datadog-agent/auth_token: no such file or directory)
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:127 in Start) | remote workloadmeta initialized successfully
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/impl/store.go:554 in startCandidates) | workloadmeta collector "remote-workloadmeta" started successfully
2024-12-20 00:03:03 UTC | SYS-PROBE | WARN | (comp/core/workloadmeta/collectors/internal/remote/generic.go:154 in func1) | unable to establish entity stream between agents, will possibly retry: unable to fetch authentication token: unable to read authentication token file: open /etc/datadog-agent/auth_token: no such file or directory
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:199 in Start) | remote tagger initialized successfully
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (pkg/runtime/runtime.go:28 in func1) | runtime: set GOMAXPROCS to: 4
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:551 in func1) | unable to fetch auth token, will possibly retry: unable to read authentication token file: open /etc/datadog-agent/auth_token: no such file or directory
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (cmd/system-probe/subcommands/run/command.go:327 in startSystemProbe) | starting system-probe v7.62.0-devel+git.480.f5d99e9
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (cmd/system-probe/subcommands/run/command.go:439 in logUserAndGroupID) | current user id/name: 0/root
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (cmd/system-probe/subcommands/run/command.go:442 in logUserAndGroupID) | current group id/name: 0/root
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (cmd/system-probe/subcommands/run/command.go:332 in startSystemProbe) | system probe not enabled. exiting
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:845 in func1) | running on platform: ubuntu
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:845 in func1) | running version: 7.62.0-devel
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:840 in func1) | Loading proxy settings
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:840 in func1) | Starting to resolve secrets
2024-12-20 00:03:03 UTC | PROCESS | WARN | (pkg/util/log/log.go:885 in func1) | Agent configuration relax permissions constraint on the secret backend cmd, Group can read and exec
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:840 in func1) | Finished resolving secrets
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:845 in func1) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:845 in func1) | 0 Features detected from environment:
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/config/setup/config.go:2051 in LoadCustom) | Starting to load the configuration
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/util/log/log.go:840 in func1) | Loading proxy settings
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/rdnsquerier/impl/rdnsquerier.go:83 in NewComponent) | Reverse DNS Enrichment config: (enabled=false workers=0 chan_size=0 cache.enabled=true cache.entry_ttl=0 cache.clean_interval=0 cache.persist_interval=0 cache.max_retries=-1 cache.max_size=0 rate_limiter.enabled=true rate_limiter.limit_per_sec=0 rate_limiter.limit_throttled_per_sec=0 rate_limiter.throttle_error_threshold=0 rate_limiter.recovery_intervals=0 rate_limiter.recovery_interval=0)
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to resolve secrets
2024-12-20 00:03:03 UTC | CORE | WARN | (pkg/util/log/log.go:885 in func1) | Agent configuration relax permissions constraint on the secret backend cmd, Group can read and exec
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/util/log/log.go:840 in func1) | Finished resolving secrets
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/util/log/log.go:845 in func1) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/util/log/log.go:845 in func1) | 0 Features detected from environment:
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/runtime/runtime.go:28 in func1) | runtime: set GOMAXPROCS to: 4
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/process/metadata/workloadmeta/extractor.go:84 in NewWorkloadMetaExtractor) | Instantiating a new WorkloadMetaExtractor
2024-12-20 00:03:03 UTC | CORE | INFO | (comp/core/tagger/impl/tagger.go:124 in NewComponent) | TaggerClient is created, defaultTagger type: *taggerimpl.localTagger
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/util/containers/metrics/system/collector_linux.go:87 in newSystemCollector) | Unable to initialize cgroup provider (cgroups not mounted?), err: unable to detect cgroup version from detected mount points: map[]
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/config/utils/endpoints.go:35 in getResolvedDDUrl) | 'site' and 'dd_url' are both set in config: setting main endpoint to 'dd_url': "https://app.datadoghq.com"
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/config/utils/endpoints.go:35 in getResolvedDDUrl) | 'site' and 'dd_url' are both set in config: setting main endpoint to 'dd_url': "https://app.datadoghq.com"
2024-12-20 00:03:03 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:03:03 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:232 in getDogStatsDWorkerAndPipelineCount) | Dogstatsd workers and pipelines count: 2 workers, 1 pipelines
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:150 in GetDogStatsDWorkerAndPipelineCount) | Dogstatsd configured to run with 2 workers and 1 pipelines
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/aggregator/time_sampler.go:54 in NewTimeSampler) | Creating TimeSampler #0
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/config/setup/config.go:2051 in LoadCustom) | Starting to load the configuration
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/api/security/security.go:142 in fetchAuthToken) | [/omnibus/src/datadog-agent/src/github.com/DataDog/datadog-agent/pkg/api/util/util.go:107] Creating a new authentication token
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/api/security/security.go:241 in saveAuthToken) | Saving a new authentication token in /etc/datadog-agent/auth_token
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/core/tagger/impl/tagger.go:124 in NewComponent) | TaggerClient is created, defaultTagger type: *taggerimpl.localTagger
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/util/containers/metrics/system/collector_linux.go:87 in newSystemCollector) | Unable to initialize cgroup provider (cgroups not mounted?), err: unable to detect cgroup version from detected mount points: map[]
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/process/metadata/workloadmeta/extractor.go:84 in NewWorkloadMetaExtractor) | Instantiating a new WorkloadMetaExtractor
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/api/security/security.go:256 in saveAuthToken) | Wrote auth token in /etc/datadog-agent/auth_token
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/api/security/security.go:155 in fetchAuthToken) | Saved a new authentication token to /etc/datadog-agent/auth_token
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/api/security/cert/cert_getter.go:59 in fetchAgentIPCCert) | [/omnibus/src/datadog-agent/src/github.com/DataDog/datadog-agent/pkg/api/util/util.go:111] Creating a new IPC certificate
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/api/security/cert/cert_getter.go:107 in saveIPCCertKey) | Saving a new IPC certificate/key pair in /etc/datadog-agent/ipc_cert.pem
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/api/security/cert/cert_getter.go:126 in saveIPCCertKey) | Wrote IPC certificate/key pair in /etc/datadog-agent/ipc_cert.pem
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/api/security/cert/cert_getter.go:73 in fetchAgentIPCCert) | Saved a new IPC certificate/key pair to /etc/datadog-agent/ipc_cert.pem
2024-12-20 00:03:03 UTC | CORE | INFO | (comp/logs/agent/agentimpl/agent.go:176 in newLogsAgent) | logs-agent disabled
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:03:03 UTC | PROCESS | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/process/agent/agent_linux.go:61 in enabledHelper) | Process/Container Collection in the Process Agent will be deprecated in a future release and will instead be run in the Core Agent. Set process_config.run_in_core_agent.enabled to true to switch now.
2024-12-20 00:03:03 UTC | PROCESS | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-20 00:03:03 UTC | PROCESS | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-20 00:03:03 UTC | CORE | INFO | (comp/rdnsquerier/impl/rdnsquerier.go:83 in NewComponent) | Reverse DNS Enrichment config: (enabled=false workers=0 chan_size=0 cache.enabled=true cache.entry_ttl=0 cache.clean_interval=0 cache.persist_interval=0 cache.max_retries=-1 cache.max_size=0 rate_limiter.enabled=true rate_limiter.limit_per_sec=0 rate_limiter.limit_throttled_per_sec=0 rate_limiter.throttle_error_threshold=0 rate_limiter.recovery_intervals=0 rate_limiter.recovery_interval=0)
2024-12-20 00:03:03 UTC | CORE | INFO | (comp/netflow/server/server.go:63 in newServer) | Reverse DNS Enrichment is disabled for NDM NetFlow
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/collector/python/init.go:337 in resolvePythonExecPath) | Using '/opt/datadog-agent/embedded' as Python home
2024-12-20 00:03:03 UTC | CORE | INFO | (pkg/collector/python/init.go:403 in Initialize) | Initializing rtloader with Python 3 /opt/datadog-agent/embedded
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/process/apiserver/apiserver.go:53 in newApiServer) | API server listening on localhost:6162
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:100 in start) | workloadmeta store initialized successfully
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "kube_metadata" could not start. error: component workloadmeta-kube_metadata is disabled: Agent is not running on Kubernetes
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "docker" could not start. error: component workloadmeta-docker is disabled: Agent is not running on Docker
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "cloudfoundry-container" could not start. error: component workloadmeta-cloudfoundry-container is disabled: Agent is not running on CloudFoundry
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "crio" could not start. error: component workloadmeta-crio is disabled: Crio not detected
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "ecs_fargate" could not start. error: component workloadmeta-ecs_fargate is disabled: Agent is not running on ECS Fargate
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "ecs" could not start. error: component workloadmeta-ecs is disabled: Agent is not running on ECS EC2
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "containerd" could not start. error: component workloadmeta-containerd is disabled: Agent is not running on containerd
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "cloudfoundry-vm" could not start. error: component workloadmeta-cloudfoundry-vm is disabled: Agent is not running on CloudFoundry
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "podman" could not start. error: component workloadmeta-podman is disabled: Podman not detected
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "kubelet" could not start. error: component workloadmeta-kubelet is disabled: Agent is not running on Kubernetes
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process-events.datadoghq.com" (1 api key(s))
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-20 00:03:03 UTC | PROCESS | INFO | (comp/core/tagger/collectors/workloadmeta_main.go:153 in stream) | workloadmeta tagger collector started
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/process/runner/submitter.go:182 in printStartMessage) | Starting CheckSubmitter for host=fv-az915-747, endpoints=[https://process.datadoghq.com], events endpoints=[https://process-events.datadoghq.com]
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/process/runner/runner.go:286 in Run) | Starting process-agent with enabled checks=[process_discovery]
2024-12-20 00:03:03 UTC | PROCESS | INFO | (pkg/process/runner/runner.go:243 in logCheckDuration) | Finished process_discovery check #1 in 42.58111ms
2024-12-20 00:03:03 UTC | TRACE | INFO | (comp/core/tagger/impl-remote/remote.go:119 in func1) | remote tagger initialized successfully
2024-12-20 00:03:03 UTC | TRACE | INFO | ([email protected]+incompatible/retry.go:37 in RetryNotify) | unable to establish stream, will possibly retry: rpc error: code = Canceled desc = received context error while waiting for new LB policy update: context canceled
2024-12-20 00:03:03 UTC | TRACE | INFO | (comp/core/tagger/impl-remote/remote.go:122 in func2) | remote tagger stopped successfully
trace-agent exited with code 0, disabling
2024-12-20 00:03:03 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:572 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :46741: connect: connection refused"
2024-12-20 00:03:03 UTC | PROCESS | ERROR | (comp/forwarder/defaultforwarder/transaction/transaction.go:433 in internalProcess) | API Key invalid, dropping transaction for https://process.datadoghq.com/api/v1/discovery
2024-12-20 00:03:03 UTC | PROCESS | ERROR | (pkg/process/runner/runner.go:501 in readResponseStatuses) | [process_discovery] Invalid response from https://process.datadoghq.com: 403 -> <nil>
2024-12-20 00:03:03 UTC | PROCESS | ERROR | (comp/forwarder/defaultforwarder/transaction/transaction.go:433 in internalProcess) | API Key invalid, dropping transaction for https://process.datadoghq.com/api/v1/discovery
2024-12-20 00:03:03 UTC | PROCESS | ERROR | (pkg/process/runner/runner.go:501 in readResponseStatuses) | [process_discovery] Invalid response from https://process.datadoghq.com: 403 -> <nil>
2024-12-20 00:03:04 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:171 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :46741: connect: connection refused"
2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/python/datadog_agent.go:148 in LogMessage) | - | (ddyaml.py:142) | monkey patching yaml.load...
2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/python/datadog_agent.go:148 in LogMessage) | - | (ddyaml.py:146) | monkey patching yaml.load_all...
2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/python/datadog_agent.go:148 in LogMessage) | - | (ddyaml.py:150) | monkey patching yaml.dump_all... (affects all yaml dump operations)
2024-12-20 00:03:04 UTC | SYS-PROBE | INFO | (pkg/config/remote/client/client.go:434 in pollLoop) | retrying the first update of remote-config state (rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp [::1]:46741: connect: connection refused")
2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/embed_python.go:22 in InitPython) | Embedding Python 3.12.6 (main, Dec 19 2024, 08:06:36) [GCC 11.4.0]
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:03:04 UTC | CORE | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-20 00:03:04 UTC | CORE | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/config/remote/service/util.go:110 in func1) | Missing meta bucket
2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/config/remote/service/util.go:131 in openCacheDB) | Different agent version or API Key detected
2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/config/remote/service/util.go:48 in recreate) | Clear remote configuration database
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/gui/guiimpl/gui.go:109 in newGui) | GUI server port -1 specified: not starting the GUI.
2024-12-20 00:03:04 UTC | CORE | WARN | (pkg/config/model/viper.go:264 in checkKnownKey) | config key runtime_security_config.sbom.enabled is unknown
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:100 in start) | workloadmeta store initialized successfully
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/autodiscovery/listeners/types.go:84 in Register) | Service listener factory cloudfoundry_bbs does not exist.
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/autodiscovery/listeners/types.go:84 in Register) | Service listener factory kube_endpoints does not exist.
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/autodiscovery/listeners/types.go:84 in Register) | Service listener factory kube_services does not exist.
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/autodiscovery/providers/config_reader.go:172 in read) | Searching for configuration files at: /etc/datadog-agent/conf.d
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/tagger/collectors/workloadmeta_main.go:153 in stream) | workloadmeta tagger collector started
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "kubelet" could not start. error: component workloadmeta-kubelet is disabled: Agent is not running on Kubernetes
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "crio" could not start. error: component workloadmeta-crio is disabled: Crio not detected
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "docker" could not start. error: component workloadmeta-docker is disabled: Agent is not running on Docker
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "cloudfoundry-container" could not start. error: component workloadmeta-cloudfoundry-container is disabled: Agent is not running on CloudFoundry
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "cloudfoundry-vm" could not start. error: component workloadmeta-cloudfoundry-vm is disabled: Agent is not running on CloudFoundry
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "containerd" could not start. error: component workloadmeta-containerd is disabled: Agent is not running on containerd
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "ecs" could not start. error: component workloadmeta-ecs is disabled: Agent is not running on ECS EC2
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "local-process-collector" could not start. error: component workloadmeta-process is disabled: language detection or core agent process collection is disabled
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "kube_metadata" could not start. error: component workloadmeta-kube_metadata is disabled: Agent is not running on Kubernetes
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "process-collector" could not start. error: collector process-collector is not enabled
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "podman" could not start. error: component workloadmeta-podman is disabled: Podman not detected
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "ecs_fargate" could not start. error: component workloadmeta-ecs_fargate is disabled: Agent is not running on ECS Fargate
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/autodiscovery/providers/config_reader.go:172 in read) | Searching for configuration files at: /opt/datadog-agent/bin/agent/dist/conf.d
2024-12-20 00:03:04 UTC | CORE | WARN | (comp/core/autodiscovery/providers/config_reader.go:176 in read) | Skipping, open /opt/datadog-agent/bin/agent/dist/conf.d: no such file or directory
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/autodiscovery/providers/config_reader.go:172 in read) | Searching for configuration files at:
2024-12-20 00:03:04 UTC | CORE | WARN | (comp/core/autodiscovery/providers/config_reader.go:176 in read) | Skipping, open : no such file or directory
2024-12-20 00:03:04 UTC | CORE | ERROR | (pkg/config/autodiscovery/autodiscovery.go:81 in DiscoverComponentsFromConfig) | Error unmarshalling snmp listener config. Error: no config given for snmp_listener
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/autodiscovery/autodiscoveryimpl/autoconfig.go:529 in initListenerCandidates) | environment listener successfully started
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/autodiscovery/autodiscoveryimpl/autoconfig.go:529 in initListenerCandidates) | static config listener successfully started
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://7-62-0-app.agent.datadoghq.com" (1 api key(s))
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://orchestrator.datadoghq.com" (1 api key(s))
2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/runner/runner.go:100 in ensureMinWorkers) | Runner 1 added 4 workers (total: 4)
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process-events.datadoghq.com" (1 api key(s))
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-20 00:03:04 UTC | CORE | INFO | (comp/dogstatsd/listener…
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/dogstatsd/listeners/uds_datagram.go:65 in NewUDSDatagramListener) | dogstatsd-uds: /var/run/datadog/dsd.socket successfully initialized
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:232 in getDogStatsDWorkerAndPipelineCount) | Dogstatsd workers and pipelines count: 2 workers, 1 pipelines
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:150 in GetDogStatsDWorkerAndPipelineCount) | Dogstatsd configured to run with 2 workers and 1 pipelines
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:232 in getDogStatsDWorkerAndPipelineCount) | Dogstatsd workers and pipelines count: 2 workers, 1 pipelines
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:150 in GetDogStatsDWorkerAndPipelineCount) | Dogstatsd configured to run with 2 workers and 1 pipelines
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/dogstatsd/listeners/uds_datagram.go:79 in listen) | dogstatsd-uds: starting to listen on /var/run/datadog/dsd.socket
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/dogstatsd/listeners/udp.go:128 in listen) | dogstatsd-udp: starting to listen on 127.0.0.1:8125
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:232 in getDogStatsDWorkerAndPipelineCount) | Dogstatsd workers and pipelines count: 2 workers, 1 pipelines
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:150 in GetDogStatsDWorkerAndPipelineCount) | Dogstatsd configured to run with 2 workers and 1 pipelines
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/remote-config/rcservice/rcserviceimpl/rcservice.go:115 in func1) | remote config service started
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/agenttelemetry/impl/agenttelemetry.go:521 in start) | Starting agent telemetry for 2 schedules and 4 profiles
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/config/remote/client/client.go:400 in pollLoop) | retrying the first update of remote-config state (rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp [::1]:46741: connect: connection refused")
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/api/api/apiimpl/server.go:31 in startServer) | Started HTTP server 'CMD API Server' on 127.0.0.1:46741
E 2024-12-20 00:03:04 UTC | CORE | INFO | (cmd/agent/subcommands/run/command.go:512 in startAgent) | Starting Datadog Agent v7.62.0-devel+git.480.f5d99e9
E 2024-12-20 00:03:04 UTC | CORE | INFO | (cmd/agent/subcommands/run/command.go:538 in startAgent) | Hostname is: fv-az915-747
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/util/installinfo/install_info.go:94 in logVersionHistoryToFile) | Cannot read file: /opt/datadog-agent/run/version-history.json, will create a new one. open /opt/datadog-agent/run/version-history.json: no such file or directory
E 2024-12-20 00:03:04 UTC | CORE | WARN | (pkg/collector/python/check_context.go:54 in initializeCheckContext) | Log receiver not provided. Logs from integrations will not be collected.
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/autodiscovery/autodiscoveryimpl/config_poller.go:170 in collectOnce) | file provider: collected 65 new configurations, removed 0
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/core/autodiscovery/autodiscoveryimpl/autoconfig.go:409 in LoadAndRun) | Started config provider "file"
E 2024-12-20 00:03:04 UTC | CORE | ERROR | (pkg/collector/scheduler.go:213 in getChecks) | Unable to load a check from instance of config 'zabbix': Python Check Loader: unable to import module 'zabbix'; Core Check Loader: Check zabbix not found in Catalog
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check container_image with an interval of 15s
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check container_lifecycle with an interval of 15s
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check cpu with an interval of 15s
E 2024-12-20 00:03:04 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:572 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :46741: connect: connection refused"
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check disk:67cc0574430a16ba with an interval of 15s
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check file_handle with an interval of 15s
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check io with an interval of 15s
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check load with an interval of 15s
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check memory with an interval of 15s
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check network:4b0649b7e11f0772 with an interval of 15s
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/util/cloudproviders/cloudproviders.go:89 in GetCloudProviderNTPHosts) | Detected Azure cloud provider environment with NTP server(s) at ["time.windows.com"]
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/corechecks/net/ntp/ntp.go:131 in parse) | Using NTP servers: [ time.windows.com ]
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check ntp:3c427a42a70bbf8 with an interval of 15m0s
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check service_discovery with an interval of 1m0s
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check telemetry with an interval of 15s
E 2024-12-20 00:03:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check uptime with an interval of 15s
E 2024-12-20 00:03:04 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:171 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :46741: connect: connection refused"
E 2024-12-20 00:03:04 UTC | CORE | WARN | (comp/forwarder/defaultforwarder/forwarder_health.go:297 in checkValidAPIKey) | api_key '***************************aaaaa' for domain https://api.datadoghq.com is invalid
E 2024-12-20 00:03:04 UTC | CORE | ERROR | (comp/forwarder/defaultforwarder/forwarder_health.go:148 in healthCheckLoop) | No valid api key found, reporting the forwarder as unhealthy.
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/transaction/transaction.go:454 in internalProcess) | Successfully posted payload to "https://7-62-0-app.agent.datadoghq.com/intake/" (202 Accepted), the agent will only log transaction success every 500 transactions
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/metadata/host/hostimpl/hosttags/tags.go:165 in Get) | Unable to get host tags from source: gce - using cached host tags
E 2024-12-20 00:03:04 UTC | CORE | INFO | (comp/metadata/host/hostimpl/utils/host.go:105 in getNetworkMeta) | could not get network metadata: could not detect network ID
E 2024-12-20 00:03:05 UTC | SYS-PROBE | INFO | (pkg/config/remote/client/client.go:434 in pollLoop) | retrying the first update of remote-config state (rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp [::1]:46741: connect: connection refused")
E 2024-12-20 00:03:05 UTC | CORE | INFO | (pkg/util/containers/metrics/provider/registry.go:102 in collectorDiscovery) | Container metrics provider discovery process finished
E 2024-12-20 00:03:05 UTC | PROCESS | INFO | (pkg/util/containers/metrics/provider/registry.go:102 in collectorDiscovery) | Container metrics provider discovery process finished
E 2024-12-20 00:03:05 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:572 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :46741: connect: connection refused"
E 2024-12-20 00:03:05 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:40 in CheckStarted) | check:container_image | Running check...
E 2024-12-20 00:03:05 UTC | CORE | INFO | (pkg/collector/corechecks/containerimage/check.go:136 in Run) | Starting long-running check "container_image"
E 2024-12-20 00:03:05 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:59 in CheckFinished) | check:container_image | Done running check
E 2024-12-20 00:03:05 UTC | SECURITY | INFO | (pkg/security/utils/hostname.go:97 in GetHostnameWithContextAndFallback) | Hostname is: fv-az915-747
E 2024-12-20 00:03:05 UTC | SECURITY | INFO | (subcommands/runtime/command.go:704 in StartRuntimeSecurity) | Datadog runtime security agent disabled by config
E 2024-12-20 00:03:05 UTC | SECURITY | INFO | (pkg/security/utils/hostname.go:97 in GetHostnameWithContextAndFallback) | Hostname is: fv-az915-747
E 2024-12-20 00:03:05 UTC | SECURITY | INFO | (comp/core/workloadmeta/impl/store.go:100 in start) | workloadmeta store initialized successfully
E 2024-12-20 00:03:05 UTC | SECURITY | INFO | (subcommands/start/command.go:267 in RunAgent) | All security-agent components are deactivated, exiting
E 2024-12-20 00:03:05 UTC | SECURITY | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:127 in Start) | remote workloadmeta initialized successfully
E 2024-12-20 00:03:05 UTC | SECURITY | INFO | (comp/core/workloadmeta/impl/store.go:554 in startCandidates) | workloadmeta collector "remote-workloadmeta" started successfully
E 2024-12-20 00:03:05 UTC | SECURITY | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:175 in func1) | workloadmeta stream established successfully
E 2024-12-20 00:03:05 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:40 in CheckStarted) | check:ntp | Running check...
E 2024-12-20 00:03:05 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:40 in CheckStarted) | check:service_discovery | Running check...
E 2024-12-20 00:03:05 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:59 in CheckFinished) | check:service_discovery | Done running check
E 2024-12-20 00:03:05 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:59 in CheckFinished) | check:ntp | Done running check
E grep: /etc/datadog-agent/system-probe.yaml: No such file or directory
E grep: /etc/datadog-agent/system-probe.yaml: No such file or directory
E
E
E Error: could not load zabbix:
E * Python Check Loader: unable to import module 'zabbix'
E * Core Check Loader: Check zabbix not found in Catalog
E ╭──────────────────── Traceback (most recent call last) ─────────────────────╮
E │ /opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/ddev/cli/__init__.py:165 in main
E │
E │   162       manager.hook.register_commands()
E │   163
E │   164       try:
E │ ❱ 165           return ddev(prog_name='ddev', windows_expand_args=False)
E │   166       except Exception:
E │   167           from rich.console import Console
E │   168
E │
E │ /opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/core.py:1157 in __call__
E │
E │ /opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/core.py:1078 in main
E │
E │ /opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/core.py:1688 in invoke
E │
E │ /opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/core.py:1688 in invoke
E │
E │ /opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/core.py:1434 in invoke
E │
E │ /opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/core.py:783 in invoke
E │
E │ /opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/decorators.py:45 in new_func
E │
E │ /opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/ddev/cli/env/agent.py:78 in agent
E │
E │    75           env_data.config_file.replace(temp_config_file)
E │    76           try:
E │    77               env_data.write_config(config)
E │ ❱  78               agent.invoke(full_args)
E │    79           finally:
E │    80               temp_config_file.replace(env_data.config_file)
E │    81
E │
E │ /opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/ddev/e2e/agent/docker.py:301 in invoke
E │
E │   298               )
E │   299
E │   300       def invoke(self, args: list[str]) -> None:
E │ ❱ 301           self.run_command(['agent', *args])
E │   302
E │   303       def run_command(self, args: list[str]) -> None:
E │   304           self._run_command(self._format_command([*args]), check=True)
E │
E │ /opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/ddev/e2e/agent/docker.py:304 in run_command
E │
E │   301           self.run_command(['agent', *args])
E │   302
E │   303       def run_command(self, args: list[str]) -> None:
E │ ❱ 304           self._run_command(self._format_command([*args]), check=True)
E │   305
E │   306       def enter_shell(self) -> None:
E │   307           self._run_command(self._format_command(['cmd' if self._is_wind
E │
E │ /opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/ddev/e2e/agent/docker.py:100 in _run_command
E │
E │    97
E │    98       def _run_command(self, command: list[str], **kwargs) -> subprocess
E │    99           with EnvVars({'DOCKER_CLI_HINTS': 'false'}):
E │ ❱ 100               return self.platform.run_command(command, **kwargs)
E │   101
E │   102       def _show_logs(self) -> None:
E │   103           self._run_command(['docker', 'logs', self._container_name])
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2;33m/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/ddev/uti#x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2;33mls/#x1B[0m#x1B[1;33mplatform.py#x1B[0m:#x1B[94m87#x1B[0m in #x1B[92mrun_command#x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2m 84 #x1B[0m#x1B[2m│ │ #x1B[0m#x1B[94mif#x1B[0m #x1B[96mself#x1B[0m.displaying_status #x1B[95mand#x1B[0m #x1B[95mnot#x1B[0m kwargs.get(#x1B[33m'#x1B[0m#x1B[33mcapture_output#x1B[0m#x1B[33m'#x1B[0m) #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2m 85 #x1B[0m#x1B[2m│ │ │ #x1B[0m#x1B[94mreturn#x1B[0m #x1B[96mself#x1B[0m._run_command_integrated(command, shell=shell, #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2m 86 #x1B[0m#x1B[2m│ │ #x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[31m❱ #x1B[0m 87 #x1B[2m│ │ #x1B[0m#x1B[94mreturn#x1B[0m #x1B[1;4;96mself#x1B[0m#x1B[1;4m.modules.subprocess.run(#x1B[0m#x1B[1;4;96mself#x1B[0m#x1B[1;4m.format_for_subprocess(#x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2m 88 #x1B[0m#x1B[2m│ #x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2m 89 #x1B[0m#x1B[2m│ #x1B[0m#x1B[94mdef#x1B[0m #x1B[92mcheck_command#x1B[0m(#x1B[96mself#x1B[0m, command: #x1B[96mstr#x1B[0m | #x1B[96mlist#x1B[0m[#x1B[96mstr#x1B[0m], shell=#x1B[94mFalse#x1B[0m, **k #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2m 90 #x1B[0m#x1B[2;90m│ │ #x1B[0m#x1B[33m"""#x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2;33m/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/#x1B[0m#x1B[1;33msubprocess.py#x1B[0m:#x1B[94m571#x1B[0m in #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[92mrun#x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2m 568 #x1B[0m#x1B[2m│ │ │ #x1B[0m#x1B[94mraise#x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2m 569 #x1B[0m#x1B[2m│ │ #x1B[0mretcode = process.poll() #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2m 570 #x1B[0m#x1B[2m│ │ #x1B[0m#x1B[94mif#x1B[0m check #x1B[95mand#x1B[0m retcode: #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[31m❱ #x1B[0m 571 #x1B[2m│ │ │ #x1B[0m#x1B[1;4;94mraise#x1B[0m#x1B[1;4m CalledProcessError(retcode, process.args,#x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2m 572 #x1B[0m#x1B[1;2;4m│ │ │ │ │ │ │ │ │ #x1B[0m#x1B[1;4moutput=stdout, stderr=stderr)#x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2m 573 #x1B[0m#x1B[2m│ #x1B[0m#x1B[94mreturn#x1B[0m CompletedProcess(process.args, retcode, stdout, stderr) #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m│#x1B[0m #x1B[2m 574 #x1B[0m #x1B[31m│#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[31m╰──────────────────────────────────────────────────────────────────────────────╯#x1B[0m#x1B[0m
#x1B[1m#x1B[31mE #x1B[1;91mCalledProcessError: #x1B[0mCommand #x1B[32m'#x1B[0m#x1B[32m[#x1B[0m#x1B[32m'#x1B[0mdocker', #x1B[32m'exec'#x1B[0m, #x1B[32m'dd_zabbix_py3.12'#x1B[0m, #x1B[32m'agent'#x1B[0m, #x1B[0m
#x1B[1m#x1B[31mE #x1B[32m'check'#x1B[0m, #x1B[32m'zabbix'#x1B[0m, #x1B[32m'--json'#x1B[0m#x1B[1m]#x1B[0m' returned non-zero exit status #x1B[1;36m255#x1B[0m.#x1B[0m
#x1B[1m#x1B[31mE Error: no valid check found#x1B[0m
#x1B[1m#x1B[31mE #x1B[0m
#x1B[1m#x1B[31mE Could not find valid check output#x1B[0m
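Exit status 255 together with "no valid check found" means the Agent inside the dd_zabbix_py3.12 container never loaded the zabbix check, so there was no JSON output for ddev to parse. A minimal reproduction sketch; only the docker exec command itself comes verbatim from the CalledProcessError above, the rest is illustrative:

    # Re-run the command that ddev executed, capturing output so the Agent's
    # "no valid check found" message on stderr can be inspected directly.
    import subprocess

    result = subprocess.run(
        ["docker", "exec", "dd_zabbix_py3.12", "agent", "check", "zabbix", "--json"],
        capture_output=True,
        text=True,
    )
    print(result.returncode)  # 255 in the failed run above
    print(result.stdout)
    print(result.stderr)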
Check warning on line 0 in neutrona.tests.test_neutrona
github-actions / Test Results
test_metrics (neutrona.tests.test_neutrona) failed
test-results/Neutrona/test-unit-py3.12.xml [took 0s]
Raw output
FileNotFoundError: [Errno 2] No such file or directory: 'docker-compose'
tests/test_neutrona.py:38: in test_metrics
    subprocess.check_call(args + ["up", "-d"])
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/subprocess.py:408: in check_call
    retcode = call(*popenargs, **kwargs)
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/subprocess.py:389: in call
    with Popen(*popenargs, **kwargs) as p:
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/subprocess.py:1026: in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/subprocess.py:1955: in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
E   FileNotFoundError: [Errno 2] No such file or directory: 'docker-compose'
Check warning on line 0 in neutrona.tests.test_neutrona_demo
github-actions / Test Results
test_metrics (neutrona.tests.test_neutrona_demo) failed
test-results/Neutrona/test-unit-py3.12.xml [took 0s]
Raw output
FileNotFoundError: [Errno 2] No such file or directory: 'docker-compose'
tests/test_neutrona_demo.py:38: in test_metrics
    subprocess.check_call(args + ["up", "-d"])
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/subprocess.py:408: in check_call
    retcode = call(*popenargs, **kwargs)
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/subprocess.py:389: in call
    with Popen(*popenargs, **kwargs) as p:
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/subprocess.py:1026: in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/subprocess.py:1955: in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
E   FileNotFoundError: [Errno 2] No such file or directory: 'docker-compose'
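Both neutrona failures share one root cause: test_metrics shells out to the legacy standalone docker-compose v1 binary, which current GitHub-hosted runners no longer ship (Compose v2 is provided as the docker compose CLI plugin). A hedged sketch of the kind of fallback the tests could use, assuming they build the invocation as a plain argument list the way the traceback suggests; the compose file path below is illustrative:

    # Prefer the Compose v2 plugin and fall back to the legacy binary if that
    # is all the host has; fail early and clearly when neither is available.
    import shutil
    import subprocess

    def compose_base() -> list[str]:
        if shutil.which("docker"):
            return ["docker", "compose"]   # Compose v2 ships as a docker CLI plugin
        if shutil.which("docker-compose"):
            return ["docker-compose"]      # legacy standalone v1 binary
        raise RuntimeError("no Docker Compose executable found on PATH")

    args = compose_base() + ["-f", "docker-compose.yaml"]  # illustrative path
    subprocess.check_call(args + ["up", "-d"])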
Check warning on line 0 in go_pprof_scraper.tests.test_go_pprof_scraper
github-actions / Test Results
1 out of 2 runs failed: test_e2e (go_pprof_scraper.tests.test_go_pprof_scraper)
test-results/go_pprof_scraper/test-e2e-py3.12.xml [took 1s]
Raw output
ValueError: [s6-init] making user provided files available at /var/run/s6/etc...exited 0.
[s6-init] ensuring user provided files have correct perms...exited 0.
[fix-attrs.d] applying ownership & permissions fixes...
[fix-attrs.d] done.
[cont-init.d] executing container initialization scripts...
[cont-init.d] 01-check-apikey.sh: executing...
[cont-init.d] 01-check-apikey.sh: exited 0.
[cont-init.d] 50-ci.sh: executing...
[cont-init.d] 50-ci.sh: exited 0.
[cont-init.d] 50-ecs.sh: executing...
[cont-init.d] 50-ecs.sh: exited 0.
[cont-init.d] 50-eks.sh: executing...
[cont-init.d] 50-eks.sh: exited 0.
[cont-init.d] 50-kubernetes.sh: executing...
[cont-init.d] 50-kubernetes.sh: exited 0.
[cont-init.d] 50-mesos.sh: executing...
[cont-init.d] 50-mesos.sh: exited 0.
[cont-init.d] 51-docker.sh: executing...
[cont-init.d] 51-docker.sh: exited 0.
[cont-init.d] 59-defaults.sh: executing...
[cont-init.d] 59-defaults.sh: exited 0.
[cont-init.d] 60-network-check.sh: executing...
[cont-init.d] 60-network-check.sh: exited 0.
[cont-init.d] 60-sysprobe-check.sh: executing...
[cont-init.d] 60-sysprobe-check.sh: exited 0.
[cont-init.d] 89-copy-customfiles.sh: executing...
[cont-init.d] 89-copy-customfiles.sh: exited 0.
[cont-init.d] done.
[services.d] starting services
starting trace-agent
starting system-probe
starting security-agent
starting agent
starting process-agent
2024-12-19 23:58:41 UTC | TRACE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-19 23:58:41 UTC | TRACE | INFO | (pkg/util/log/log.go:840 in func1) | Loading proxy settings
2024-12-19 23:58:41 UTC | TRACE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to resolve secrets
2024-12-19 23:58:41 UTC | TRACE | WARN | (pkg/util/log/log.go:885 in func1) | Agent configuration relax permissions constraint on the secret backend cmd, Group can read and exec
2024-12-19 23:58:41 UTC | TRACE | INFO | (pkg/util/log/log.go:840 in func1) | Finished resolving secrets
2024-12-19 23:58:41 UTC | TRACE | INFO | (pkg/util/log/log.go:845 in func1) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-19 23:58:41 UTC | TRACE | INFO | (pkg/util/log/log.go:845 in func1) | 0 Features detected from environment:
2024-12-19 23:58:41 UTC | TRACE | INFO | (comp/trace/config/setup.go:75 in LoadConfigFile) | Loaded configuration: /etc/datadog-agent/datadog.yaml
2024-12-19 23:58:41 UTC | TRACE | INFO | (comp/trace/config/setup.go:353 in applyDatadogConfig) | Activating non-local traffic automatically in containerized environment, trace-agent will listen on 0.0.0.0
2024-12-19 23:58:41 UTC | TRACE | INFO | (comp/trace/agent/impl/agent.go:107 in NewAgent) | trace-agent not enabled. Set the environment variable
DD_APM_ENABLED=true or add "apm_config.enabled: true" entry
to your datadog.yaml. Exiting...
[services.d] done.
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (pkg/config/setup/config.go:2051 in LoadCustom) | Starting to load the configuration
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (pkg/config/setup/config.go:1709 in LoadProxyFromEnv) | Loading proxy settings
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (pkg/config/env/environment_containers.go:242 in detectPodResources) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (pkg/config/env/environment_detection.go:124 in DetectFeatures) | 0 Features detected from environment:
2024-12-19 23:58:41 UTC | TRACE | INFO | (comp/core/tagger/impl-remote/remote.go:119 in func1) | remote tagger initialized successfully
2024-12-19 23:58:41 UTC | TRACE | INFO | ([email protected]+incompatible/retry.go:37 in RetryNotify) | unable to fetch auth token, will possibly retry: unable to read authentication token file: open /etc/datadog-agent/auth_token: no such file or directory
2024-12-19 23:58:41 UTC | TRACE | INFO | (comp/core/tagger/impl-remote/remote.go:122 in func2) | remote tagger stopped successfully
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/impl/store.go:100 in start) | workloadmeta store initialized successfully
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (pkg/config/remote/client/client.go:400 in pollLoop) | retrying the first update of remote-config state (could not acquire agent auth token: unable to read authentication token file: open /etc/datadog-agent/auth_token: no such file or directory)
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:127 in Start) | remote workloadmeta initialized successfully
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/impl/store.go:554 in startCandidates) | workloadmeta collector "remote-workloadmeta" started successfully
2024-12-19 23:58:41 UTC | SYS-PROBE | WARN | (comp/core/workloadmeta/collectors/internal/remote/generic.go:154 in func1) | unable to establish entity stream between agents, will possibly retry: unable to fetch authentication token: unable to read authentication token file: open /etc/datadog-agent/auth_token: no such file or directory
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:199 in Start) | remote tagger initialized successfully
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (pkg/runtime/runtime.go:28 in func1) | runtime: set GOMAXPROCS to: 4
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:551 in func1) | unable to fetch auth token, will possibly retry: unable to read authentication token file: open /etc/datadog-agent/auth_token: no such file or directory
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (cmd/system-probe/subcommands/run/command.go:327 in startSystemProbe) | starting system-probe v7.62.0-devel+git.480.f5d99e9
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (cmd/system-probe/subcommands/run/command.go:439 in logUserAndGroupID) | current user id/name: 0/root
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (cmd/system-probe/subcommands/run/command.go:442 in logUserAndGroupID) | current group id/name: 0/root
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (cmd/system-probe/subcommands/run/command.go:332 in startSystemProbe) | system probe not enabled. exiting
trace-agent exited with code 0, disabling
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/util/log/log.go:845 in func1) | running on platform: ubuntu
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/util/log/log.go:845 in func1) | running version: 7.62.0-devel
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/util/log/log.go:840 in func1) | Loading proxy settings
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/util/log/log.go:840 in func1) | Starting to resolve secrets
2024-12-19 23:58:41 UTC | PROCESS | WARN | (pkg/util/log/log.go:885 in func1) | Agent configuration relax permissions constraint on the secret backend cmd, Group can read and exec
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/util/log/log.go:840 in func1) | Finished resolving secrets
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/util/log/log.go:845 in func1) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/util/log/log.go:845 in func1) | 0 Features detected from environment:
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/config/setup/config.go:2051 in LoadCustom) | Starting to load the configuration
2024-12-19 23:58:41 UTC | SECURITY | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-19 23:58:41 UTC | SECURITY | INFO | (pkg/util/log/log.go:840 in func1) | Loading proxy settings
2024-12-19 23:58:41 UTC | SECURITY | INFO | (pkg/util/log/log.go:840 in func1) | Starting to resolve secrets
2024-12-19 23:58:41 UTC | SECURITY | WARN | (pkg/util/log/log.go:885 in func1) | Agent configuration relax permissions constraint on the secret backend cmd, Group can read and exec
2024-12-19 23:58:41 UTC | SECURITY | INFO | (pkg/util/log/log.go:840 in func1) | Finished resolving secrets
2024-12-19 23:58:41 UTC | SECURITY | INFO | (pkg/util/log/log.go:845 in func1) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-19 23:58:41 UTC | SECURITY | INFO | (pkg/util/log/log.go:845 in func1) | 0 Features detected from environment:
2024-12-19 23:58:41 UTC | SECURITY | INFO | (pkg/config/setup/config.go:2051 in LoadCustom) | Starting to load the configuration
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/rdnsquerier/impl/rdnsquerier.go:83 in NewComponent) | Reverse DNS Enrichment config: (enabled=false workers=0 chan_size=0 cache.enabled=true cache.entry_ttl=0 cache.clean_interval=0 cache.persist_interval=0 cache.max_retries=-1 cache.max_size=0 rate_limiter.enabled=true rate_limiter.limit_per_sec=0 rate_limiter.limit_throttled_per_sec=0 rate_limiter.throttle_error_threshold=0 rate_limiter.recovery_intervals=0 rate_limiter.recovery_interval=0)
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/core/tagger/impl/tagger.go:124 in NewComponent) | TaggerClient is created, defaultTagger type: *taggerimpl.localTagger
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/util/containers/metrics/system/collector_linux.go:87 in newSystemCollector) | Unable to initialize cgroup provider (cgroups not mounted?), err: unable to detect cgroup version from detected mount points: map[]
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/process/metadata/workloadmeta/extractor.go:84 in NewWorkloadMetaExtractor) | Instantiating a new WorkloadMetaExtractor
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-19 23:58:41 UTC | PROCESS | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/process/agent/agent_linux.go:61 in enabledHelper) | Process/Container Collection in the Process Agent will be deprecated in a future release and will instead be run in the Core Agent. Set process_config.run_in_core_agent.enabled to true to switch now.
2024-12-19 23:58:41 UTC | PROCESS | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-19 23:58:41 UTC | PROCESS | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/process/apiserver/apiserver.go:53 in newApiServer) | API server listening on localhost:6162
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:100 in start) | workloadmeta store initialized successfully
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "ecs_fargate" could not start. error: component workloadmeta-ecs_fargate is disabled: Agent is not running on ECS Fargate
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "ecs" could not start. error: component workloadmeta-ecs is disabled: Agent is not running on ECS EC2
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "crio" could not start. error: component workloadmeta-crio is disabled: Crio not detected
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "kubelet" could not start. error: component workloadmeta-kubelet is disabled: Agent is not running on Kubernetes
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "podman" could not start. error: component workloadmeta-podman is disabled: Podman not detected
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "cloudfoundry-container" could not start. error: component workloadmeta-cloudfoundry-container is disabled: Agent is not running on CloudFoundry
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "cloudfoundry-vm" could not start. error: component workloadmeta-cloudfoundry-vm is disabled: Agent is not running on CloudFoundry
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "kube_metadata" could not start. error: component workloadmeta-kube_metadata is disabled: Agent is not running on Kubernetes
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "docker" could not start. error: component workloadmeta-docker is disabled: Agent is not running on Docker
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "containerd" could not start. error: component workloadmeta-containerd is disabled: Agent is not running on containerd
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/core/tagger/collectors/workloadmeta_main.go:153 in stream) | workloadmeta tagger collector started
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process-events.datadoghq.com" (1 api key(s))
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-19 23:58:41 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/process/runner/submitter.go:182 in printStartMessage) | Starting CheckSubmitter for host=fv-az1445-96, endpoints=[https://process.datadoghq.com], events endpoints=[https://process-events.datadoghq.com]
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/process/runner/runner.go:286 in Run) | Starting process-agent with enabled checks=[process_discovery]
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/util/log/log.go:840 in func1) | Loading proxy settings
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to resolve secrets
2024-12-19 23:58:41 UTC | CORE | WARN | (pkg/util/log/log.go:885 in func1) | Agent configuration relax permissions constraint on the secret backend cmd, Group can read and exec
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/util/log/log.go:840 in func1) | Finished resolving secrets
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/util/log/log.go:845 in func1) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/util/log/log.go:845 in func1) | 0 Features detected from environment:
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/runtime/runtime.go:28 in func1) | runtime: set GOMAXPROCS to: 4
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/process/metadata/workloadmeta/extractor.go:84 in NewWorkloadMetaExtractor) | Instantiating a new WorkloadMetaExtractor
2024-12-19 23:58:41 UTC | CORE | INFO | (comp/core/tagger/impl/tagger.go:124 in NewComponent) | TaggerClient is created, defaultTagger type: *taggerimpl.localTagger
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/util/containers/metrics/system/collector_linux.go:87 in newSystemCollector) | Unable to initialize cgroup provider (cgroups not mounted?), err: unable to detect cgroup version from detected mount points: map[]
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/config/utils/endpoints.go:35 in getResolvedDDUrl) | 'site' and 'dd_url' are both set in config: setting main endpoint to 'dd_url': "https://app.datadoghq.com"
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/config/utils/endpoints.go:35 in getResolvedDDUrl) | 'site' and 'dd_url' are both set in config: setting main endpoint to 'dd_url': "https://app.datadoghq.com"
2024-12-19 23:58:41 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-19 23:58:41 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:232 in getDogStatsDWorkerAndPipelineCount) | Dogstatsd workers and pipelines count: 2 workers, 1 pipelines
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:150 in GetDogStatsDWorkerAndPipelineCount) | Dogstatsd configured to run with 2 workers and 1 pipelines
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/aggregator/time_sampler.go:54 in NewTimeSampler) | Creating TimeSampler #0
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/config/setup/config.go:2051 in LoadCustom) | Starting to load the configuration
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/api/security/security.go:142 in fetchAuthToken) | [/omnibus/src/datadog-agent/src/github.com/DataDog/datadog-agent/pkg/api/util/util.go:107] Creating a new authentication token
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/api/security/security.go:241 in saveAuthToken) | Saving a new authentication token in /etc/datadog-agent/auth_token
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/api/security/security.go:256 in saveAuthToken) | Wrote auth token in /etc/datadog-agent/auth_token
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/api/security/security.go:155 in fetchAuthToken) | Saved a new authentication token to /etc/datadog-agent/auth_token
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/api/security/cert/cert_getter.go:59 in fetchAgentIPCCert) | [/omnibus/src/datadog-agent/src/github.com/DataDog/datadog-agent/pkg/api/util/util.go:111] Creating a new IPC certificate
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/api/security/cert/cert_getter.go:107 in saveIPCCertKey) | Saving a new IPC certificate/key pair in /etc/datadog-agent/ipc_cert.pem
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/api/security/cert/cert_getter.go:126 in saveIPCCertKey) | Wrote IPC certificate/key pair in /etc/datadog-agent/ipc_cert.pem
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/api/security/cert/cert_getter.go:73 in fetchAgentIPCCert) | Saved a new IPC certificate/key pair to /etc/datadog-agent/ipc_cert.pem
2024-12-19 23:58:41 UTC | CORE | INFO | (comp/logs/agent/agentimpl/agent.go:176 in newLogsAgent) | logs-agent disabled
2024-12-19 23:58:41 UTC | PROCESS | INFO | (pkg/process/runner/runner.go:243 in logCheckDuration) | Finished process_discovery check #1 in 22.274223ms
2024-12-19 23:58:41 UTC | CORE | INFO | (comp/rdnsquerier/impl/rdnsquerier.go:83 in NewComponent) | Reverse DNS Enrichment config: (enabled=false workers=0 chan_size=0 cache.enabled=true cache.entry_ttl=0 cache.clean_interval=0 cache.persist_interval=0 cache.max_retries=-1 cache.max_size=0 rate_limiter.enabled=true rate_limiter.limit_per_sec=0 rate_limiter.limit_throttled_per_sec=0 rate_limiter.throttle_error_threshold=0 rate_limiter.recovery_intervals=0 rate_limiter.recovery_interval=0)
2024-12-19 23:58:41 UTC | CORE | INFO | (comp/netflow/server/server.go:63 in newServer) | Reverse DNS Enrichment is disabled for NDM NetFlow
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/collector/python/init.go:337 in resolvePythonExecPath) | Using '/opt/datadog-agent/embedded' as Python home
2024-12-19 23:58:41 UTC | CORE | INFO | (pkg/collector/python/init.go:403 in Initialize) | Initializing rtloader with Python 3 /opt/datadog-agent/embedded
2024-12-19 23:58:41 UTC | PROCESS | ERROR | (comp/forwarder/defaultforwarder/transaction/transaction.go:433 in internalProcess) | API Key invalid, dropping transaction for https://process.datadoghq.com/api/v1/discovery
2024-12-19 23:58:41 UTC | PROCESS | ERROR | (pkg/process/runner/runner.go:501 in readResponseStatuses) | [process_discovery] Invalid response from https://process.datadoghq.com: 403 -> <nil>
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:171 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :53057: connect: connection refused"
2024-12-19 23:58:41 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:572 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :53057: connect: connection refused"
2024-12-19 23:58:42 UTC | SYS-PROBE | INFO | (pkg/config/remote/client/client.go:434 in pollLoop) | retrying the first update of remote-config state (rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp [::1]:53057: connect: connection refused")
2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/python/datadog_agent.go:148 in LogMessage) | - | (ddyaml.py:142) | monkey patching yaml.load...
2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/python/datadog_agent.go:148 in LogMessage) | - | (ddyaml.py:146) | monkey patching yaml.load_all...
2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/python/datadog_agent.go:148 in LogMessage) | - | (ddyaml.py:150) | monkey patching yaml.dump_all... (affects all yaml dump operations)
2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/embed_python.go:22 in InitPython) | Embedding Python 3.12.6 (main, Dec 19 2024, 08:06:36) [GCC 11.4.0]
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-19 23:58:42 UTC | CORE | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-19 23:58:42 UTC | CORE | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/config/remote/service/util.go:110 in func1) | Missing meta bucket
2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/config/remote/service/util.go:131 in openCacheDB) | Different agent version or API Key detected
2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/config/remote/service/util.go:48 in recreate) | Clear remote configuration database
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/gui/guiimpl/gui.go:109 in newGui) | GUI server port -1 specified: not starting the GUI.
2024-12-19 23:58:42 UTC | CORE | WARN | (pkg/config/model/viper.go:264 in checkKnownKey) | config key runtime_security_config.sbom.enabled is unknown
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:100 in start) | workloadmeta store initialized successfully
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "process-collector" could not start. error: collector process-collector is not enabled
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "kubelet" could not start. error: component workloadmeta-kubelet is disabled: Agent is not running on Kubernetes
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "podman" could not start. error: component workloadmeta-podman is disabled: Podman not detected
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "cloudfoundry-vm" could not start. error: component workloadmeta-cloudfoundry-vm is disabled: Agent is not running on CloudFoundry
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "containerd" could not start. error: component workloadmeta-containerd is disabled: Agent is not running on containerd
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "docker" could not start. error: component workloadmeta-docker is disabled: Agent is not running on Docker
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "kube_metadata" could not start. error: component workloadmeta-kube_metadata is disabled: Agent is not running on Kubernetes
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "ecs" could not start. error: component workloadmeta-ecs is disabled: Agent is not running on ECS EC2
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "ecs_fargate" could not start. error: component workloadmeta-ecs_fargate is disabled: Agent is not running on ECS Fargate
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "cloudfoundry-container" could not start. error: component workloadmeta-cloudfoundry-container is disabled: Agent is not running on CloudFoundry
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "crio" could not start. error: component workloadmeta-crio is disabled: Crio not detected
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "local-process-collector" could not start. error: component workloadmeta-process is disabled: language detection or core agent process collection is disabled
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/autodiscovery/listeners/types.go:84 in Register) | Service listener factory cloudfoundry_bbs does not exist.
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/autodiscovery/listeners/types.go:84 in Register) | Service listener factory kube_endpoints does not exist.
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/autodiscovery/listeners/types.go:84 in Register) | Service listener factory kube_services does not exist.
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/autodiscovery/providers/config_reader.go:172 in read) | Searching for configuration files at: /etc/datadog-agent/conf.d
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/tagger/collectors/workloadmeta_main.go:153 in stream) | workloadmeta tagger collector started
2024-12-19 23:58:42 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:171 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :53057: connect: connection refused"
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/autodiscovery/providers/config_reader.go:172 in read) | Searching for configuration files at: /opt/datadog-agent/bin/agent/dist/conf.d
2024-12-19 23:58:42 UTC | CORE | WARN | (comp/core/autodiscovery/providers/config_reader.go:176 in read) | Skipping, open /opt/datadog-agent/bin/agent/dist/conf.d: no such file or directory
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/autodiscovery/providers/config_reader.go:172 in read) | Searching for configuration files at:
2024-12-19 23:58:42 UTC | CORE | WARN | (comp/core/autodiscovery/providers/config_reader.go:176 in read) | Skipping, open : no such file or directory
2024-12-19 23:58:42 UTC | CORE | ERROR | (pkg/config/autodiscovery/autodiscovery.go:81 in DiscoverComponentsFromConfig) | Error unmarshalling snmp listener config. Error: no config given for snmp_listener
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/autodiscovery/autodiscoveryimpl/autoconfig.go:529 in initListenerCandidates) | static config listener successfully started
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/autodiscovery/autodiscoveryimpl/autoconfig.go:529 in initListenerCandidates) | environment listener successfully started
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://7-62-0-app.agent.datadoghq.com" (1 api key(s))
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://orchestrator.datadoghq.com" (1 api key(s))
2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/runner/runner.go:100 in ensureMinWorkers) | Runner 1 added 4 workers (total: 4)
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process-events.datadoghq.com" (1 api key(s))
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-19 23:58:42 UTC | CORE | INFO | (comp/dogstatsd/listeners/uds_datagram.go:65 in NewUDSDatagramListener) | dogstatsd-uds: /var/run/datadog/dsd.socket successfully …r started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
E 2024-12-19 23:58:42 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
E 2024-12-19 23:58:42 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
E 2024-12-19 23:58:42 UTC | CORE | INFO | (comp/dogstatsd/listeners/uds_datagram.go:65 in NewUDSDatagramListener) | dogstatsd-uds: /var/run/datadog/dsd.socket successfully initialized
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:232 in getDogStatsDWorkerAndPipelineCount) | Dogstatsd workers and pipelines count: 2 workers, 1 pipelines
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:150 in GetDogStatsDWorkerAndPipelineCount) | Dogstatsd configured to run with 2 workers and 1 pipelines
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:232 in getDogStatsDWorkerAndPipelineCount) | Dogstatsd workers and pipelines count: 2 workers, 1 pipelines
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:150 in GetDogStatsDWorkerAndPipelineCount) | Dogstatsd configured to run with 2 workers and 1 pipelines
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:232 in getDogStatsDWorkerAndPipelineCount) | Dogstatsd workers and pipelines count: 2 workers, 1 pipelines
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:150 in GetDogStatsDWorkerAndPipelineCount) | Dogstatsd configured to run with 2 workers and 1 pipelines
E 2024-12-19 23:58:42 UTC | CORE | INFO | (comp/dogstatsd/listeners/udp.go:128 in listen) | dogstatsd-udp: starting to listen on 127.0.0.1:8125
E 2024-12-19 23:58:42 UTC | CORE | INFO | (comp/dogstatsd/listeners/uds_datagram.go:79 in listen) | dogstatsd-uds: starting to listen on /var/run/datadog/dsd.socket
E 2024-12-19 23:58:42 UTC | CORE | INFO | (comp/remote-config/rcservice/rcserviceimpl/rcservice.go:115 in func1) | remote config service started
E 2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/agenttelemetry/impl/agenttelemetry.go:521 in start) | Starting agent telemetry for 2 schedules and 4 profiles
E 2024-12-19 23:58:42 UTC | CORE | INFO | (comp/api/api/apiimpl/server.go:31 in startServer) | Started HTTP server 'CMD API Server' on 127.0.0.1:53057
E 2024-12-19 23:58:42 UTC | CORE | INFO | (cmd/agent/subcommands/run/command.go:512 in startAgent) | Starting Datadog Agent v7.62.0-devel+git.480.f5d99e9
E 2024-12-19 23:58:42 UTC | CORE | INFO | (cmd/agent/subcommands/run/command.go:538 in startAgent) | Hostname is: fv-az1445-96
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/util/installinfo/install_info.go:94 in logVersionHistoryToFile) | Cannot read file: /opt/datadog-agent/run/version-history.json, will create a new one. open /opt/datadog-agent/run/version-history.json: no such file or directory
E 2024-12-19 23:58:42 UTC | CORE | WARN | (pkg/collector/python/check_context.go:54 in initializeCheckContext) | Log receiver not provided. Logs from integrations will not be collected.
E 2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/autodiscovery/autodiscoveryimpl/config_poller.go:170 in collectOnce) | file provider: collected 65 new configurations, removed 0
E 2024-12-19 23:58:42 UTC | CORE | ERROR | (pkg/collector/scheduler.go:213 in getChecks) | Unable to load a check from instance of config 'go_pprof_scraper': Python Check Loader: unable to import module 'go_pprof_scraper': No module named 'go_pprof_scraper'; Core Check Loader: Check go_pprof_scraper not found in Catalog
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/config/remote/client/client.go:400 in pollLoop) | retrying the first update of remote-config state (rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp [::1]:53057: connect: connection refused")
E 2024-12-19 23:58:42 UTC | CORE | INFO | (comp/core/autodiscovery/autodiscoveryimpl/autoconfig.go:409 in LoadAndRun) | Started config provider "file"
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check container_image with an interval of 15s
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check container_lifecycle with an interval of 15s
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check cpu with an interval of 15s
E 2024-12-19 23:58:42 UTC | CORE | WARN | (comp/forwarder/defaultforwarder/forwarder_health.go:297 in checkValidAPIKey) | api_key '***************************aaaaa' for domain https://api.datadoghq.com is invalid
E 2024-12-19 23:58:42 UTC | CORE | ERROR | (comp/forwarder/defaultforwarder/forwarder_health.go:148 in healthCheckLoop) | No valid api key found, reporting the forwarder as unhealthy.
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check disk:67cc0574430a16ba with an interval of 15s
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check file_handle with an interval of 15s
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check io with an interval of 15s
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check load with an interval of 15s
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check memory with an interval of 15s
E 2024-12-19 23:58:42 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/transaction/transaction.go:454 in internalProcess) | Successfully posted payload to "https://7-62-0-app.agent.datadoghq.com/intake/" (202 Accepted), the agent will only log transaction success every 500 transactions
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check network:4b0649b7e11f0772 with an interval of 15s
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/util/cloudproviders/cloudproviders.go:89 in GetCloudProviderNTPHosts) | Detected Azure cloud provider environment with NTP server(s) at ["time.windows.com"]
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/corechecks/net/ntp/ntp.go:131 in parse) | Using NTP servers: [ time.windows.com ]
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check ntp:3c427a42a70bbf8 with an interval of 15m0s
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check service_discovery with an interval of 1m0s
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check telemetry with an interval of 15s
E 2024-12-19 23:58:42 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check uptime with an interval of 15s
E 2024-12-19 23:58:42 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:572 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :53057: connect: connection refused"
E 2024-12-19 23:58:43 UTC | CORE | INFO | (comp/metadata/host/hostimpl/hosttags/tags.go:165 in Get) | Unable to get host tags from source: gce - using cached host tags
E 2024-12-19 23:58:43 UTC | CORE | INFO | (comp/metadata/host/hostimpl/utils/host.go:105 in getNetworkMeta) | could not get network metadata: could not detect network ID
E 2024-12-19 23:58:43 UTC | SYS-PROBE | INFO | (pkg/config/remote/client/client.go:434 in pollLoop) | retrying the first update of remote-config state (rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp [::1]:53057: connect: connection refused")
E 2024-12-19 23:58:43 UTC | PROCESS | INFO | (pkg/util/containers/metrics/provider/registry.go:102 in collectorDiscovery) | Container metrics provider discovery process finished
E 2024-12-19 23:58:43 UTC | CORE | INFO | (pkg/util/containers/metrics/provider/registry.go:102 in collectorDiscovery) | Container metrics provider discovery process finished
E 2024-12-19 23:58:43 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:40 in CheckStarted) | check:container_image | Running check...
E 2024-12-19 23:58:43 UTC | CORE | INFO | (pkg/collector/corechecks/containerimage/check.go:136 in Run) | Starting long-running check "container_image"
E 2024-12-19 23:58:43 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:59 in CheckFinished) | check:container_image | Done running check
E 2024-12-19 23:58:43 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:171 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :53057: connect: connection refused"
E 2024-12-19 23:58:43 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:572 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :53057: connect: connection refused"
E 2024-12-19 23:58:43 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:40 in CheckStarted) | check:ntp | Running check...
E 2024-12-19 23:58:43 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:40 in CheckStarted) | check:service_discovery | Running check...
E 2024-12-19 23:58:43 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:59 in CheckFinished) | check:service_discovery | Done running check
E 2024-12-19 23:58:43 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:59 in CheckFinished) | check:ntp | Done running check
E 2024-12-19 23:58:43 UTC | SECURITY | INFO | (pkg/security/utils/hostname.go:97 in GetHostnameWithContextAndFallback) | Hostname is: fv-az1445-96
E 2024-12-19 23:58:43 UTC | SECURITY | INFO | (subcommands/runtime/command.go:704 in StartRuntimeSecurity) | Datadog runtime security agent disabled by config
E 2024-12-19 23:58:43 UTC | SECURITY | INFO | (pkg/security/utils/hostname.go:97 in GetHostnameWithContextAndFallback) | Hostname is: fv-az1445-96
E 2024-12-19 23:58:43 UTC | SECURITY | INFO | (comp/core/workloadmeta/impl/store.go:100 in start) | workloadmeta store initialized successfully
E 2024-12-19 23:58:43 UTC | SECURITY | INFO | (subcommands/start/command.go:267 in RunAgent) | All security-agent components are deactivated, exiting
E 2024-12-19 23:58:43 UTC | SECURITY | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:127 in Start) | remote workloadmeta initialized successfully
E 2024-12-19 23:58:43 UTC | SECURITY | INFO | (comp/core/workloadmeta/impl/store.go:554 in startCandidates) | workloadmeta collector "remote-workloadmeta" started successfully
E 2024-12-19 23:58:43 UTC | SECURITY | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:175 in func1) | workloadmeta stream established successfully
E grep: /etc/datadog-agent/system-probe.yaml: No such file or directory
E grep: /etc/datadog-agent/system-probe.yaml: No such file or directory
E
E
E Error: could not load go_pprof_scraper:
E * Python Check Loader: unable to import module 'go_pprof_scraper': No module named 'go_pprof_scraper'
E * Core Check Loader: Check go_pprof_scraper not found in Catalog
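The ERROR at pkg/collector/scheduler.go:213 above is the real failure: the e2e Agent container has a go_pprof_scraper config but no installed go_pprof_scraper Python package, so both loaders give up. A hedged debugging sketch, assuming the e2e container follows the same naming pattern as the zabbix run earlier in this report; the container name below is hypothetical:

    # List the integration packages installed in the e2e Agent container;
    # 'agent integration freeze' prints the embedded Python's installed checks.
    import subprocess

    container = "dd_go_pprof_scraper_py3.12"  # hypothetical, inferred from dd_zabbix_py3.12
    subprocess.run(["docker", "exec", container, "agent", "integration", "freeze"], check=False)

If go_pprof_scraper is absent from that list, the check wheel was never installed into the environment, which matches the import error above. The ddev-side traceback that follows shows how this surfaced as a test error.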
E Traceback (most recent call last):
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/ddev/cli/__init__.py", line 165, in main
E     return ddev(prog_name='ddev', windows_expand_args=False)
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/core.py", line 1157, in __call__
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/core.py", line 1078, in main
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/core.py", line 1434, in invoke
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/core.py", line 783, in invoke
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/click/decorators.py", line 45, in new_func
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/ddev/cli/env/agent.py", line 78, in agent
E     agent.invoke(full_args)
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/ddev/e2e/agent/docker.py", line 301, in invoke
E     self.run_command(['agent', *args])
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/ddev/e2e/agent/docker.py", line 304, in run_command
E     self._run_command(self._format_command([*args]), check=True)
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/ddev/e2e/agent/docker.py", line 100, in _run_command
E     return self.platform.run_command(command, **kwargs)
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/site-packages/ddev/utils/platform.py", line 87, in run_command
E     return self.modules.subprocess.run(self.format_for_subprocess(
E   File "/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/subprocess.py", line 571, in run
E     raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr)
E CalledProcessError: Command '['docker', 'exec', 'dd_go_pprof_scraper_py3.12', 'agent', 'check', 'go_pprof_scraper', '--json']' returned non-zero exit status 255.
E Error: no valid check found
E
E Could not find valid check output
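For reference, this whole failure reduces to one subprocess call. A minimal sketch reconstructed from the CalledProcessError above — the container name and arguments are copied verbatim from the log; only running the call standalone is hypothetical:

import subprocess

# Command ddev executed inside the E2E container (copied from the
# CalledProcessError above). The agent exits with status 255 because
# neither the Python Check Loader nor the Core Check Loader can resolve
# 'go_pprof_scraper', and check=True converts that non-zero exit status
# into subprocess.CalledProcessError.
subprocess.run(
    ["docker", "exec", "dd_go_pprof_scraper_py3.12",
     "agent", "check", "go_pprof_scraper", "--json"],
    check=True,
)

Exit status 255 here is the agent reporting that the check could not be loaded at all, which matches the two loader errors above rather than a failure inside the check itself.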
Check warning on line 0 in trino.tests.test_e2e
github-actions / Test Results
1 out of 2 runs failed: test_e2e (trino.tests.test_e2e)
test-results/Trino/test-e2e-py3.12.xml [took 4s]
Raw output
AssertionError: Needed at least 1 candidates for 'trino.execution.cpu_input_byte_rate.all_time.avg', got 0
tests/test_e2e.py:21: in test_e2e
    aggregator.assert_metric(metric)
../../../../.local/share/hatch/env/virtual/datadog-trino/yIIfj4lC/py3.12/lib/python3.12/site-packages/datadog_checks/base/stubs/aggregator.py:370: in assert_metric
    self._assert(condition, msg=msg, expected_stub=expected_metric, submitted_elements=self._metrics)
../../../../.local/share/hatch/env/virtual/datadog-trino/yIIfj4lC/py3.12/lib/python3.12/site-packages/datadog_checks/base/stubs/aggregator.py:412: in _assert
    assert condition, new_msg
E AssertionError: Needed at least 1 candidates for 'trino.execution.cpu_input_byte_rate.all_time.avg', got 0
E Expected:
E MetricStub(name='trino.execution.cpu_input_byte_rate.all_time.avg', type=None, value=None, tags=None, hostname=None, device=None, flush_first_value=None)
E Similar submitted:
E Score Most similar
E 0.80 MetricStub(name='trino.execution.wall_input_bytes_rate.one_minute.avg', type=0, value=0.0, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.80 MetricStub(name='trino.execution.wall_input_bytes_rate.one_minute.avg', type=0, value=0.0, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.80 MetricStub(name='trino.execution.cpu_input_byte_rate.one_minute.count', type=0, value=0.0, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.80 MetricStub(name='trino.execution.cpu_input_byte_rate.one_minute.count', type=0, value=0.0, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.77 MetricStub(name='trino.execution.execution_time.all_time.avg', type=0, value=104.87757285714285, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.77 MetricStub(name='trino.execution.execution_time.all_time.avg', type=0, value=104.87757285714285, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.76 MetricStub(name='trino.execution.wall_input_bytes_rate.one_minute.max', type=0, value=0.0, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.76 MetricStub(name='trino.execution.wall_input_bytes_rate.one_minute.max', type=0, value=0.0, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.76 MetricStub(name='trino.execution.cpu_input_byte_rate.one_minute.total', type=0, value=0.0, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.76 MetricStub(name='trino.execution.cpu_input_byte_rate.one_minute.total', type=0, value=0.0, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.74 MetricStub(name='trino.execution.wall_input_bytes_rate.one_minute.p95', type=0, value=0.0, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.74 MetricStub(name='trino.execution.wall_input_bytes_rate.one_minute.p95', type=0, value=0.0, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.74 MetricStub(name='trino.execution.wall_input_bytes_rate.one_minute.p75', type=0, value=0.0, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.74 MetricStub(name='trino.execution.wall_input_bytes_rate.one_minute.p75', type=0, value=0.0, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E 0.74 MetricStub(name='trino.execution.wall_input_bytes_rate.one_minute.min', type=0, value=0.0, tags=['dd.internal.jmx_check_name:trino', 'instance:trino-localhost-9080', 'jmx_domain:trino.execution', 'name:QueryManager'], hostname='default', device=None, flush_first_value=False)
E assert False
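The failing assertion comes from the aggregator stub's default at_least=1. A minimal sketch of what tests/test_e2e.py:21 likely looks like, assuming the usual loop over a metric list — the METRICS list and fixture wiring are illustrative, not the repo's actual code:

import pytest

# Hypothetical shape of trino/tests/test_e2e.py, inferred from the traceback
# above. assert_metric defaults to at_least=1, which is exactly what produces
# "Needed at least 1 candidates ... got 0" when the JMX bean has not yet
# reported the all_time rollup by the time the check runs.
METRICS = [
    'trino.execution.cpu_input_byte_rate.all_time.avg',  # the metric missing above
]


@pytest.mark.e2e
def test_e2e(dd_agent_check):
    aggregator = dd_agent_check()
    for metric in METRICS:
        aggregator.assert_metric(metric)

Note that only the one_minute rollups of this metric were submitted in the failing run (see the "Similar submitted" list above), which points at timing in the JMX bean rather than a renamed metric.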
Check failure on line 0 in gnatsd_streaming.tests.test_gnatsd_streaming
github-actions / Test Results
test_metrics (gnatsd_streaming.tests.test_gnatsd_streaming) with error
test-results/Gnatsd Streaming/test-unit-py3.12.xml [took 2m 27s]
Raw output
failed on setup with "tenacity.RetryError: RetryError[<Future at 0x7f1a1e91bb00 state=finished raised RetryError>]"
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:478: in __call__
    result = fn(*args, **kwargs)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/env.py:98: in set_up_with_retry
    return set_up()
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/env.py:82: in set_up
    condition()
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/conditions.py:171: in __call__
    raise RetryError(
E datadog_checks.dev.errors.RetryError: Command: ['docker', 'compose', '-f', '/home/runner/work/integrations-extras/integrations-extras/gnatsd_streaming/tests/docker/docker-compose.yml', 'logs']
E Failed to match `1` of the patterns.
E Provided patterns: - re.compile('test.channel3', re.MULTILINE)
E Missing patterns: - re.compile('test.channel3', re.MULTILINE)
E Exit code: 0
E Captured Output: nats_secondary-1 | [1] 2024/12/19 23:59:32.426157 [INF] STREAM: Starting nats-streaming-server[test-cluster] version 0.25.6
E nats_secondary-1 | [1] 2024/12/19 23:59:32.426191 [INF] STREAM: ServerID: xb5JWq2BnDV5MEVEnXUqh9
E nats_secondary-1 | [1] 2024/12/19 23:59:32.426194 [INF] STREAM: Go version: go1.20.11
E nats_secondary-1 | [1] 2024/12/19 23:59:32.426196 [INF] STREAM: Git commit: [d1a98ca]
E nats_secondary-1 | [1] 2024/12/19 23:59:32.427931 [INF] Starting nats-server
E nats_secondary-1 | [1] 2024/12/19 23:59:32.427937 [INF] Version: 2.9.24
E nats_secondary-1 | [1] 2024/12/19 23:59:32.427939 [INF] Git: [e43cfb4]
E nats_secondary-1 | [1] 2024/12/19 23:59:32.427941 [INF] Cluster: xb5JWq2BnDV5MEVEnXUqjK
E nats_secondary-1 | [1] 2024/12/19 23:59:32.427943 [INF] Name: NCZEFSW6II3TV7U3465PRVSHIMURVS3VLATGYQ2YVACXBOELHWUS4KYH
E nats_secondary-1 | [1] 2024/12/19 23:59:32.427944 [INF] ID: NCZEFSW6II3TV7U3465PRVSHIMURVS3VLATGYQ2YVACXBOELHWUS4KYH
E nats_secondary-1 | [1] 2024/12/19 23:59:32.428381 [INF] Starting http monitor on 0.0.0.0:8223
E nats_secondary-1 | [1] 2024/12/19 23:59:32.428522 [INF] Listening for client connections on 0.0.0.0:4222
E nats_secondary-1 | [1] 2024/12/19 23:59:32.428700 [INF] Server is ready
E nats_secondary-1 | [1] 2024/12/19 23:59:32.428755 [INF] Cluster name is xb5JWq2BnDV5MEVEnXUqjK
E nats_secondary-1 | [1] 2024/12/19 23:59:32.428763 [WRN] Cluster name was dynamically generated, consider setting one
E nats_secondary-1 | [1] 2024/12/19 23:59:32.428793 [INF] Listening for route connections on 0.0.0.0:5222
E nats_secondary-1 | [1] 2024/12/19 23:59:32.430193 [INF] 172.19.0.2:5222 - rid:4 - Route connection created
E nats_secondary-1 | [1] 2024/12/19 23:59:32.430467 [INF] 172.19.0.2:5222 - rid:4 - Router connection closed: Cluster Name Conflict
E nats_secondary-1 | [1] 2024/12/19 23:59:32.455914 [INF] STREAM: Starting in standby mode
E nats_secondary-1 | [1] 2024/12/19 23:59:33.253532 [INF] 172.19.0.2:54444 - rid:10 - Route connection created
E nats_secondary-1 | [1] 2024/12/19 23:59:33.459158 [INF] 172.19.0.2:5222 - rid:11 - Route connection created
E nats_secondary-1 | [1] 2024/12/19 23:59:33.459451 [INF] 172.19.0.2:5222 - rid:11 - Router connection closed: Duplicate Route
E nats_channels-1 | github.com/nats-io/go-nats-streaming (download)
E nats_channels-1 | github.com/nats-io/go-nats (download)
E nats_channels-1 | github.com/nats-io/nkeys (download)
E nats_channels-1 | get "golang.org/x/crypto/curve25519": found meta tag get.metaImport{Prefix:"golang.org/x/crypto", VCS:"git", RepoRoot:"https://go.googlesource.com/crypto"} at //golang.org/x/crypto/curve25519?go-get=1
E nats_channels-1 | get "golang.org/x/crypto/curve25519": verifying non-authoritative meta tag
E nats_channels-1 | golang.org/x/crypto (download)
E nats_channels-1 | package crypto/ecdh: unrecognized import path "crypto/ecdh": import path does not begin with hostname
E nats_channels-1 | get "golang.org/x/crypto/nacl/box": found meta tag get.metaImport{Prefix:"golang.org/x/crypto", VCS:"git", RepoRoot:"https://go.googlesource.com/crypto"} at //golang.org/x/crypto/nacl/box?go-get=1
E nats_channels-1 | get "golang.org/x/crypto/nacl/box": verifying non-authoritative meta tag
E nats_channels-1 | get "golang.org/x/sys/cpu": found meta tag get.metaImport{Prefix:"golang.org/x/sys", VCS:"git", RepoRoot:"https://go.googlesource.com/sys"} at //golang.org/x/sys/cpu?go-get=1
E nats_channels-1 | get "golang.org/x/sys/cpu": verifying non-authoritative meta tag
E nats_channels-1 | golang.org/x/sys (download)
E nats_channels-1 | package golang.org/x/sys/cpu: C source files not allowed when not using cgo or SWIG: cpu_gccgo_x86.c
E nats_channels-1 | github.com/nats-io/nuid (download)
E nats_channels-1 | github.com/gogo/protobuf (download)
E nats_channels-1 | src/golang.org/x/crypto/curve25519/curve25519.go:13:8: cannot find package "crypto/ecdh" in any of:
E nats_channels-1 | /usr/local/go/src/crypto/ecdh (from $GOROOT)
E nats_channels-1 | /go/src/crypto/ecdh (from $GOPATH)
E nats_channels-1 | package command-line-arguments
E nats_channels-1 | imports github.com/nats-io/go-nats
E nats_channels-1 | imports github.com/nats-io/nkeys
E nats_channels-1 | imports golang.org/x/crypto/nacl/box
E nats_channels-1 | imports golang.org/x/crypto/blake2b
E nats_channels-1 | imports golang.org/x/sys/cpu: C source files not allowed when not using cgo or SWIG: cpu_gccgo_x86.c
E nats_channels-1 | src/golang.org/x/crypto/curve25519/curve25519.go:13:8: cannot find package "crypto/ecdh" in any of:
E nats_channels-1 | /usr/local/go/src/crypto/ecdh (from $GOROOT)
E nats_channels-1 | /go/src/crypto/ecdh (from $GOPATH)
E nats_channels-1 | package command-line-arguments
E nats_channels-1 | imports github.com/nats-io/go-nats
E nats_channels-1 | imports github.com/nats-io/nkeys
E nats_channels-1 | imports golang.org/x/crypto/nacl/box
E nats_channels-1 | imports golang.org/x/crypto/blake2b
E nats_channels-1 | imports golang.org/x/sys/cpu: C source files not allowed when not using cgo or SWIG: cpu_gccgo_x86.c
E nats_channels-1 | src/golang.org/x/crypto/curve25519/curve25519.go:13:8: cannot find package "crypto/ecdh" in any of:
E nats_channels-1 | /usr/local/go/src/crypto/ecdh (from $GOROOT)
E nats_channels-1 | /go/src/crypto/ecdh (from $GOPATH)
E nats_channels-1 | package command-line-arguments
E nats_channels-1 | imports github.com/nats-io/go-nats
E nats_channels-1 | imports github.com/nats-io/nkeys
E nats_channels-1 | imports golang.org/x/crypto/nacl/box
E nats_channels-1 | imports golang.org/x/crypto/blake2b
E nats_channels-1 | imports golang.org/x/sys/cpu: C source files not allowed when not using cgo or SWIG: cpu_gccgo_x86.c
E nats_primary-1 | [1] 2024/12/19 23:59:32.231377 [INF] STREAM: Starting nats-streaming-server[test-cluster] version 0.25.6
E nats_primary-1 | [1] 2024/12/19 23:59:32.231408 [INF] STREAM: ServerID: b4I51r9LntRwxNM4g6iVVY
E nats_primary-1 | [1] 2024/12/19 23:59:32.231411 [INF] STREAM: Go version: go1.20.11
E nats_primary-1 | [1] 2024/12/19 23:59:32.231413 [INF] STREAM: Git commit: [d1a98ca]
E nats_primary-1 | [1] 2024/12/19 23:59:32.233505 [INF] Starting nats-server
E nats_primary-1 | [1] 2024/12/19 23:59:32.233512 [INF] Version: 2.9.24
E nats_primary-1 | [1] 2024/12/19 23:59:32.233515 [INF] Git: [e43cfb4]
E nats_primary-1 | [1] 2024/12/19 23:59:32.233518 [INF] Cluster: b4I51r9LntRwxNM4g6iVam
E nats_primary-1 | [1] 2024/12/19 23:59:32.233520 [INF] Name: NDCAIFADDVFAFIDG5Q2REWI7CBE2S6AS6ESXILK2CFKJYDGRKPAYOSFE
E nats_primary-1 | [1] 2024/12/19 23:59:32.233522 [INF] ID: NDCAIFADDVFAFIDG5Q2REWI7CBE2S6AS6ESXILK2CFKJYDGRKPAYOSFE
E nats_primary-1 | [1] 2024/12/19 23:59:32.233929 [INF] Starting http monitor on 0.0.0.0:8222
E nats_primary-1 | [1] 2024/12/19 23:59:32.233993 [INF] Listening for client connections on 0.0.0.0:4222
E nats_primary-1 | [1] 2024/12/19 23:59:32.234147 [INF] Server is ready
E nats_primary-1 | [1] 2024/12/19 23:59:32.234745 [INF] Cluster name is b4I51r9LntRwxNM4g6iVam
E nats_primary-1 | [1] 2024/12/19 23:59:32.234755 [WRN] Cluster name was dynamically generated, consider setting one
E nats_primary-1 | [1] 2024/12/19 23:59:32.234785 [INF] Listening for route connections on 0.0.0.0:5222
E nats_primary-1 | [1] 2024/12/19 23:59:32.252174 [ERR] Error trying to connect to route (attempt 1): lookup for host "nats_secondary": lookup nats_secondary on 127.0.0.11:53: server misbehaving
E nats_primary-1 | [1] 2024/12/19 23:59:32.261745 [INF] STREAM: Starting in standby mode
E nats_primary-1 | [1] 2024/12/19 23:59:32.430128 [INF] 172.19.0.3:38008 - rid:9 - Route connection created
E nats_primary-1 | [1] 2024/12/19 23:59:32.430537 [INF] Cluster name updated to xb5JWq2BnDV5MEVEnXUqjK
E nats_primary-1 | [1] 2024/12/19 23:59:32.430755 [INF] 172.19.0.3:38008 - rid:9 - Router connection closed: Client Closed
E nats_primary-1 | [1] 2024/12/19 23:59:33.253687 [INF] 172.19.0.3:5222 - rid:10 - Route connection created
E nats_primary-1 | [1] 2024/12/19 23:59:33.459196 [INF] 172.19.0.3:38016 - rid:11 - Route connection created
E nats_primary-1 | [1] 2024/12/19 23:59:33.459551 [INF] 172.19.0.3:38016 - rid:11 - Router connection closed: Duplicate Route
E nats_primary-1 | [1] 2024/12/19 23:59:33.590937 [INF] STREAM: Server is active
E nats_primary-1 | [1] 2024/12/19 23:59:33.590965 [INF] STREAM: Recovering the state...
E nats_primary-1 | [1] 2024/12/19 23:59:33.591932 [INF] STREAM: Recovered 0 channel(s)
E nats_primary-1 | [1] 2024/12/19 23:59:33.592228 [INF] STREAM: Message store is FILE
E nats_primary-1 | [1] 2024/12/19 23:59:33.592239 [INF] STREAM: Store location: /usr/share/nats/data
E nats_primary-1 | [1] 2024/12/19 23:59:33.592303 [INF] STREAM: ---------- Store Limits ----------
E nats_primary-1 | [1] 2024/12/19 23:59:33.592655 [INF] STREAM: Channels: 100 *
E nats_primary-1 | [1] 2024/12/19 23:59:33.592698 [INF] STREAM: --------- Channels Limits --------
E nats_primary-1 | [1] 2024/12/19 23:59:33.592718 [INF] STREAM: Subscriptions: 1000 *
E nats_primary-1 | [1] 2024/12/19 23:59:33.592726 [INF] STREAM: Messages : 1000000 *
E nats_primary-1 | [1] 2024/12/19 23:59:33.592729 [INF] STREAM: Bytes : 976.56 MB *
E nats_primary-1 | [1] 2024/12/19 23:59:33.592732 [INF] STREAM: Age : unlimited *
E nats_primary-1 | [1] 2024/12/19 23:59:33.592734 [INF] STREAM: Inactivity : unlimited *
E nats_primary-1 | [1] 2024/12/19 23:59:33.592736 [INF] STREAM: ----------------------------------
E nats_primary-1 | [1] 2024/12/19 23:59:33.592739 [INF] STREAM: Streaming Server is ready
E time="2024-12-20T00:00:35Z" level=warning msg="/home/runner/work/integrations-extras/integrations-extras/gnatsd_streaming/tests/docker/docker-compose.yml: `version` is obsolete"
The above exception was the direct cause of the following exception:
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/flaky/flaky_pytest_plugin.py:146: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
tests/conftest.py:17: in dd_environment
    with docker_run(
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/contextlib.py:137: in __enter__
    return next(self.gen)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/docker.py:220: in docker_run
    with environment_run(
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/contextlib.py:137: in __enter__
    return next(self.gen)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/env.py:110: in environment_run
    result = set_up_func()
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:336: in wrapped_f
    return copy(f, *args, **kw)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:475: in __call__
    do = self.iter(retry_state=retry_state)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:376: in iter
    result = action(retry_state)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:419: in exc_check
    raise retry_exc from fut.exception()
E tenacity.RetryError: RetryError[<Future at 0x7f1a1e91bb00 state=finished raised RetryError>]
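All three gnatsd_streaming errors share one setup failure: the 'test.channel3' log pattern never appears in the compose logs. A minimal sketch of the dd_environment fixture at tests/conftest.py:17, assuming the standard docker_run log-pattern wait — only the docker_run call and the pattern are visible in the output above; the rest of the fixture body is an assumption:

import os
import re

import pytest
from datadog_checks.dev import docker_run

# Hypothetical reconstruction of gnatsd_streaming/tests/conftest.py.
HERE = os.path.dirname(os.path.abspath(__file__))


@pytest.fixture(scope='session')
def dd_environment():
    compose_file = os.path.join(HERE, 'docker', 'docker-compose.yml')
    # docker_run polls `docker compose logs` until every pattern in
    # log_patterns matches, then retries the whole setup; when the pattern
    # never appears it gives up with the nested RetryError shown above.
    with docker_run(
        compose_file,
        log_patterns=[re.compile('test.channel3', re.MULTILINE)],
    ):
        yield

The pattern can never match because the nats_channels container's Go build dies first: the unpinned golang.org/x/crypto fetched by go get now imports crypto/ecdh, which the container's older Go toolchain does not provide (see the "cannot find package \"crypto/ecdh\"" lines above), so the channel producer never publishes anything.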
Check failure on line 0 in gnatsd_streaming.tests.test_gnatsd_streaming
github-actions / Test Results
test_metric_tags (gnatsd_streaming.tests.test_gnatsd_streaming) with error
test-results/Gnatsd Streaming/test-unit-py3.12.xml [took 0s]
Raw output
failed on setup with "tenacity.RetryError: RetryError[<Future at 0x7f1a1e91bb00 state=finished raised RetryError>]"
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:478: in __call__
    result = fn(*args, **kwargs)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/env.py:98: in set_up_with_retry
    return set_up()
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/env.py:82: in set_up
    condition()
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/conditions.py:171: in __call__
    raise RetryError(
E datadog_checks.dev.errors.RetryError: Command: ['docker', 'compose', '-f', '/home/runner/work/integrations-extras/integrations-extras/gnatsd_streaming/tests/docker/docker-compose.yml', 'logs']
E Failed to match `1` of the patterns.
E Provided patterns: - re.compile('test.channel3', re.MULTILINE)
E Missing patterns: - re.compile('test.channel3', re.MULTILINE)
E Exit code: 0
E Captured Output: [identical to the captured output shown under test_metrics above]
The above exception was the direct cause of the following exception:
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/flaky/flaky_pytest_plugin.py:146: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
tests/conftest.py:17: in dd_environment
    with docker_run(
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/contextlib.py:137: in __enter__
    return next(self.gen)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/docker.py:220: in docker_run
    with environment_run(
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/contextlib.py:137: in __enter__
    return next(self.gen)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/env.py:110: in environment_run
    result = set_up_func()
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:336: in wrapped_f
    return copy(f, *args, **kw)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:475: in __call__
    do = self.iter(retry_state=retry_state)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:376: in iter
    result = action(retry_state)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:419: in exc_check
    raise retry_exc from fut.exception()
E tenacity.RetryError: RetryError[<Future at 0x7f1a1e91bb00 state=finished raised RetryError>]
Check failure on line 0 in gnatsd_streaming.tests.test_gnatsd_streaming
github-actions / Test Results
test_failover_event (gnatsd_streaming.tests.test_gnatsd_streaming) with error
test-results/Gnatsd Streaming/test-unit-py3.12.xml [took 0s]
Raw output
failed on setup with "tenacity.RetryError: RetryError[<Future at 0x7f1a1e91bb00 state=finished raised RetryError>]"
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:478: in __call__
result = fn(*args, **kwargs)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/env.py:98: in set_up_with_retry
return set_up()
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/env.py:82: in set_up
condition()
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/conditions.py:171: in __call__
raise RetryError(
E datadog_checks.dev.errors.RetryError: Command: ['docker', 'compose', '-f', '/home/runner/work/integrations-extras/integrations-extras/gnatsd_streaming/tests/docker/docker-compose.yml', 'logs']
E Failed to match `1` of the patterns.
E Provided patterns: - re.compile('test.channel3', re.MULTILINE)
E Missing patterns: - re.compile('test.channel3', re.MULTILINE)
E Exit code: 0
E Captured Output: nats_secondary-1 | [1] 2024/12/19 23:59:32.426157 [INF] STREAM: Starting nats-streaming-server[test-cluster] version 0.25.6
E nats_secondary-1 | [1] 2024/12/19 23:59:32.426191 [INF] STREAM: ServerID: xb5JWq2BnDV5MEVEnXUqh9
E nats_secondary-1 | [1] 2024/12/19 23:59:32.426194 [INF] STREAM: Go version: go1.20.11
E nats_secondary-1 | [1] 2024/12/19 23:59:32.426196 [INF] STREAM: Git commit: [d1a98ca]
E nats_secondary-1 | [1] 2024/12/19 23:59:32.427931 [INF] Starting nats-server
E nats_secondary-1 | [1] 2024/12/19 23:59:32.427937 [INF] Version: 2.9.24
E nats_secondary-1 | [1] 2024/12/19 23:59:32.427939 [INF] Git: [e43cfb4]
E nats_secondary-1 | [1] 2024/12/19 23:59:32.427941 [INF] Cluster: xb5JWq2BnDV5MEVEnXUqjK
E nats_secondary-1 | [1] 2024/12/19 23:59:32.427943 [INF] Name: NCZEFSW6II3TV7U3465PRVSHIMURVS3VLATGYQ2YVACXBOELHWUS4KYH
E nats_secondary-1 | [1] 2024/12/19 23:59:32.427944 [INF] ID: NCZEFSW6II3TV7U3465PRVSHIMURVS3VLATGYQ2YVACXBOELHWUS4KYH
E nats_secondary-1 | [1] 2024/12/19 23:59:32.428381 [INF] Starting http monitor on 0.0.0.0:8223
E nats_secondary-1 | [1] 2024/12/19 23:59:32.428522 [INF] Listening for client connections on 0.0.0.0:4222
E nats_secondary-1 | [1] 2024/12/19 23:59:32.428700 [INF] Server is ready
E nats_secondary-1 | [1] 2024/12/19 23:59:32.428755 [INF] Cluster name is xb5JWq2BnDV5MEVEnXUqjK
E nats_secondary-1 | [1] 2024/12/19 23:59:32.428763 [WRN] Cluster name was dynamically generated, consider setting one
E nats_secondary-1 | [1] 2024/12/19 23:59:32.428793 [INF] Listening for route connections on 0.0.0.0:5222
E nats_secondary-1 | [1] 2024/12/19 23:59:32.430193 [INF] 172.19.0.2:5222 - rid:4 - Route connection created
E nats_secondary-1 | [1] 2024/12/19 23:59:32.430467 [INF] 172.19.0.2:5222 - rid:4 - Router connection closed: Cluster Name Conflict
E nats_secondary-1 | [1] 2024/12/19 23:59:32.455914 [INF] STREAM: Starting in standby mode
E nats_secondary-1 | [1] 2024/12/19 23:59:33.253532 [INF] 172.19.0.2:54444 - rid:10 - Route connection created
E nats_secondary-1 | [1] 2024/12/19 23:59:33.459158 [INF] 172.19.0.2:5222 - rid:11 - Route connection created
E nats_secondary-1 | [1] 2024/12/19 23:59:33.459451 [INF] 172.19.0.2:5222 - rid:11 - Router connection closed: Duplicate Route
E nats_channels-1 | github.com/nats-io/go-nats-streaming (download)
E nats_channels-1 | github.com/nats-io/go-nats (download)
E nats_channels-1 | github.com/nats-io/nkeys (download)
E nats_channels-1 | get "golang.org/x/crypto/curve25519": found meta tag get.metaImport{Prefix:"golang.org/x/crypto", VCS:"git", RepoRoot:"https://go.googlesource.com/crypto"} at //golang.org/x/crypto/curve25519?go-get=1
E nats_channels-1 | get "golang.org/x/crypto/curve25519": verifying non-authoritative meta tag
E nats_channels-1 | golang.org/x/crypto (download)
E nats_channels-1 | package crypto/ecdh: unrecognized import path "crypto/ecdh": import path does not begin with hostname
E nats_channels-1 | get "golang.org/x/crypto/nacl/box": found meta tag get.metaImport{Prefix:"golang.org/x/crypto", VCS:"git", RepoRoot:"https://go.googlesource.com/crypto"} at //golang.org/x/crypto/nacl/box?go-get=1
E nats_channels-1 | get "golang.org/x/crypto/nacl/box": verifying non-authoritative meta tag
E nats_channels-1 | get "golang.org/x/sys/cpu": found meta tag get.metaImport{Prefix:"golang.org/x/sys", VCS:"git", RepoRoot:"https://go.googlesource.com/sys"} at //golang.org/x/sys/cpu?go-get=1
E nats_channels-1 | get "golang.org/x/sys/cpu": verifying non-authoritative meta tag
E nats_channels-1 | golang.org/x/sys (download)
E nats_channels-1 | package golang.org/x/sys/cpu: C source files not allowed when not using cgo or SWIG: cpu_gccgo_x86.c
E nats_channels-1 | github.com/nats-io/nuid (download)
E nats_channels-1 | github.com/gogo/protobuf (download)
E nats_channels-1 | src/golang.org/x/crypto/curve25519/curve25519.go:13:8: cannot find package "crypto/ecdh" in any of:
E nats_channels-1 | /usr/local/go/src/crypto/ecdh (from $GOROOT)
E nats_channels-1 | /go/src/crypto/ecdh (from $GOPATH)
E nats_channels-1 | package command-line-arguments
E nats_channels-1 | imports github.com/nats-io/go-nats
E nats_channels-1 | imports github.com/nats-io/nkeys
E nats_channels-1 | imports golang.org/x/crypto/nacl/box
E nats_channels-1 | imports golang.org/x/crypto/blake2b
E nats_channels-1 | imports golang.org/x/sys/cpu: C source files not allowed when not using cgo or SWIG: cpu_gccgo_x86.c
E nats_channels-1 | src/golang.org/x/crypto/curve25519/curve25519.go:13:8: cannot find package "crypto/ecdh" in any of:
E nats_channels-1 | /usr/local/go/src/crypto/ecdh (from $GOROOT)
E nats_channels-1 | /go/src/crypto/ecdh (from $GOPATH)
E nats_channels-1 | package command-line-arguments
E nats_channels-1 | imports github.com/nats-io/go-nats
E nats_channels-1 | imports github.com/nats-io/nkeys
E nats_channels-1 | imports golang.org/x/crypto/nacl/box
E nats_channels-1 | imports golang.org/x/crypto/blake2b
E nats_channels-1 | imports golang.org/x/sys/cpu: C source files not allowed when not using cgo or SWIG: cpu_gccgo_x86.c
E nats_channels-1 | src/golang.org/x/crypto/curve25519/curve25519.go:13:8: cannot find package "crypto/ecdh" in any of:
E nats_channels-1 | /usr/local/go/src/crypto/ecdh (from $GOROOT)
E nats_channels-1 | /go/src/crypto/ecdh (from $GOPATH)
E nats_channels-1 | package command-line-arguments
E nats_channels-1 | imports github.com/nats-io/go-nats
E nats_channels-1 | imports github.com/nats-io/nkeys
E nats_channels-1 | imports golang.org/x/crypto/nacl/box
E nats_channels-1 | imports golang.org/x/crypto/blake2b
E nats_channels-1 | imports golang.org/x/sys/cpu: C source files not allowed when not using cgo or SWIG: cpu_gccgo_x86.c
E nats_primary-1 | [1] 2024/12/19 23:59:32.231377 [INF] STREAM: Starting nats-streaming-server[test-cluster] version 0.25.6
E nats_primary-1 | [1] 2024/12/19 23:59:32.231408 [INF] STREAM: ServerID: b4I51r9LntRwxNM4g6iVVY
E nats_primary-1 | [1] 2024/12/19 23:59:32.231411 [INF] STREAM: Go version: go1.20.11
E nats_primary-1 | [1] 2024/12/19 23:59:32.231413 [INF] STREAM: Git commit: [d1a98ca]
E nats_primary-1 | [1] 2024/12/19 23:59:32.233505 [INF] Starting nats-server
E nats_primary-1 | [1] 2024/12/19 23:59:32.233512 [INF] Version: 2.9.24
E nats_primary-1 | [1] 2024/12/19 23:59:32.233515 [INF] Git: [e43cfb4]
E nats_primary-1 | [1] 2024/12/19 23:59:32.233518 [INF] Cluster: b4I51r9LntRwxNM4g6iVam
E nats_primary-1 | [1] 2024/12/19 23:59:32.233520 [INF] Name: NDCAIFADDVFAFIDG5Q2REWI7CBE2S6AS6ESXILK2CFKJYDGRKPAYOSFE
E nats_primary-1 | [1] 2024/12/19 23:59:32.233522 [INF] ID: NDCAIFADDVFAFIDG5Q2REWI7CBE2S6AS6ESXILK2CFKJYDGRKPAYOSFE
E nats_primary-1 | [1] 2024/12/19 23:59:32.233929 [INF] Starting http monitor on 0.0.0.0:8222
E nats_primary-1 | [1] 2024/12/19 23:59:32.233993 [INF] Listening for client connections on 0.0.0.0:4222
E nats_primary-1 | [1] 2024/12/19 23:59:32.234147 [INF] Server is ready
E nats_primary-1 | [1] 2024/12/19 23:59:32.234745 [INF] Cluster name is b4I51r9LntRwxNM4g6iVam
E nats_primary-1 | [1] 2024/12/19 23:59:32.234755 [WRN] Cluster name was dynamically generated, consider setting one
E nats_primary-1 | [1] 2024/12/19 23:59:32.234785 [INF] Listening for route connections on 0.0.0.0:5222
E nats_primary-1 | [1] 2024/12/19 23:59:32.252174 [ERR] Error trying to connect to route (attempt 1): lookup for host "nats_secondary": lookup nats_secondary on 127.0.0.11:53: server misbehaving
E nats_primary-1 | [1] 2024/12/19 23:59:32.261745 [INF] STREAM: Starting in standby mode
E nats_primary-1 | [1] 2024/12/19 23:59:32.430128 [INF] 172.19.0.3:38008 - rid:9 - Route connection created
E nats_primary-1 | [1] 2024/12/19 23:59:32.430537 [INF] Cluster name updated to xb5JWq2BnDV5MEVEnXUqjK
E nats_primary-1 | [1] 2024/12/19 23:59:32.430755 [INF] 172.19.0.3:38008 - rid:9 - Router connection closed: Client Closed
E nats_primary-1 | [1] 2024/12/19 23:59:33.253687 [INF] 172.19.0.3:5222 - rid:10 - Route connection created
E nats_primary-1 | [1] 2024/12/19 23:59:33.459196 [INF] 172.19.0.3:38016 - rid:11 - Route connection created
E nats_primary-1 | [1] 2024/12/19 23:59:33.459551 [INF] 172.19.0.3:38016 - rid:11 - Router connection closed: Duplicate Route
E nats_primary-1 | [1] 2024/12/19 23:59:33.590937 [INF] STREAM: Server is active
E nats_primary-1 | [1] 2024/12/19 23:59:33.590965 [INF] STREAM: Recovering the state...
E nats_primary-1 | [1] 2024/12/19 23:59:33.591932 [INF] STREAM: Recovered 0 channel(s)
E nats_primary-1 | [1] 2024/12/19 23:59:33.592228 [INF] STREAM: Message store is FILE
E nats_primary-1 | [1] 2024/12/19 23:59:33.592239 [INF] STREAM: Store location: /usr/share/nats/data
E nats_primary-1 | [1] 2024/12/19 23:59:33.592303 [INF] STREAM: ---------- Store Limits ----------
E nats_primary-1 | [1] 2024/12/19 23:59:33.592655 [INF] STREAM: Channels: 100 *
E nats_primary-1 | [1] 2024/12/19 23:59:33.592698 [INF] STREAM: --------- Channels Limits --------
E nats_primary-1 | [1] 2024/12/19 23:59:33.592718 [INF] STREAM: Subscriptions: 1000 *
E nats_primary-1 | [1] 2024/12/19 23:59:33.592726 [INF] STREAM: Messages : 1000000 *
E nats_primary-1 | [1] 2024/12/19 23:59:33.592729 [INF] STREAM: Bytes : 976.56 MB *
E nats_primary-1 | [1] 2024/12/19 23:59:33.592732 [INF] STREAM: Age : unlimited *
E nats_primary-1 | [1] 2024/12/19 23:59:33.592734 [INF] STREAM: Inactivity : unlimited *
E nats_primary-1 | [1] 2024/12/19 23:59:33.592736 [INF] STREAM: ----------------------------------
E nats_primary-1 | [1] 2024/12/19 23:59:33.592739 [INF] STREAM: Streaming Server is ready
E time="2024-12-20T00:00:35Z" level=warning msg="/home/runner/work/integrations-extras/integrations-extras/gnatsd_streaming/tests/docker/docker-compose.yml: `version` is obsolete"
The above exception was the direct cause of the following exception:
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/flaky/flaky_pytest_plugin.py:146: in <lambda>
lambda: ihook(item=item, **kwds), when=when, reraise=reraise
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/flaky/flaky_pytest_plugin.py:146: in <lambda>
lambda: ihook(item=item, **kwds), when=when, reraise=reraise
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/flaky/flaky_pytest_plugin.py:146: in <lambda>
lambda: ihook(item=item, **kwds), when=when, reraise=reraise
tests/conftest.py:17: in dd_environment
with docker_run(
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/contextlib.py:137: in __enter__
return next(self.gen)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/docker.py:220: in docker_run
with environment_run(
/opt/hostedtoolcache/Python/3.12.8/x64/lib/python3.12/contextlib.py:137: in __enter__
return next(self.gen)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/datadog_checks/dev/env.py:110: in environment_run
result = set_up_func()
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:336: in wrapped_f
return copy(f, *args, **kw)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:475: in __call__
do = self.iter(retry_state=retry_state)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:376: in iter
result = action(retry_state)
../../../../.local/share/hatch/env/virtual/datadog-gnatsd-streaming/bsJWGjNW/py3.12/lib/python3.12/site-packages/tenacity/__init__.py:419: in exc_check
raise retry_exc from fut.exception()
E tenacity.RetryError: RetryError[<Future at 0x7f1a1e91bb00 state=finished raised RetryError>]
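Note: the setup condition that times out here is a log-pattern wait. The harness repeatedly runs `docker compose ... logs` and searches the combined output for test.channel3 (the actual condition lives in datadog_checks/dev/conditions.py, shown in the traceback). A stand-alone sketch of that check using only the standard library, with the compose file path and pattern taken verbatim from the error above:

```python
# Stand-alone reproduction of the failing readiness check (stdlib only).
import re
import subprocess

COMPOSE_FILE = (
    "/home/runner/work/integrations-extras/integrations-extras/"
    "gnatsd_streaming/tests/docker/docker-compose.yml"
)
PATTERN = re.compile("test.channel3", re.MULTILINE)

logs = subprocess.run(
    ["docker", "compose", "-f", COMPOSE_FILE, "logs"],
    capture_output=True,
    text=True,
    check=True,  # the command itself exits 0; only the pattern match fails
).stdout

if PATTERN.search(logs) is None:
    raise RuntimeError("Missing pattern: test.channel3")
```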
Check failure on line 0 in gnatsd_streaming.tests.test_gnatsd_streaming
github-actions / Test Results
test_deltas (gnatsd_streaming.tests.test_gnatsd_streaming) with error
test-results/Gnatsd Streaming/test-unit-py3.12.xml [took 0s]
Raw output
failed on setup with "tenacity.RetryError: RetryError[<Future at 0x7f1a1e91bb00 state=finished raised RetryError>]"
[Setup traceback and captured compose output identical to the test_failover_event failure above: the same datadog_checks.dev.errors.RetryError for the missing re.compile('test.channel3', re.MULTILINE) pattern, the same nats_channels build failure in the captured logs, and the same chained tenacity.RetryError, with one additional flaky-plugin retry frame (four <lambda> frames instead of three).]
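Note: the missing test.channel3 pattern is a downstream symptom. The captured output shows the nats_channels helper container, which is supposed to create the channels, failing to build: its GOPATH-mode `go get` fetches the latest golang.org/x/crypto, whose curve25519 package now imports crypto/ecdh, a standard-library package that exists only in Go 1.20 and later, while the builder image ships an older toolchain. Pinning the x/crypto revision or moving the helper to a module-aware Go >= 1.20 image should make setup deterministic. A hypothetical triage helper (illustrative only, not part of the harness) that flags this failure mode from captured logs:

```python
# Hypothetical triage helper: scan captured compose logs for the build errors
# that keep the channel-producing container from ever starting.
import re

BUILD_FAILURE_MARKERS = [
    re.compile(r'cannot find package "crypto/ecdh"'),
    re.compile(r"C source files not allowed when not using cgo or SWIG"),
]

def diagnose_channel_setup(captured_logs: str) -> str | None:
    """Return a root-cause hint if the nats_channels build failure is present."""
    for marker in BUILD_FAILURE_MARKERS:
        if marker.search(captured_logs):
            return (
                "nats_channels failed to build: unpinned GOPATH-mode 'go get' "
                "fetched a golang.org/x/crypto revision requiring Go >= 1.20"
            )
    return None
```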
Check warning on line 0 in vespa.tests.test_vespa
github-actions / Test Results
test_check_metrics (vespa.tests.test_vespa) failed
test-results/Vespa/test-unit-py3.12.xml [took 0s]
Raw output
AssertionError: Needed exactly 7 candidates for 'vespa.process_health', got 1
Expected:
ServiceCheckStub(check_id=None, name='vespa.process_health', status=0, tags=None, hostname=None, message=None)
Similar submitted:
Score Most similar
1.00 ServiceCheckStub(check_id='', name='vespa.process_health', status=0, tags=['instance:logd', 'metrictype:system', 'tag1:val1', 'vespa-service:vespa.logd', 'vespaVersion:7.0.0'], hostname='', message='')
0.75 ServiceCheckStub(check_id='', name='vespa.process_health', status=1, tags=['tag1:val1'], hostname='', message="Problem getting metrics from Vespa's node metrics api")
0.60 ServiceCheckStub(check_id='', name='vespa.metrics_health', status=1, tags=['tag1:val1'], hostname='', message='Unexpected error: Metric vespa.process_health was submitted with a forbidden tag: clustername. Please rename this tag, or skip the tag validation with DDEV_SKIP_GENERIC_TAGS_CHECK environment variable.')
assert False
tests/test_vespa.py:97: in test_check_metrics
aggregator.assert_service_check(check.PROCESS_SERVICE_CHECK, VespaCheck.OK, count=7)
../../../../.local/share/hatch/env/virtual/datadog-vespa/LDWuwpyc/py3.12/lib/python3.12/site-packages/datadog_checks/base/stubs/aggregator.py:403: in assert_service_check
self._assert(
../../../../.local/share/hatch/env/virtual/datadog-vespa/LDWuwpyc/py3.12/lib/python3.12/site-packages/datadog_checks/base/stubs/aggregator.py:412: in _assert
assert condition, new_msg
E AssertionError: Needed exactly 7 candidates for 'vespa.process_health', got 1
E Expected:
E ServiceCheckStub(check_id=None, name='vespa.process_health', status=0, tags=None, hostname=None, message=None)
E Similar submitted:
E Score Most similar
E 1.00 ServiceCheckStub(check_id='', name='vespa.process_health', status=0, tags=['instance:logd', 'metrictype:system', 'tag1:val1', 'vespa-service:vespa.logd', 'vespaVersion:7.0.0'], hostname='', message='')
E 0.75 ServiceCheckStub(check_id='', name='vespa.process_health', status=1, tags=['tag1:val1'], hostname='', message="Problem getting metrics from Vespa's node metrics api")
E 0.60 ServiceCheckStub(check_id='', name='vespa.metrics_health', status=1, tags=['tag1:val1'], hostname='', message='Unexpected error: Metric vespa.process_health was submitted with a forbidden tag: clustername. Please rename this tag, or skip the tag validation with DDEV_SKIP_GENERIC_TAGS_CHECK environment variable.')
E assert False
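Note: with count=7, the stub aggregator requires exactly seven matching service-check submissions; only one vespa.process_health OK was recorded because the run aborted on the forbidden generic tag clustername (see the 0.60 candidate's message above). The failing assertion from the traceback, reconstructed as a sketch; `check` and `instance` are hypothetical stand-ins for the suite's fixtures:

```python
# Sketch of the assertion contract seen in the traceback above. `aggregator`
# is the datadog_checks stub fixture; the check invocation is assumed.
from datadog_checks.vespa.vespa import VespaCheck

def test_check_metrics(aggregator, check, instance):
    check.check(instance)  # hypothetical run of the check under test
    # count=7 means "exactly seven matching service checks were submitted";
    # in this run only one OK arrived, so the assertion fails with "got 1".
    aggregator.assert_service_check(check.PROCESS_SERVICE_CHECK, VespaCheck.OK, count=7)
```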
Check warning on line 0 in vespa.tests.test_vespa
github-actions / Test Results
test_check_counters_are_reset_between_check_calls (vespa.tests.test_vespa) failed
test-results/Vespa/test-unit-py3.12.xml [took 0s]
Raw output
assert 38 == 3
+ where 3 = <datadog_checks.vespa.vespa.VespaCheck object at 0x7f8981d6c4d0>.metric_count
tests/test_vespa.py:125: in test_check_counters_are_reset_between_check_calls
assert_number_of_metrics_and_services(check)
tests/test_vespa.py:129: in assert_number_of_metrics_and_services
assert 38 == check.metric_count
E assert 38 == 3
E + where 3 = <datadog_checks.vespa.vespa.VespaCheck object at 0x7f8981d6c4d0>.metric_count
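Note: the same forbidden-tag abort appears to explain this failure too: submission stops after 3 of the expected 38 metrics. The error text in the previous failure names an escape hatch; a hypothetical conftest.py fixture sketch that applies it for a local run (the durable fix is renaming the generic clustername tag in the check itself):

```python
# Hypothetical local unblock, not the suite's actual fixture: skip the
# generic-tag validation so the check can submit all of its metrics.
import pytest

@pytest.fixture(autouse=True)
def skip_generic_tag_check(monkeypatch):
    # Variable name is verbatim from the failure message; the value is an assumption.
    monkeypatch.setenv("DDEV_SKIP_GENERIC_TAGS_CHECK", "true")
```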
Check warning on line 0 in gatekeeper.tests.test_e2e
github-actions / Test Results
1 out of 2 runs failed: test_check_ok (gatekeeper.tests.test_e2e)
test-results/Gatekeeper/test-e2e-py3.12.xml [took 1s]
Raw output
ValueError: [s6-init] making user provided files available at /var/run/s6/etc...exited 0.
[s6-init] ensuring user provided files have correct perms...exited 0.
[fix-attrs.d] applying ownership & permissions fixes...
[fix-attrs.d] done.
[cont-init.d] executing container initialization scripts...
[cont-init.d] 01-check-apikey.sh: executing...
[cont-init.d] 01-check-apikey.sh: exited 0.
[cont-init.d] 50-ci.sh: executing...
[cont-init.d] 50-ci.sh: exited 0.
[cont-init.d] 50-ecs.sh: executing...
[cont-init.d] 50-ecs.sh: exited 0.
[cont-init.d] 50-eks.sh: executing...
[cont-init.d] 50-eks.sh: exited 0.
[cont-init.d] 50-kubernetes.sh: executing...
[cont-init.d] 50-kubernetes.sh: exited 0.
[cont-init.d] 50-mesos.sh: executing...
[cont-init.d] 50-mesos.sh: exited 0.
[cont-init.d] 51-docker.sh: executing...
[cont-init.d] 51-docker.sh: exited 0.
[cont-init.d] 59-defaults.sh: executing...
[cont-init.d] 59-defaults.sh: exited 0.
[cont-init.d] 60-network-check.sh: executing...
[cont-init.d] 60-network-check.sh: exited 0.
[cont-init.d] 60-sysprobe-check.sh: executing...
[cont-init.d] 60-sysprobe-check.sh: exited 0.
[cont-init.d] 89-copy-customfiles.sh: executing...
[cont-init.d] 89-copy-customfiles.sh: exited 0.
[cont-init.d] done.
[services.d] starting services
starting security-agent
starting trace-agent
starting process-agent
starting agent
starting system-probe
[services.d] done.
2024-12-20 00:05:03 UTC | SECURITY | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-20 00:05:03 UTC | SECURITY | INFO | (pkg/util/log/log.go:840 in func1) | Loading proxy settings
2024-12-20 00:05:03 UTC | SECURITY | INFO | (pkg/util/log/log.go:840 in func1) | Starting to resolve secrets
2024-12-20 00:05:03 UTC | SECURITY | WARN | (pkg/util/log/log.go:885 in func1) | Agent configuration relax permissions constraint on the secret backend cmd, Group can read and exec
2024-12-20 00:05:03 UTC | SECURITY | INFO | (pkg/util/log/log.go:840 in func1) | Finished resolving secrets
2024-12-20 00:05:03 UTC | SECURITY | INFO | (pkg/util/log/log.go:845 in func1) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-20 00:05:03 UTC | SECURITY | INFO | (pkg/util/log/log.go:845 in func1) | 0 Features detected from environment:
2024-12-20 00:05:03 UTC | SECURITY | INFO | (pkg/config/setup/config.go:2051 in LoadCustom) | Starting to load the configuration
2024-12-20 00:05:03 UTC | TRACE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-20 00:05:03 UTC | TRACE | INFO | (pkg/util/log/log.go:840 in func1) | Loading proxy settings
2024-12-20 00:05:03 UTC | TRACE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to resolve secrets
2024-12-20 00:05:03 UTC | TRACE | WARN | (pkg/util/log/log.go:885 in func1) | Agent configuration relax permissions constraint on the secret backend cmd, Group can read and exec
2024-12-20 00:05:03 UTC | TRACE | INFO | (pkg/util/log/log.go:840 in func1) | Finished resolving secrets
2024-12-20 00:05:03 UTC | TRACE | INFO | (pkg/util/log/log.go:845 in func1) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-20 00:05:03 UTC | TRACE | INFO | (pkg/util/log/log.go:845 in func1) | 0 Features detected from environment:
2024-12-20 00:05:03 UTC | TRACE | INFO | (comp/trace/config/setup.go:75 in LoadConfigFile) | Loaded configuration: /etc/datadog-agent/datadog.yaml
2024-12-20 00:05:03 UTC | TRACE | INFO | (comp/trace/config/setup.go:353 in applyDatadogConfig) | Activating non-local traffic automatically in containerized environment, trace-agent will listen on 0.0.0.0
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/util/log/log.go:840 in func1) | Loading proxy settings
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to resolve secrets
2024-12-20 00:05:03 UTC | CORE | WARN | (pkg/util/log/log.go:885 in func1) | Agent configuration relax permissions constraint on the secret backend cmd, Group can read and exec
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/util/log/log.go:840 in func1) | Finished resolving secrets
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/util/log/log.go:845 in func1) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/util/log/log.go:845 in func1) | 0 Features detected from environment:
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/runtime/runtime.go:28 in func1) | runtime: set GOMAXPROCS to: 4
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/process/metadata/workloadmeta/extractor.go:84 in NewWorkloadMetaExtractor) | Instantiating a new WorkloadMetaExtractor
2024-12-20 00:05:03 UTC | CORE | INFO | (comp/core/tagger/impl/tagger.go:124 in NewComponent) | TaggerClient is created, defaultTagger type: *taggerimpl.localTagger
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/util/containers/metrics/system/collector_linux.go:87 in newSystemCollector) | Unable to initialize cgroup provider (cgroups not mounted?), err: unable to detect cgroup version from detected mount points: map[]
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/config/utils/endpoints.go:35 in getResolvedDDUrl) | 'site' and 'dd_url' are both set in config: setting main endpoint to 'dd_url': "https://app.datadoghq.com"
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/config/utils/endpoints.go:35 in getResolvedDDUrl) | 'site' and 'dd_url' are both set in config: setting main endpoint to 'dd_url': "https://app.datadoghq.com"
2024-12-20 00:05:03 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:05:03 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:232 in getDogStatsDWorkerAndPipelineCount) | Dogstatsd workers and pipelines count: 2 workers, 1 pipelines
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:150 in GetDogStatsDWorkerAndPipelineCount) | Dogstatsd configured to run with 2 workers and 1 pipelines
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/aggregator/time_sampler.go:54 in NewTimeSampler) | Creating TimeSampler #0
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/config/setup/config.go:2051 in LoadCustom) | Starting to load the configuration
2024-12-20 00:05:03 UTC | TRACE | INFO | (comp/trace/agent/impl/agent.go:107 in NewAgent) | trace-agent not enabled. Set the environment variable
DD_APM_ENABLED=true or add "apm_config.enabled: true" entry
to your datadog.yaml. Exiting...
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/api/security/security.go:142 in fetchAuthToken) | [/omnibus/src/datadog-agent/src/github.com/DataDog/datadog-agent/pkg/api/util/util.go:107] Creating a new authentication token
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/api/security/security.go:241 in saveAuthToken) | Saving a new authentication token in /etc/datadog-agent/auth_token
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/api/security/security.go:256 in saveAuthToken) | Wrote auth token in /etc/datadog-agent/auth_token
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/api/security/security.go:155 in fetchAuthToken) | Saved a new authentication token to /etc/datadog-agent/auth_token
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/api/security/cert/cert_getter.go:59 in fetchAgentIPCCert) | [/omnibus/src/datadog-agent/src/github.com/DataDog/datadog-agent/pkg/api/util/util.go:111] Creating a new IPC certificate
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/api/security/cert/cert_getter.go:107 in saveIPCCertKey) | Saving a new IPC certificate/key pair in /etc/datadog-agent/ipc_cert.pem
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/api/security/cert/cert_getter.go:126 in saveIPCCertKey) | Wrote IPC certificate/key pair in /etc/datadog-agent/ipc_cert.pem
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/api/security/cert/cert_getter.go:73 in fetchAgentIPCCert) | Saved a new IPC certificate/key pair to /etc/datadog-agent/ipc_cert.pem
2024-12-20 00:05:03 UTC | CORE | INFO | (comp/logs/agent/agentimpl/agent.go:176 in newLogsAgent) | logs-agent disabled
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (pkg/config/setup/config.go:2051 in LoadCustom) | Starting to load the configuration
2024-12-20 00:05:03 UTC | CORE | INFO | (comp/rdnsquerier/impl/rdnsquerier.go:83 in NewComponent) | Reverse DNS Enrichment config: (enabled=false workers=0 chan_size=0 cache.enabled=true cache.entry_ttl=0 cache.clean_interval=0 cache.persist_interval=0 cache.max_retries=-1 cache.max_size=0 rate_limiter.enabled=true rate_limiter.limit_per_sec=0 rate_limiter.limit_throttled_per_sec=0 rate_limiter.throttle_error_threshold=0 rate_limiter.recovery_intervals=0 rate_limiter.recovery_interval=0)
2024-12-20 00:05:03 UTC | CORE | INFO | (comp/netflow/server/server.go:63 in newServer) | Reverse DNS Enrichment is disabled for NDM NetFlow
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/collector/python/init.go:337 in resolvePythonExecPath) | Using '/opt/datadog-agent/embedded' as Python home
2024-12-20 00:05:03 UTC | CORE | INFO | (pkg/collector/python/init.go:403 in Initialize) | Initializing rtloader with Python 3 /opt/datadog-agent/embedded
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (pkg/config/setup/config.go:1709 in LoadProxyFromEnv) | Loading proxy settings
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (pkg/config/env/environment_containers.go:242 in detectPodResources) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (pkg/config/env/environment_detection.go:124 in DetectFeatures) | 0 Features detected from environment:
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/impl/store.go:100 in start) | workloadmeta store initialized successfully
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (pkg/config/remote/client/client.go:400 in pollLoop) | retrying the first update of remote-config state (rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp [::1]:45371: connect: connection refused")
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:127 in Start) | remote workloadmeta initialized successfully
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/impl/store.go:554 in startCandidates) | workloadmeta collector "remote-workloadmeta" started successfully
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:199 in Start) | remote tagger initialized successfully
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:171 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :45371: connect: connection refused"
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:572 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :45371: connect: connection refused"
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (pkg/runtime/runtime.go:28 in func1) | runtime: set GOMAXPROCS to: 4
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (cmd/system-probe/subcommands/run/command.go:327 in startSystemProbe) | starting system-probe v7.62.0-devel+git.480.f5d99e9
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (cmd/system-probe/subcommands/run/command.go:439 in logUserAndGroupID) | current user id/name: 0/root
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (cmd/system-probe/subcommands/run/command.go:442 in logUserAndGroupID) | current group id/name: 0/root
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (cmd/system-probe/subcommands/run/command.go:332 in startSystemProbe) | system probe not enabled. exiting
2024-12-20 00:05:03 UTC | TRACE | INFO | (comp/core/tagger/impl-remote/remote.go:119 in func1) | remote tagger initialized successfully
2024-12-20 00:05:03 UTC | TRACE | INFO | ([email protected]+incompatible/retry.go:37 in RetryNotify) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :45371: connect: connection refused"
2024-12-20 00:05:03 UTC | TRACE | INFO | (comp/core/tagger/impl-remote/remote.go:122 in func2) | remote tagger stopped successfully
trace-agent exited with code 0, disabling
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:845 in func1) | running on platform: ubuntu
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:845 in func1) | running version: 7.62.0-devel
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:840 in func1) | Starting to load the configuration
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:840 in func1) | Loading proxy settings
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:840 in func1) | Starting to resolve secrets
2024-12-20 00:05:03 UTC | PROCESS | WARN | (pkg/util/log/log.go:885 in func1) | Agent configuration relax permissions constraint on the secret backend cmd, Group can read and exec
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:840 in func1) | Finished resolving secrets
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:845 in func1) | Agent did not find PodResources socket at unix:///var/lib/kubelet/pod-resources/kubelet.sock
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/util/log/log.go:845 in func1) | 0 Features detected from environment:
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/config/setup/config.go:2051 in LoadCustom) | Starting to load the configuration
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/rdnsquerier/impl/rdnsquerier.go:83 in NewComponent) | Reverse DNS Enrichment config: (enabled=false workers=0 chan_size=0 cache.enabled=true cache.entry_ttl=0 cache.clean_interval=0 cache.persist_interval=0 cache.max_retries=-1 cache.max_size=0 rate_limiter.enabled=true rate_limiter.limit_per_sec=0 rate_limiter.limit_throttled_per_sec=0 rate_limiter.throttle_error_threshold=0 rate_limiter.recovery_intervals=0 rate_limiter.recovery_interval=0)
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/core/tagger/impl/tagger.go:124 in NewComponent) | TaggerClient is created, defaultTagger type: *taggerimpl.localTagger
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/util/containers/metrics/system/collector_linux.go:87 in newSystemCollector) | Unable to initialize cgroup provider (cgroups not mounted?), err: unable to detect cgroup version from detected mount points: map[]
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/process/metadata/workloadmeta/extractor.go:84 in NewWorkloadMetaExtractor) | Instantiating a new WorkloadMetaExtractor
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:05:03 UTC | PROCESS | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/process/agent/agent_linux.go:61 in enabledHelper) | Process/Container Collection in the Process Agent will be deprecated in a future release and will instead be run in the Core Agent. Set process_config.run_in_core_agent.enabled to true to switch now.
2024-12-20 00:05:03 UTC | PROCESS | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-20 00:05:03 UTC | PROCESS | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/process/apiserver/apiserver.go:53 in newApiServer) | API server listening on localhost:6162
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:100 in start) | workloadmeta store initialized successfully
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process-events.datadoghq.com" (1 api key(s))
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/process/runner/submitter.go:182 in printStartMessage) | Starting CheckSubmitter for host=fv-az1117-366, endpoints=[https://process.datadoghq.com], events endpoints=[https://process-events.datadoghq.com]
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/process/runner/runner.go:286 in Run) | Starting process-agent with enabled checks=[process_discovery]
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "ecs_fargate" could not start. error: component workloadmeta-ecs_fargate is disabled: Agent is not running on ECS Fargate
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "kubelet" could not start. error: component workloadmeta-kubelet is disabled: Agent is not running on Kubernetes
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "kube_metadata" could not start. error: component workloadmeta-kube_metadata is disabled: Agent is not running on Kubernetes
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "cloudfoundry-vm" could not start. error: component workloadmeta-cloudfoundry-vm is disabled: Agent is not running on CloudFoundry
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "podman" could not start. error: component workloadmeta-podman is disabled: Podman not detected
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "containerd" could not start. error: component workloadmeta-containerd is disabled: Agent is not running on containerd
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "ecs" could not start. error: component workloadmeta-ecs is disabled: Agent is not running on ECS EC2
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "crio" could not start. error: component workloadmeta-crio is disabled: Crio not detected
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "cloudfoundry-container" could not start. error: component workloadmeta-cloudfoundry-container is disabled: Agent is not running on CloudFoundry
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "docker" could not start. error: component workloadmeta-docker is disabled: Agent is not running on Docker
2024-12-20 00:05:03 UTC | PROCESS | INFO | (comp/core/tagger/collectors/workloadmeta_main.go:153 in stream) | workloadmeta tagger collector started
2024-12-20 00:05:03 UTC | PROCESS | INFO | (pkg/process/runner/runner.go:243 in logCheckDuration) | Finished process_discovery check #1 in 25.204577ms
2024-12-20 00:05:03 UTC | PROCESS | ERROR | (comp/forwarder/defaultforwarder/transaction/transaction.go:433 in internalProcess) | API Key invalid, dropping transaction for https://process.datadoghq.com/api/v1/discovery
2024-12-20 00:05:03 UTC | PROCESS | ERROR | (pkg/process/runner/runner.go:501 in readResponseStatuses) | [process_discovery] Invalid response from https://process.datadoghq.com: 403 -> <nil>
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:171 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :45371: connect: connection refused"
2024-12-20 00:05:03 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:572 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :45371: connect: connection refused"
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/python/datadog_agent.go:148 in LogMessage) | - | (ddyaml.py:142) | monkey patching yaml.load...
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/python/datadog_agent.go:148 in LogMessage) | - | (ddyaml.py:146) | monkey patching yaml.load_all...
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/python/datadog_agent.go:148 in LogMessage) | - | (ddyaml.py:150) | monkey patching yaml.dump_all... (affects all yaml dump operations)
2024-12-20 00:05:04 UTC | SYS-PROBE | INFO | (pkg/config/remote/client/client.go:434 in pollLoop) | retrying the first update of remote-config state (rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp [::1]:45371: connect: connection refused")
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/embed_python.go:22 in InitPython) | Embedding Python 3.12.6 (main, Dec 19 2024, 08:06:36) [GCC 11.4.0]
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:264 in NewDefaultForwarder) | Retry queue storage on disk is disabled
2024-12-20 00:05:04 UTC | CORE | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-20 00:05:04 UTC | CORE | WARN | (pkg/process/checks/checks.go:140 in canEnableContainerChecks) | Disabled container checks because no container environment detected (see list of detected features in `agent status`)
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/config/remote/service/util.go:110 in func1) | Missing meta bucket
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/config/remote/service/util.go:131 in openCacheDB) | Different agent version or API Key detected
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/config/remote/service/util.go:48 in recreate) | Clear remote configuration database
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/gui/guiimpl/gui.go:109 in newGui) | GUI server port -1 specified: not starting the GUI.
2024-12-20 00:05:04 UTC | CORE | WARN | (pkg/config/model/viper.go:264 in checkKnownKey) | config key runtime_security_config.sbom.enabled is unknown
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:100 in start) | workloadmeta store initialized successfully
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/autodiscovery/listeners/types.go:84 in Register) | Service listener factory cloudfoundry_bbs does not exist.
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/autodiscovery/listeners/types.go:84 in Register) | Service listener factory kube_endpoints does not exist.
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/autodiscovery/listeners/types.go:84 in Register) | Service listener factory kube_services does not exist.
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/autodiscovery/providers/config_reader.go:172 in read) | Searching for configuration files at: /etc/datadog-agent/conf.d
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "process-collector" could not start. error: collector process-collector is not enabled
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "kubelet" could not start. error: component workloadmeta-kubelet is disabled: Agent is not running on Kubernetes
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "cloudfoundry-container" could not start. error: component workloadmeta-cloudfoundry-container is disabled: Agent is not running on CloudFoundry
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "podman" could not start. error: component workloadmeta-podman is disabled: Podman not detected
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "crio" could not start. error: component workloadmeta-crio is disabled: Crio not detected
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "docker" could not start. error: component workloadmeta-docker is disabled: Agent is not running on Docker
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "local-process-collector" could not start. error: component workloadmeta-process is disabled: language detection or core agent process collection is disabled
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "containerd" could not start. error: component workloadmeta-containerd is disabled: Agent is not running on containerd
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "kube_metadata" could not start. error: component workloadmeta-kube_metadata is disabled: Agent is not running on Kubernetes
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "ecs" could not start. error: component workloadmeta-ecs is disabled: Agent is not running on ECS EC2
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "cloudfoundry-vm" could not start. error: component workloadmeta-cloudfoundry-vm is disabled: Agent is not running on CloudFoundry
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/workloadmeta/impl/store.go:557 in startCandidates) | workloadmeta collector "ecs_fargate" could not start. error: component workloadmeta-ecs_fargate is disabled: Agent is not running on ECS Fargate
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/tagger/collectors/workloadmeta_main.go:153 in stream) | workloadmeta tagger collector started
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/autodiscovery/providers/config_reader.go:172 in read) | Searching for configuration files at: /opt/datadog-agent/bin/agent/dist/conf.d
2024-12-20 00:05:04 UTC | CORE | WARN | (comp/core/autodiscovery/providers/config_reader.go:176 in read) | Skipping, open /opt/datadog-agent/bin/agent/dist/conf.d: no such file or directory
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/autodiscovery/providers/config_reader.go:172 in read) | Searching for configuration files at:
2024-12-20 00:05:04 UTC | CORE | WARN | (comp/core/autodiscovery/providers/config_reader.go:176 in read) | Skipping, open : no such file or directory
2024-12-20 00:05:04 UTC | CORE | ERROR | (pkg/config/autodiscovery/autodiscovery.go:81 in DiscoverComponentsFromConfig) | Error unmarshalling snmp listener config. Error: no config given for snmp_listener
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/autodiscovery/autodiscoveryimpl/autoconfig.go:529 in initListenerCandidates) | environment listener successfully started
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/autodiscovery/autodiscoveryimpl/autoconfig.go:529 in initListenerCandidates) | static config listener successfully started
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://7-62-0-app.agent.datadoghq.com" (1 api key(s))
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://orchestrator.datadoghq.com" (1 api key(s))
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/runner/runner.go:100 in ensureMinWorkers) | Runner 1 added 4 workers (total: 4)
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process-events.datadoghq.com" (1 api key(s))
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/default_forwarder.go:419 in Start) | Forwarder started, sending to 1 endpoint(s) with 1 worker(s) each: "https://process.datadoghq.com" (1 api key(s))
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/dogstatsd/listeners/uds_datagram.go:65 in NewUDSDatagramListener) | dogstatsd-uds: /var/run/datadog/dsd.socket successfully initialized
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:232 in getDogStatsDWorkerAndPipelineCount) | Dogstatsd workers and pipelines count: 2 workers, 1 pipelines
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:150 in GetDogStatsDWorkerAndPipelineCount) | Dogstatsd configured to run with 2 workers and 1 pipelines
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:232 in getDogStatsDWorkerAndPipelineCount) | Dogstatsd workers and pipelines count: 2 workers, 1 pipelines
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:150 in GetDogStatsDWorkerAndPipelineCount) | Dogstatsd configured to run with 2 workers and 1 pipelines
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:232 in getDogStatsDWorkerAndPipelineCount) | Dogstatsd workers and pipelines count: 2 workers, 1 pipelines
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/aggregator/demultiplexer.go:150 in GetDogStatsDWorkerAndPipelineCount) | Dogstatsd configured to run with 2 workers and 1 pipelines
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/remote-config/rcservice/rcserviceimpl/rcservice.go:115 in func1) | remote config service started
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/agenttelemetry/impl/agenttelemetry.go:521 in start) | Starting agent telemetry for 2 schedules and 4 profiles
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/dogstatsd/listeners/uds_datagram.go:79 in listen) | dogstatsd-uds: starting to listen on /var/run/datadog/dsd.socket
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/dogstatsd/listeners/udp.go:128 in listen) | dogstatsd-udp: starting to listen on 127.0.0.1:8125
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/config/remote/client/client.go:400 in pollLoop) | retrying the first update of remote-config state (rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp [::1]:45371: connect: connection refused")
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/api/api/apiimpl/server.go:31 in startServer) | Started HTTP server 'CMD API Server' on 127.0.0.1:45371
2024-12-20 00:05:04 UTC | CORE | INFO | (cmd/agent/subcommands/run/command.go:512 in startAgent) | Starting Datadog Agent v7.62.0-devel+git.480.f5d99e9
2024-12-20 00:05:04 UTC | CORE | INFO | (cmd/agent/subcommands/run/command.go:538 in startAgent) | Hostname is: fv-az1117-366
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/util/installinfo/install_info.go:94 in logVersionHistoryToFile) | Cannot read file: /opt/datadog-agent/run/version-history.json, will create a new one. open /opt/datadog-agent/run/version-history.json: no such file or directory
2024-12-20 00:05:04 UTC | CORE | WARN | (pkg/collector/python/check_context.go:54 in initializeCheckContext) | Log receiver not provided. Logs from integrations will not be collected.
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/autodiscovery/autodiscoveryimpl/config_poller.go:170 in collectOnce) | file provider: collected 65 new configurations, removed 0
2024-12-20 00:05:04 UTC | CORE | ERROR | (pkg/collector/scheduler.go:213 in getChecks) | Unable to load a check from instance of config 'gatekeeper': Python Check Loader: unable to import module 'gatekeeper': No module named 'gatekeeper'; Core Check Loader: Check gatekeeper not found in Catalog
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/core/autodiscovery/autodiscoveryimpl/autoconfig.go:409 in LoadAndRun) | Started config provider "file"
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check container_image with an interval of 15s
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check container_lifecycle with an interval of 15s
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check cpu with an interval of 15s
2024-12-20 00:05:04 UTC | CORE | WARN | (comp/forwarder/defaultforwarder/forwarder_health.go:297 in checkValidAPIKey) | api_key '***************************aaaaa' for domain https://api.datadoghq.com is invalid
2024-12-20 00:05:04 UTC | CORE | ERROR | (comp/forwarder/defaultforwarder/forwarder_health.go:148 in healthCheckLoop) | No valid api key found, reporting the forwarder as unhealthy.
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check disk:67cc0574430a16ba with an interval of 15s
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check file_handle with an interval of 15s
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check io with an interval of 15s
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check load with an interval of 15s
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check memory with an interval of 15s
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check network:4b0649b7e11f0772 with an interval of 15s
2024-12-20 00:05:04 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:171 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :45371: connect: connection refused"
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/util/cloudproviders/cloudproviders.go:89 in GetCloudProviderNTPHosts) | Detected Azure cloud provider environment with NTP server(s) at ["time.windows.com"]
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/corechecks/net/ntp/ntp.go:131 in parse) | Using NTP servers: [ time.windows.com ]
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check ntp:3c427a42a70bbf8 with an interval of 15m0s
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check service_discovery with an interval of 1m0s
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check telemetry with an interval of 15s
2024-12-20 00:05:04 UTC | CORE | INFO | (pkg/collector/scheduler/scheduler.go:93 in Enter) | Scheduling check uptime with an interval of 15s
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/forwarder/defaultforwarder/transaction/transaction.go:454 in internalProcess) | Successfully posted payload to "https://7-62-0-app.agent.datadoghq.com/intake/" (202 Accepted), the agent will only log transaction success every 500 transactions
2024-12-20 00:05:04 UTC | SYS-PROBE | INFO | (comp/core/tagger/impl-remote/remote.go:572 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :45371: connect: connection refused"
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/metadata/host/hostimpl/hosttags/tags.go:165 in Get) | Unable to get host tags from source: gce - using cached host tags
2024-12-20 00:05:04 UTC | CORE | INFO | (comp/metadata/host/hostimpl/utils/host.go:105 in getNetworkMeta) | could not get network metadata: could not detect network ID
2024-12-20 00:05:05 UTC | CORE | INFO | (pkg/util/containers/metrics/provider/registry.go:102 in collectorDiscovery) | Container metrics provider discovery process finished
2024-12-20 00:05:05 UTC | SYS-PROBE | INFO | (pkg/config/remote/client/client.go:434 in pollLoop) | retrying the first update of remote-config state (rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp [::1]:45371: connect: connection refused")
2024-12-20 00:05:05 UTC | PROCESS | INFO | (pkg/util/containers/metrics/provider/registry.go:102 in collectorDiscovery) | Container metrics provider discovery process finished
2024-12-20 00:05:05 UTC | SYS-PROBE | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:171 in func1) | unable to establish stream, will possibly retry: rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp :45371: connect: connection refused"
2024-12-20 00:05:05 UTC | SECURITY | INFO | (pkg/security/utils/hostname.go:97 in GetHostnameWithContextAndFallback) | Hostname is: fv-az1117-366
2024-12-20 00:05:05 UTC | SECURITY | INFO | (subcommands/runtime/command.go:704 in StartRuntimeSecurity) | Datadog runtime security agent disabled by config
2024-12-20 00:05:05 UTC | SECURITY | INFO | (pkg/security/utils/hostname.go:97 in GetHostnameWithContextAndFallback) | Hostname is: fv-az1117-366
2024-12-20 00:05:05 UTC | SECURITY | INFO | (comp/core/workloadmeta/impl/store.go:100 in start) | workloadmeta store initialized successfully
2024-12-20 00:05:05 UTC | SECURITY | INFO | (subcommands/start/command.go:267 in RunAgent) | All security-agent components are deactivated, exiting
2024-12-20 00:05:05 UTC | SECURITY | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:127 in Start) | remote workloadmeta initialized successfully
2024-12-20 00:05:05 UTC | SECURITY | INFO | (comp/core/workloadmeta/impl/store.go:554 in startCandidates) | workloadmeta collector "remote-workloadmeta" started successfully
2024-12-20 00:05:05 UTC | SECURITY | INFO | (comp/core/workloadmeta/collectors/internal/remote/generic.go:175 in func1) | workloadmeta stream established successfully
2024-12-20 00:05:05 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:40 in CheckStarted) | check:container_image | Running check...
2024-12-20 00:05:05 UTC | CORE | INFO | (pkg/collector/corechecks/containerimage/check.go:136 in Run) | Starting long-running check "container_image"
2024-12-20 00:05:05 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:59 in CheckFinished) | check:container_image | Done running check
2024-12-20 00:05:05 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:40 in CheckStarted) | check:service_discovery | Running check...
2024-12-20 00:05:05 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:40 in CheckStarted) | check:ntp | Running check...
2024-12-20 00:05:05 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:59 in CheckFinished) | check:service_discovery | Done running check
2024-12-20 00:05:05 UTC | CORE | INFO | (pkg/collector/worker/check_logger.go:59 in CheckFinished) | check:ntp | Done running check
grep: /etc/datadog-agent/system-probe.yaml: No such file or directory
grep: /etc/datadog-agent/system-probe.yaml: No such file or directory

Error: could not load gatekeeper:
 * Python Check Loader: unable to import module 'gatekeeper': No module named 'gatekeeper'
 * Core Check Loader: Check gatekeeper not found in Catalog
Error: no valid check found

Could not find valid check output
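The captured log ends with the two-stage check-loader error that actually fails this environment: the Python loader cannot import a `gatekeeper` module (the check's wheel is evidently not installed in the Agent container), and the core Go loader has no such check in its catalog either. A minimal Python sketch of that fallback lookup, reconstructed from the error text above (the function and its arguments are illustrative, not the Agent's actual API):

    import importlib

    def load_check(name: str, core_catalog: set):
        # Mirror the two loaders seen in the log: try a Python import
        # first, then fall back to the core (Go) check catalog.
        errors = []
        try:
            return importlib.import_module(name)
        except ModuleNotFoundError:
            errors.append(f"Python Check Loader: unable to import module '{name}': No module named '{name}'")
        if name in core_catalog:
            return name
        errors.append(f'Core Check Loader: Check {name} not found in Catalog')
        raise RuntimeError('could not load {}:\n * {}'.format(name, '\n * '.join(errors)))

    # load_check('gatekeeper', core_catalog={'cpu', 'memory', 'ntp'})
    # raises a RuntimeError matching the "Error: could not load gatekeeper" output above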
Check warning on line 0 in cybersixgill_actionable_alerts.tests.test_instance
github-actions / Test Results
test_instance_config_initialization (cybersixgill_actionable_alerts.tests.test_instance) failed
test-results/cybersixgill_actionable_alerts/test-unit-py3.12.xml [took 0s]
Raw output
TypeError: 'NoneType' object is not subscriptable
tests/test_instance.py:9: in test_instance_config_initialization
    config = InstanceConfig(
datadog_checks/cybersixgill_actionable_alerts/config_models/instance.py:55: in _validate
    if field_name in info.context['configured_fields']:
E   TypeError: 'NoneType' object is not subscriptable
Check warning on line 0 in cybersixgill_actionable_alerts.tests.test_shared
github-actions / Test Results
test_shared_config_service (cybersixgill_actionable_alerts.tests.test_shared) failed
test-results/cybersixgill_actionable_alerts/test-unit-py3.12.xml [took 0s]
Raw output
TypeError: 'NoneType' object is not subscriptable
tests/test_shared.py:10: in test_shared_config_service
    shared_config = SharedConfig(service="my_service")
datadog_checks/cybersixgill_actionable_alerts/config_models/shared.py:34: in _validate
    if field_name in info.context['configured_fields']:
E   TypeError: 'NoneType' object is not subscriptable
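Both cybersixgill_actionable_alerts failures share one root cause: the generated pydantic v2 model validator subscripts `info.context` unconditionally, but `info.context` is `None` whenever the model is constructed directly (as these tests do) rather than via `model_validate(..., context={...})`. A minimal sketch of the failing pattern with a None-safe guard, assuming pydantic v2's `ValidationInfo` (the model and field names here are illustrative, not the generated code):

    from typing import Optional
    from pydantic import BaseModel, ValidationInfo, field_validator

    class InstanceConfig(BaseModel):
        min_severity: Optional[str] = None  # illustrative field

        @field_validator('*', mode='before')
        @classmethod
        def _validate(cls, value, info: ValidationInfo):
            # info.context is None unless the caller supplied a context via
            # model_validate(..., context=...); the generated instance.py:55
            # does info.context['configured_fields'] directly, which is the
            # TypeError both tests hit. Guard before subscripting:
            configured = (info.context or {}).get('configured_fields', ())
            if info.field_name in configured:
                return value  # explicitly configured: keep as-is
            return value

    print(InstanceConfig(min_severity='high'))  # no longer raises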
Check failure on line 0 in grpc_check
github-actions / Test Results
tests.test_grpc_check (grpc_check) with error
test-results/gRPC Check/test-unit-py3.12.xml [took 0s]
Raw output
collection failure
tests/test_grpc_check.py:5: in <module>
    from grpc_health.v1 import health, health_pb2, health_pb2_grpc
<frozen importlib._bootstrap>:1360: in _find_and_load
    ???
<frozen importlib._bootstrap>:1331: in _find_and_load_unlocked
    ???
<frozen importlib._bootstrap>:935: in _load_unlocked
    ???
../../../../.local/share/hatch/env/virtual/datadog-grpc-check/ERX8M07z/py3.12/lib/python3.12/site-packages/ddtrace/internal/module.py:250: in _exec_module
    self.loader.exec_module(module)
../../../../.local/share/hatch/env/virtual/datadog-grpc-check/ERX8M07z/py3.12/lib/python3.12/site-packages/grpc_health/v1/health.py:21: in <module>
    from grpc_health.v1 import health_pb2 as _health_pb2
<frozen importlib._bootstrap>:1360: in _find_and_load
    ???
<frozen importlib._bootstrap>:1331: in _find_and_load_unlocked
    ???
<frozen importlib._bootstrap>:935: in _load_unlocked
    ???
../../../../.local/share/hatch/env/virtual/datadog-grpc-check/ERX8M07z/py3.12/lib/python3.12/site-packages/ddtrace/internal/module.py:250: in _exec_module
    self.loader.exec_module(module)
../../../../.local/share/hatch/env/virtual/datadog-grpc-check/ERX8M07z/py3.12/lib/python3.12/site-packages/grpc_health/v1/health_pb2.py:12: in <module>
    _runtime_version.ValidateProtobufRuntimeVersion(
../../../../.local/share/hatch/env/virtual/datadog-grpc-check/ERX8M07z/py3.12/lib/python3.12/site-packages/google/protobuf/runtime_version.py:86: in ValidateProtobufRuntimeVersion
    raise VersionError(
E   google.protobuf.runtime_version.VersionError: Detected incompatible Protobuf Gencode/Runtime versions when loading grpc_health/v1/health.proto: gencode 5.28.1 runtime 5.27.3. Runtime version cannot be older than the linked gencode version. See Protobuf version guarantees at https://protobuf.dev/support/cross-version-runtime-guarantee.
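The grpc_check collection error is a pure dependency-resolution problem: `grpc_health`'s generated `health_pb2.py` was produced by protoc gencode 5.28.1, while the installed `protobuf` runtime is 5.27.3, and protobuf enforces runtime >= gencode at import time. A small sketch of the constraint the loader enforces (release-style version strings assumed); the practical fix is pinning `protobuf>=5.28.1` in the test environment, or regenerating the stubs against the pinned runtime:

    from google.protobuf import __version__ as runtime_str

    # Gencode version recorded in the generated health_pb2.py, per the
    # error message above; protobuf refuses to load gencode newer than
    # the installed runtime.
    GENCODE = (5, 28, 1)
    runtime = tuple(int(p) for p in runtime_str.split('.')[:3])
    if runtime < GENCODE:
        raise RuntimeError(
            f'protobuf runtime {runtime_str} is older than gencode '
            f'{".".join(map(str, GENCODE))}; upgrade to protobuf>=5.28.1'
        )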
Check warning on line 0 in gnatsd.tests.test_gnatsd
github-actions / Test Results
test_metrics (gnatsd.tests.test_gnatsd) failed
test-results/Gnatsd/test-unit-py3.12.xml [took 41s]
Raw output
datadog_checks.dev.errors.SubprocessError: Command: ['docker', 'inspect', '-f', '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}', 'docker_nats_serverA_1']
Exit code: 1
Captured Output:
tests/test_gnatsd.py:58: in test_metrics
    route_ip = get_container_ip('docker_nats_serverA_1').replace('.', '_')
../../../../.local/share/hatch/env/virtual/datadog-gnatsd/opB47563/py3.12/lib/python3.12/site-packages/datadog_checks/dev/docker.py:42: in get_container_ip
    return run_command(command, capture='out', check=True).stdout.strip()
../../../../.local/share/hatch/env/virtual/datadog-gnatsd/opB47563/py3.12/lib/python3.12/site-packages/datadog_checks/dev/subprocess.py:75: in run_command
    raise SubprocessError(
E   datadog_checks.dev.errors.SubprocessError: Command: ['docker', 'inspect', '-f', '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}', 'docker_nats_serverA_1']
E   Exit code: 1
E   Captured Output:
Check warning on line 0 in gnatsd.tests.test_gnatsd
github-actions / Test Results
test_metric_tags (gnatsd.tests.test_gnatsd) failed
test-results/Gnatsd/test-unit-py3.12.xml [took 0s]
Raw output
datadog_checks.dev.errors.SubprocessError: Command: ['docker', 'inspect', '-f', '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}', 'docker_nats_serverA_1']
Exit code: 1
Captured Output:
tests/test_gnatsd.py:80: in test_metric_tags
    route_ip = get_container_ip('docker_nats_serverA_1').replace('.', '_')
../../../../.local/share/hatch/env/virtual/datadog-gnatsd/opB47563/py3.12/lib/python3.12/site-packages/datadog_checks/dev/docker.py:42: in get_container_ip
    return run_command(command, capture='out', check=True).stdout.strip()
../../../../.local/share/hatch/env/virtual/datadog-gnatsd/opB47563/py3.12/lib/python3.12/site-packages/datadog_checks/dev/subprocess.py:75: in run_command
    raise SubprocessError(
E   datadog_checks.dev.errors.SubprocessError: Command: ['docker', 'inspect', '-f', '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}', 'docker_nats_serverA_1']
E   Exit code: 1
E   Captured Output:
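Both gnatsd failures stem from the same `docker inspect` exiting with code 1, which typically means no container by that name exists. The hardcoded `docker_nats_serverA_1` follows Docker Compose v1's underscore-separated naming; Compose v2 joins the project, service, and index with hyphens instead, so the lookup can fail on a runner that switched to v2. A hypothetical helper that resolves the name at runtime instead of hardcoding it (the function name and substring are illustrative):

    import subprocess

    def find_container(substring: str) -> str:
        # List running container names and match loosely, so the lookup
        # works under both Compose v1 (underscores) and v2 (hyphens).
        names = subprocess.run(
            ['docker', 'ps', '--format', '{{.Names}}'],
            capture_output=True, text=True, check=True,
        ).stdout.splitlines()
        for name in names:
            if substring in name:
                return name
        raise RuntimeError(f'no running container matching {substring!r}')

    # e.g. get_container_ip(find_container('nats_serverA')) instead of
    # get_container_ip('docker_nats_serverA_1')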
Check warning on line 0 in lighthouse.tests.test_lighthouse
github-actions / Test Results
test_check_urls_tags (lighthouse.tests.test_lighthouse) failed
test-results/Lighthouse/test-unit-py3.12.xml [took 0s]
Raw output
AssertionError: Needed at least 1 candidates for 'lighthouse.accessibility', got 0
Expected:
MetricStub(name='lighthouse.accessibility', type=None, value=92, tags=['key:value', 'name:test', 'url:https://www.google.com'], hostname=None, device=None, flush_first_value=None)
Similar submitted:
Score Most similar
0.97 MetricStub(name='lighthouse.accessibility', type=0, value=92.0, tags=['name:test', 'url:https://www.google.com'], hostname='', device=None, flush_first_value=False)
0.95 MetricStub(name='lighthouse.accessibility', type=0, value=92.0, tags=['name:test', 'url:https://www.datadoghq.com'], hostname='', device=None, flush_first_value=False)
0.57 MetricStub(name='lighthouse.best_practices', type=0, value=100.0, tags=['name:test', 'url:https://www.google.com'], hostname='', device=None, flush_first_value=False)
0.55 MetricStub(name='lighthouse.seo', type=0, value=89.0, tags=['name:test', 'url:https://www.google.com'], hostname='', device=None, flush_first_value=False)
0.55 MetricStub(name='lighthouse.pwa', type=0, value=30.0, tags=['name:test', 'url:https://www.google.com'], hostname='', device=None, flush_first_value=False)
0.54 MetricStub(name='lighthouse.best_practices', type=0, value=100.0, tags=['name:test', 'url:https://www.datadoghq.com'], hostname='', device=None, flush_first_value=False)
0.54 MetricStub(name='lighthouse.performance', type=0, value=55.0, tags=['name:test', 'url:https://www.google.com'], hostname='', device=None, flush_first_value=False)
0.54 MetricStub(name='lighthouse.dom_size', type=0, value=2721.0, tags=['name:test', 'url:https://www.google.com'], hostname='', device=None, flush_first_value=False)
0.53 MetricStub(name='lighthouse.seo', type=0, value=89.0, tags=['name:test', 'url:https://www.datadoghq.com'], hostname='', device=None, flush_first_value=False)
0.53 MetricStub(name='lighthouse.pwa', type=0, value=30.0, tags=['name:test', 'url:https://www.datadoghq.com'], hostname='', device=None, flush_first_value=False)
0.52 MetricStub(name='lighthouse.unused_javascript', type=0, value=1200.0, tags=['name:test', 'url:https://www.google.com'], hostname='', device=None, flush_first_value=False)
0.52 MetricStub(name='lighthouse.max_potential_fid', type=0, value=208.0, tags=['name:test', 'url:https://www.google.com'], hostname='', device=None, flush_first_value=False)
0.52 MetricStub(name='lighthouse.performance', type=0, value=55.0, tags=['name:test', 'url:https://www.datadoghq.com'], hostname='', device=None, flush_first_value=False)
0.52 MetricStub(name='lighthouse.dom_size', type=0, value=2721.0, tags=['name:test', 'url:https://www.datadoghq.com'], hostname='', device=None, flush_first_value=False)
0.51 MetricStub(name='lighthouse.speed_index', type=0, value=4442.0, tags=['name:test', 'url:https://www.google.com'], hostname='', device=None, flush_first_value=False)
assert False
tests/test_lighthouse.py:76: in test_check_urls_tags
    aggregator.assert_metric(name="lighthouse.accessibility", value=92, tags=expected_tags)
../../../../.local/share/hatch/env/virtual/datadog-lighthouse/gM-7B_k3/py3.12/lib/python3.12/site-packages/datadog_checks/base/stubs/aggregator.py:370: in assert_metric
    self._assert(condition, msg=msg, expected_stub=expected_metric, submitted_elements=self._metrics)
../../../../.local/share/hatch/env/virtual/datadog-lighthouse/gM-7B_k3/py3.12/lib/python3.12/site-packages/datadog_checks/base/stubs/aggregator.py:412: in _assert
    assert condition, new_msg
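Note on this failure: the two closest candidates (scores 0.97 and 0.95) match the expected metric on name and value but lack the expected key:value tag, which suggests the instance-level custom tags never reach the metric submission. A minimal sketch of the kind of tag merging that would produce the expected stub; the option names and the hard-coded value are illustrative assumptions, not the lighthouse check's actual code:

# Minimal sketch, assuming a "tags" option on the instance; the instance
# fields and the hard-coded 92 are illustrative, not the real check.
from datadog_checks.base import AgentCheck


class LighthouseCheckSketch(AgentCheck):
    def check(self, _):
        custom_tags = self.instance.get("tags", [])  # e.g. ["key:value"]
        base_tags = [
            "name:%s" % self.instance["name"],
            "url:%s" % self.instance["url"],
        ]
        # Merging the custom tags into every submission is what the failing
        # assertion (which expects "key:value") requires.
        self.gauge("lighthouse.accessibility", 92, tags=base_tags + custom_tags)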
Check warning on line 0 in redis_sentinel.tests.test_redis_sentinel
github-actions / Test Results
test_check (redis_sentinel.tests.test_redis_sentinel) failed
test-results/Redis Sentinel/test-unit-py3.12.xml [took 6s]
Raw output
AssertionError: Needed exactly 1 candidates for 'redis.sentinel.slave_is_disconnected', got 0
Expected:
ServiceCheckStub(check_id=None, name='redis.sentinel.slave_is_disconnected', status=0, tags=None, hostname=None, message=None)
Similar submitted:
Score Most similar
0.93 ServiceCheckStub(check_id='', name='redis.sentinel.master_is_disconnected', status=0, tags=['master_ip:172.18.0.2', 'redis_name:mymaster'], hostname='', message='')
0.80 ServiceCheckStub(check_id='', name='redis.sentinel.master_is_down', status=0, tags=['master_ip:172.18.0.2', 'redis_name:mymaster'], hostname='', message='')
0.75 ServiceCheckStub(check_id='', name='redis.sentinel.slave_is_disconnected', status=2, tags=['redis_name:mymaster', 'slave_ip:172.18.0.3'], hostname='', message='')
0.53 ServiceCheckStub(check_id='', name='redis.sentinel.slave_master_link_down', status=2, tags=['redis_name:mymaster', 'slave_ip:172.18.0.3'], hostname='', message='')
assert False
tests/test_redis_sentinel.py:76: in test_check
    aggregator.assert_service_check(svc_chk, status=RedisSentinelCheck.OK, count=1)
../../../../.local/share/hatch/env/virtual/datadog-redis-sentinel/scFV5prl/py3.12/lib/python3.12/site-packages/datadog_checks/base/stubs/aggregator.py:403: in assert_service_check
    self._assert(
../../../../.local/share/hatch/env/virtual/datadog-redis-sentinel/scFV5prl/py3.12/lib/python3.12/site-packages/datadog_checks/base/stubs/aggregator.py:412: in _assert
    assert condition, new_msg
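Note on this failure: the check did submit redis.sentinel.slave_is_disconnected, but with status 2 (CRITICAL) rather than the expected 0 (OK); the replica in the compose environment was still reported as disconnected when the check ran. A sketch of a retry loop that waits for the replica to settle before asserting; the instance payload, attempt count, and sleep interval are assumptions:

# Sketch only: re-run the check until the sentinel environment reports a
# connected replica, then let the real assertions proceed.
import time

from datadog_checks.base import AgentCheck


def run_until_slave_connected(check, aggregator, instance, attempts=10):
    for _ in range(attempts):
        aggregator.reset()
        check.check(instance)
        stubs = aggregator.service_checks("redis.sentinel.slave_is_disconnected")
        if stubs and all(sc.status == AgentCheck.OK for sc in stubs):
            return
        time.sleep(3)
    raise AssertionError("replica never reported as connected")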
Check warning on line 0 in tidb.tests.test_tidb
github-actions / Test Results
test_cluster_metrics (tidb.tests.test_tidb) failed
test-results/TiDB/test-unit-py3.12.xml [took 29s]
Raw output
AssertionError: Needed at least 1 candidates for 'tidb_cluster.tikv_store_size_bytes', got 0
Expected:
MetricStub(name='tidb_cluster.tikv_store_size_bytes', type=None, value=None, tags=['tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:available'], hostname=None, device=None, flush_first_value=None)
Similar submitted:
Score Most similar
0.87 MetricStub(name='tidb_cluster.tikv_engine_size_bytes', type=0, value=14680832.0, tags=['db:kv', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:write'], hostname='', device=None, flush_first_value=False)
0.87 MetricStub(name='tidb_cluster.tikv_engine_size_bytes', type=0, value=14681016.0, tags=['db:kv', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:raft'], hostname='', device=None, flush_first_value=False)
0.87 MetricStub(name='tidb_cluster.tikv_engine_size_bytes', type=0, value=3670784.0, tags=['db:kv', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:lock'], hostname='', device=None, flush_first_value=False)
0.87 MetricStub(name='tidb_cluster.tikv_engine_size_bytes', type=0, value=904.0, tags=['db:kv', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:default'], hostname='', device=None, flush_first_value=False)
0.87 MetricStub(name='tidb_cluster.tikv_engine_size_bytes', type=0, value=1488.0, tags=['db:raft', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:default'], hostname='', device=None, flush_first_value=False)
0.84 MetricStub(name='tidb_cluster.tikv_io_bytes', type=3, value=0.0, tags=['op:read', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:load_balance'], hostname='', device=None, flush_first_value=False)
0.84 MetricStub(name='tidb_cluster.tikv_io_bytes', type=3, value=0.0, tags=['op:write', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:load_balance'], hostname='', device=None, flush_first_value=False)
0.84 MetricStub(name='tidb_cluster.tikv_io_bytes', type=3, value=0.0, tags=['op:read', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:gc'], hostname='', device=None, flush_first_value=False)
0.84 MetricStub(name='tidb_cluster.tikv_io_bytes', type=3, value=163840.0, tags=['op:read', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:other'], hostname='', device=None, flush_first_value=False)
0.84 MetricStub(name='tidb_cluster.tikv_io_bytes', type=3, value=0.0, tags=['op:read', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:flush'], hostname='', device=None, flush_first_value=False)
0.84 MetricStub(name='tidb_cluster.tikv_io_bytes', type=3, value=0.0, tags=['op:write', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:gc'], hostname='', device=None, flush_first_value=False)
0.84 MetricStub(name='tidb_cluster.tikv_io_bytes', type=3, value=44561.0, tags=['op:write', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:other'], hostname='', device=None, flush_first_value=False)
0.84 MetricStub(name='tidb_cluster.tikv_io_bytes', type=3, value=0.0, tags=['op:write', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:flush'], hostname='', device=None, flush_first_value=False)
0.84 MetricStub(name='tidb_cluster.tikv_io_bytes', type=3, value=0.0, tags=['op:read', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:import'], hostname='', device=None, flush_first_value=False)
0.84 MetricStub(name='tidb_cluster.tikv_io_bytes', type=3, value=0.0, tags=['op:read', 'tidb_cluster_component:tikv', 'tidb_cluster_name:test', 'type:export'], hostname='', device=None, flush_first_value=False)
assert False
tests/test_tidb.py:54: in test_cluster_metrics
    _check_and_assert(aggregator, EXPECTED_TIKV, check)
tests/test_tidb.py:60: in _check_and_assert
    agg.assert_metric(name, tags=tags)
../../../../.local/share/hatch/env/virtual/datadog-tidb/BfU1eFRf/py3.12/lib/python3.12/site-packages/datadog_checks/base/stubs/aggregator.py:370: in assert_metric
    self._assert(condition, msg=msg, expected_stub=expected_metric, submitted_elements=self._metrics)
../../../../.local/share/hatch/env/virtual/datadog-tidb/BfU1eFRf/py3.12/lib/python3.12/site-packages/datadog_checks/base/stubs/aggregator.py:412: in _assert
    assert condition, new_msg
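Note on this failure: the expected tidb_cluster.tikv_store_size_bytes never appears while the closely related tikv_engine_size_bytes series do, which is consistent with the store-size gauge simply not being exposed yet when the test scraped the freshly started cluster. A sketch of a tolerant assertion that re-runs the check before failing; the timing explanation, instance payload, and retry counts are assumptions:

# Sketch under stated assumptions: re-scrape a few times, since some TiKV
# gauges only appear a short while after cluster startup.
import time


def assert_metric_with_retry(aggregator, check, instance, name, tags, attempts=5):
    for _ in range(attempts):
        if aggregator.metrics(name):
            break
        check.check(instance)
        time.sleep(2)
    aggregator.assert_metric(name, tags=tags)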
Check warning on line 0 in snmpwalk.tests.test_snmpwalk
github-actions / Test Results
test_check (snmpwalk.tests.test_snmpwalk) failed
test-results/snmpwalk/test-unit-py3.12.xml [took 6s]
Raw output
AssertionError: Needed at least 1 candidates for 'snmpwalk.ifInOctets', got 0
Expected:
MetricStub(name='snmpwalk.ifInOctets', type=None, value=None, tags=None, hostname=None, device=None, flush_first_value=None)
Similar submitted:
Score Most similar
assert False
tests/test_snmpwalk.py:47: in test_check
    aggregator.assert_metric(metric_name, at_least=1)
../../../../.local/share/hatch/env/virtual/datadog-snmpwalk/W-ATyytC/py3.12/lib/python3.12/site-packages/datadog_checks/base/stubs/aggregator.py:370: in assert_metric
    self._assert(condition, msg=msg, expected_stub=expected_metric, submitted_elements=self._metrics)
../../../../.local/share/hatch/env/virtual/datadog-snmpwalk/W-ATyytC/py3.12/lib/python3.12/site-packages/datadog_checks/base/stubs/aggregator.py:412: in _assert
    assert condition, new_msg
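Note on this failure: unlike the others, the "Similar submitted" list is empty, meaning the check produced no metrics at all. That points at the snmpwalk binary (or the device it walks) being unavailable in CI rather than a tagging or naming mismatch. A sketch of a guard that surfaces a missing binary instead of failing silently; the service-check name and the "binary" option key are assumptions:

# Sketch only: make a missing snmpwalk binary visible. The check name
# "snmpwalk.can_check" and the "binary" option are illustrative.
import shutil

from datadog_checks.base import AgentCheck


class SnmpwalkCheckSketch(AgentCheck):
    def check(self, _):
        binary = self.instance.get("binary", "snmpwalk")
        if shutil.which(binary) is None:
            self.service_check(
                "snmpwalk.can_check",
                AgentCheck.CRITICAL,
                message="binary not found: %s" % binary,
            )
            return
        # ... run the walk and submit metrics such as snmpwalk.ifInOctets ...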
Check notice on line 0 in .github
github-actions / Test Results
1 skipped test found
There is 1 skipped test; see "Raw output" for its name.
Raw output
octoprint.tests.test_octoprint ‑ test_check
Check notice on line 0 in .github
github-actions / Test Results
358 tests found
There are 358 tests; see "Raw output" for the full list.
Raw output
aqua.tests.test_aqua ‑ test_check
aws_pricing.tests.test_aws_pricing ‑ test_check_checkexception
aws_pricing.tests.test_aws_pricing ‑ test_check_describe_services_clienterror
aws_pricing.tests.test_aws_pricing ‑ test_check_get_products_clienterror
aws_pricing.tests.test_aws_pricing ‑ test_check_ok
aws_pricing.tests.test_aws_pricing ‑ test_check_warning
bind9.tests.test_check ‑ test_DateTimeToEpoch
bind9.tests.test_check ‑ test_check
celerdata.tests.test_celerdata ‑ test_celerdata_be
celerdata.tests.test_celerdata ‑ test_celerdata_fe
cfssl.tests.test_cfssl ‑ test_config_empty
cfssl.tests.test_cfssl ‑ test_invalid_config
cfssl.tests.test_cfssl ‑ test_service_check
cloudnatix.tests.test_cloudnatix ‑ test_cloudnatix
cloudsmith.tests.test_cloudsmith ‑ test_api_key_none
cloudsmith.tests.test_cloudsmith ‑ test_check
cloudsmith.tests.test_cloudsmith ‑ test_check_bad_usage
cloudsmith.tests.test_cloudsmith ‑ test_check_badly_formatted_json
cloudsmith.tests.test_cloudsmith ‑ test_empty_instance
cloudsmith.tests.test_cloudsmith ‑ test_org_none
cloudsmith.tests.test_cloudsmith ‑ test_uri_none
cybersixgill_actionable_alerts.tests.test_cybersixgill_actionable_alerts ‑ test_check
cybersixgill_actionable_alerts.tests.test_cybersixgill_actionable_alerts ‑ test_config_empty
cybersixgill_actionable_alerts.tests.test_cybersixgill_actionable_alerts ‑ test_file_data
cybersixgill_actionable_alerts.tests.test_cybersixgill_actionable_alerts ‑ test_file_data_with_null_date
cybersixgill_actionable_alerts.tests.test_cybersixgill_actionable_alerts ‑ test_invalid_config
cybersixgill_actionable_alerts.tests.test_cybersixgill_actionable_alerts ‑ test_set_file_path_null
cybersixgill_actionable_alerts.tests.test_instance ‑ test_instance_config_initialization
cybersixgill_actionable_alerts.tests.test_shared ‑ test_shared_config_service
cyral.tests.test_cyral ‑ test_check_all_metrics
emqx.tests.test_emqx ‑ test_mock_assert_metrics
eventstore.tests.test_check ‑ test_config
eventstore.tests.test_check ‑ test_integration
exim.tests.test_exim ‑ test_check
exim.tests.test_exim ‑ test_emits_critical_service_check_when_service_is_down
exim.tests.test_exim ‑ test_get_queue_stats
exim.tests.test_exim ‑ test_get_queue_stats_empty
exim.tests.test_exim ‑ test_parse_size
fiddler.tests.test_fiddler ‑ test_call_failure
fiddler.tests.test_fiddler ‑ test_call_success
fiddler.tests.test_fiddler ‑ test_check
fiddler.tests.test_fiddler ‑ test_create_tags
fiddler.tests.test_fiddler ‑ test_get_metrics_failure
fiddler.tests.test_fiddler ‑ test_get_metrics_success
fiddler.tests.test_fiddler ‑ test_get_model_failure
fiddler.tests.test_fiddler ‑ test_get_model_success
fiddler.tests.test_fiddler ‑ test_get_project_failure
fiddler.tests.test_fiddler ‑ test_get_project_success
fiddler.tests.test_fiddler ‑ test_initialization
fiddler.tests.test_fiddler ‑ test_metric_collection
fiddler.tests.test_fiddler ‑ test_run_queries_failure
fiddler.tests.test_fiddler ‑ test_run_queries_success
fiddler.tests.test_fiddler ‑ test_service_check
filebeat.tests.test_filebeat ‑ test_bad_config
filebeat.tests.test_filebeat ‑ test_check
filebeat.tests.test_filebeat ‑ test_check_fail
filebeat.tests.test_filebeat ‑ test_default_timeout[init_config0-instance0-expected_timeout0]
filebeat.tests.test_filebeat ‑ test_default_timeout[init_config1-instance1-expected_timeout1]
filebeat.tests.test_filebeat ‑ test_default_timeout[init_config2-instance2-expected_timeout2]
filebeat.tests.test_filebeat ‑ test_happy_path
filebeat.tests.test_filebeat ‑ test_happy_path_with_an_only_metrics_list
filebeat.tests.test_filebeat ‑ test_http_profiler_not_a_dict
filebeat.tests.test_filebeat ‑ test_ignore_registry
filebeat.tests.test_filebeat ‑ test_instance_tags
filebeat.tests.test_filebeat ‑ test_missing_registry_file
filebeat.tests.test_filebeat ‑ test_missing_source_file
filebeat.tests.test_filebeat ‑ test_negative_timeout
filebeat.tests.test_filebeat ‑ test_normalize_metrics
filebeat.tests.test_filebeat ‑ test_normalize_metrics_with_an_only_metrics_list
filebeat.tests.test_filebeat ‑ test_only_metrics_not_a_list
filebeat.tests.test_filebeat ‑ test_port_absent
filebeat.tests.test_filebeat ‑ test_port_not_an_int
filebeat.tests.test_filebeat ‑ test_regexes_only_get_compiled_and_run_once
filebeat.tests.test_filebeat ‑ test_registry_happy_path
filebeat.tests.test_filebeat ‑ test_registry_happy_path_with_legacy_format
filebeat.tests.test_filebeat ‑ test_registry_happy_path_with_new_style_format
filebeat.tests.test_filebeat ‑ test_source_file_device_has_changed
filebeat.tests.test_filebeat ‑ test_source_file_inode_has_changed
filebeat.tests.test_filebeat ‑ test_timeout_not_a_number
filebeat.tests.test_filebeat ‑ test_when_filebeat_restarts
filebeat.tests.test_filebeat ‑ test_when_the_http_call_times_out
filebeat.tests.test_filebeat ‑ test_when_the_http_connection_is_refused
filebeat.tests.test_filebeat ‑ test_with_an_invalid_regex_in_the_only_metrics_list
filebeat.tests.test_filebeat ‑ test_with_two_different_instances
filemage.tests.test_filemage ‑ test_bad_instance
filemage.tests.test_filemage ‑ test_check_coverage
filemage.tests.test_filemage ‑ test_good_instance
filemage.tests.test_filemage ‑ test_metric_coverage
filemage.tests.test_filemage ‑ test_metrics_down
filemage.tests.test_filemage ‑ test_metrics_up
filemage.tests.test_filemage ‑ test_services_down
filemage.tests.test_filemage ‑ test_services_up
fluentbit.tests.test_fluent_bit ‑ test_check
fluentbit.tests.test_fluent_bit ‑ test_check_integration
flume.tests.test_e2e ‑ test_e2e
gatekeeper.tests.test_e2e ‑ test_check_ok
gatekeeper.tests.test_gatekeeper ‑ test_audit_metrics
gatekeeper.tests.test_gatekeeper ‑ test_check
gatekeeper.tests.test_gatekeeper ‑ test_config
gatekeeper.tests.test_gatekeeper ‑ test_controller_metrics
gatekeeper.tests.test_gatekeeper ‑ test_openmetrics_error
gitea.tests.test_gitea ‑ test_check_integration_assert_metrics
gitea.tests.test_gitea ‑ test_check_integration_assert_metrics_using_metadata
gitea.tests.test_gitea ‑ test_check_integration_assert_service_check
gitea.tests.test_gitea ‑ test_mock_assert_metrics
gitea.tests.test_gitea ‑ test_mock_assert_metrics_using_metadata
gitea.tests.test_gitea ‑ test_mock_assert_service_check
gnatsd.tests.test_gnatsd ‑ test_connection_failure
gnatsd.tests.test_gnatsd ‑ test_deltas
gnatsd.tests.test_gnatsd ‑ test_metric_tags
gnatsd.tests.test_gnatsd ‑ test_metrics
gnatsd_streaming.tests.test_gnatsd_streaming ‑ test_connection_failure
gnatsd_streaming.tests.test_gnatsd_streaming ‑ test_deltas
gnatsd_streaming.tests.test_gnatsd_streaming ‑ test_failover_event
gnatsd_streaming.tests.test_gnatsd_streaming ‑ test_metric_tags
gnatsd_streaming.tests.test_gnatsd_streaming ‑ test_metrics
go_pprof_scraper.tests.test_go_pprof_scraper ‑ test_check
go_pprof_scraper.tests.test_go_pprof_scraper ‑ test_config
go_pprof_scraper.tests.test_go_pprof_scraper ‑ test_e2e
go_pprof_scraper.tests.test_go_pprof_scraper ‑ test_emits_critical_service_check_when_service_is_down
grpc_check ‑ tests.test_grpc_check
hikaricp.tests.test_hikaricp ‑ test_mock_assert_micrometer_metrics
hikaricp.tests.test_hikaricp ‑ test_mock_assert_prometheus_metrics
kepler.tests.test_e2e ‑ test_e2e
kepler.tests.test_unit ‑ test_check
kernelcare.tests.test_kernelcare ‑ test_config
kernelcare.tests.test_kernelcare ‑ test_metric
lighthouse.tests.test_lighthouse ‑ test_check
lighthouse.tests.test_lighthouse ‑ test_check_urls_tags
lighthouse.tests.test_lighthouse ‑ test_invalid_response_check
lighthouse.tests.test_lighthouse ‑ test_malformed_url_instance_check
lighthouse.tests.test_lighthouse ‑ test_missing_name_instance_check
lighthouse.tests.test_lighthouse ‑ test_missing_url_instance_check
logstash.tests.test_logstash ‑ test_check[logstash5]
logstash.tests.test_logstash ‑ test_check[logstash6]
logstash.tests.test_logstash ‑ test_check[logstash7]
logstash.tests.test_logstash ‑ test_failed_connection[logstash5]
logstash.tests.test_logstash ‑ test_failed_connection[logstash6]
logstash.tests.test_logstash ‑ test_failed_connection[logstash7]
mergify.tests.test_mergify ‑ test_check
mergify.tests.test_mergify ‑ test_check_empty_values
mergify.tests.test_mergify ‑ test_emits_critical_service_check_when_service_is_down
mergify.tests.test_mergify ‑ test_emits_warning_when_ratelimited
neo4j.tests.test_neo4j ‑ test_neo4j[neo4j4]
neo4j.tests.test_neo4j ‑ test_neo4j[neo4j5]
neutrona.tests.test_neutrona ‑ test_config
neutrona.tests.test_neutrona ‑ test_metrics
neutrona.tests.test_neutrona_demo ‑ test_config
neutrona.tests.test_neutrona_demo ‑ test_metrics
nextcloud.tests.test_nextcloud ‑ test_empty_url
nextcloud.tests.test_nextcloud ‑ test_invalid_url
nextcloud.tests.test_nextcloud ‑ test_valid_check
nn_sdwan.tests.test_nn_sdwan ‑ test_empty_instance
nn_sdwan.tests.test_nn_sdwan ‑ test_metrics_received
nn_sdwan.tests.test_nn_sdwan ‑ test_missing_hostname
nn_sdwan.tests.test_nn_sdwan ‑ test_missing_password
nn_sdwan.tests.test_nn_sdwan ‑ test_missing_protocol
nn_sdwan.tests.test_nn_sdwan ‑ test_missing_username
nn_sdwan.tests.test_nn_sdwan ‑ test_normal_instance
ns1.tests.test_ns1 ‑ test_config
ns1.tests.test_ns1 ‑ test_empty_instance
ns1.tests.test_ns1 ‑ test_extractPulsarAvailabilityPercent
ns1.tests.test_ns1 ‑ test_extractPulsarResponseTime
ns1.tests.test_ns1 ‑ test_extract_billing
ns1.tests.test_ns1 ‑ test_extract_peak_lps
ns1.tests.test_ns1 ‑ test_extract_qps
ns1.tests.test_ns1 ‑ test_extract_records_ttl
ns1.tests.test_ns1 ‑ test_get_pulsar_app
ns1.tests.test_ns1 ‑ test_no_key
ns1.tests.test_ns1 ‑ test_no_metrics
ns1.tests.test_ns1 ‑ test_parse_metrics
ns1.tests.test_ns1 ‑ test_pulsar_count
ns1.tests.test_ns1 ‑ test_read_prev_usage_count
ns1.tests.test_ns1 ‑ test_remove_prefix
ns1.tests.test_ns1 ‑ test_url_gen
ns1.tests.test_ns1 ‑ test_usage_count
octoprint.tests.test_octoprint ‑ test_active_job
octoprint.tests.test_octoprint ‑ test_check
octoprint.tests.test_octoprint ‑ test_e2e
octoprint.tests.test_octoprint ‑ test_empty_job
open_policy_agent.tests.test_open_policy_agent ‑ test_check
open_policy_agent.tests.test_open_policy_agent ‑ test_config
open_policy_agent.tests.test_open_policy_agent ‑ test_metrics
open_policy_agent.tests.test_open_policy_agent ‑ test_openmetrics_error
open_policy_agent.tests.test_open_policy_agent ‑ test_service_check
open_policy_agent.tests.test_open_policy_agent ‑ test_service_check_error
php_apcu.tests.test_php_apcu ‑ test_config
php_apcu.tests.test_php_apcu ‑ test_metrics
php_apcu.tests.test_php_apcu ‑ test_service_check
php_opcache.tests.test_php_opcache ‑ test_config
php_opcache.tests.test_php_opcache ‑ test_metrics
php_opcache.tests.test_php_opcache ‑ test_service_check
pihole.tests.test_pihole ‑ test_bad_response
pihole.tests.test_pihole ‑ test_bad_status
pihole.tests.test_pihole ‑ test_config
pihole.tests.test_pihole ‑ test_good_response
pihole.tests.test_pihole ‑ test_invalid_config
pihole.tests.test_pihole ‑ test_no_status
pihole.tests.test_pihole ‑ test_service_check
ping.tests.test_ping ‑ test_empty_check
ping.tests.test_ping ‑ test_incorrect_ip_check
ping.tests.test_ping ‑ test_integration
ping.tests.test_ping ‑ test_integration_ipv6
ping.tests.test_ping ‑ test_integration_response_time
ping.tests.test_ping ‑ test_valid_check
ping.tests.test_ping ‑ test_valid_check_ipv6
portworx.tests.test_portworx ‑ test_check_all_metrics
puma.tests.test_puma ‑ test_check
puma.tests.test_puma ‑ test_config
puma.tests.test_puma ‑ test_metrics_for_puma_in_cluster_mode
puma.tests.test_puma ‑ test_metrics_for_puma_in_single_mode
purefa.tests.test_purefa ‑ test_check
purefb.tests.test_purefb ‑ test_check
qdrant.tests.test_e2e ‑ test_emits_metrics
qdrant.tests.test_unit ‑ test_empty_instance
radarr.tests.test_radarr ‑ test_check
radarr.tests.test_radarr ‑ test_emits_critical_service_check_when_api_key_is_invalid
radarr.tests.test_radarr ‑ test_emits_critical_service_check_when_service_is_down
radarr.tests.test_radarr ‑ test_process_movies
radarr.tests.test_radarr ‑ test_service_check
reboot_required.tests.test_reboot_required ‑ test_critical
reboot_required.tests.test_reboot_required ‑ test_not_present_ok
reboot_required.tests.test_reboot_required ‑ test_ok
reboot_required.tests.test_reboot_required ‑ test_warning
redis_cloud.tests.test_redis_cloud ‑ test_end_to_end
redis_cloud.tests.test_redis_cloud ‑ test_instance_additional_check
redis_cloud.tests.test_redis_cloud ‑ test_instance_exclude_metrics
redis_cloud.tests.test_redis_cloud ‑ test_instance_invalid_group_check
redis_cloud.tests.test_redis_cloud ‑ test_invalid_instance
redis_enterprise.tests.test_redis_enterprise ‑ test_end_to_end
redis_enterprise.tests.test_redis_enterprise ‑ test_instance_additional_check
redis_enterprise.tests.test_redis_enterprise ‑ test_instance_all_additional_check
redis_enterprise.tests.test_redis_enterprise ‑ test_instance_exclude_metrics
redis_enterprise.tests.test_redis_enterprise ‑ test_instance_invalid_group_check
redis_enterprise.tests.test_redis_enterprise ‑ test_invalid_instance
redis_sentinel.tests.test_redis_sentinel ‑ test_check
redis_sentinel.tests.test_redis_sentinel ‑ test_down_slaves
redis_sentinel.tests.test_redis_sentinel ‑ test_load_config
redisenterprise.tests.test_redisenterprise ‑ test_check
redisenterprise.tests.test_redisenterprise ‑ test_version
redpanda.tests.test_redpanda ‑ test_check
redpanda.tests.test_redpanda ‑ test_instance_additional_check
redpanda.tests.test_redpanda ‑ test_instance_default_check
redpanda.tests.test_redpanda ‑ test_instance_full_additional_check
redpanda.tests.test_redpanda ‑ test_instance_invalid_group_check
redpanda.tests.test_redpanda ‑ test_invalid_instance
resin.tests.test_resin ‑ test_e2e
riak_repl.tests.test_check ‑ test_check
riak_repl.tests.test_check ‑ test_config
robust_intelligence_ai_firewall.tests.test_robust_intelligence_ai_firewall ‑ test_check
scalr.tests.test_scalr ‑ test_check
scalr.tests.test_scalr ‑ test_emits_critical_service_check_when_service_is_down
scaphandre.tests.test_e2e ‑ test_check_scaphandre_e2e
scaphandre.tests.test_unit ‑ test_check
sendmail.tests.test_sendmail ‑ test_bad_configuration
sendmail.tests.test_sendmail ‑ test_bad_sendmail_command
sendmail.tests.test_sendmail ‑ test_queue_output
snmpwalk.tests.test_snmpwalk ‑ test_check
snmpwalk.tests.test_snmpwalk ‑ test_unavailable_binary
sonarr.tests.test_sonarr ‑ test_check
sonarr.tests.test_sonarr ‑ test_emits_critical_service_check_when_api_key_is_invalid
sonarr.tests.test_sonarr ‑ test_emits_critical_service_check_when_service_is_down
sonarr.tests.test_sonarr ‑ test_process_episodes
sonarr.tests.test_sonarr ‑ test_process_missing
sonarr.tests.test_sonarr ‑ test_process_series
sonarr.tests.test_sonarr ‑ test_service_check
sortdb.tests.test_sortdb ‑ test_check
speedtest.tests.test_speedtest ‑ test_check
speedtest.tests.test_speedtest ‑ test_config
stardog.tests.test_stardog ‑ test_check_all_metrics
storm.tests.test_storm ‑ test_check
storm.tests.test_storm ‑ test_get_storm_cluster_summary
storm.tests.test_storm ‑ test_get_storm_nimbus_summary
storm.tests.test_storm ‑ test_get_storm_supervisor_summary
storm.tests.test_storm ‑ test_get_storm_topology_info
storm.tests.test_storm ‑ test_get_storm_topology_metrics
storm.tests.test_storm ‑ test_get_storm_topology_summary
storm.tests.test_storm ‑ test_integration_with_ci_cluster
storm.tests.test_storm ‑ test_load_from_config
storm.tests.test_storm ‑ test_process_cluster_stats
storm.tests.test_storm ‑ test_process_nimbus_stats
storm.tests.test_storm ‑ test_process_supervisor_stats
storm.tests.test_storm ‑ test_process_topology_metrics
storm.tests.test_storm ‑ test_process_topology_stats
syncthing.tests.test_syncthing ‑ test_check
syncthing.tests.test_syncthing ‑ test_emits_critical_service_check_when_service_is_down
tidb.tests.test_tidb ‑ test_cluster_metrics
tidb.tests.test_tidb ‑ test_create_check_instance_transform
tidb.tests.test_tidb ‑ test_tidb_mock_metrics
tidb.tests.test_tidb ‑ test_tiflash_mock_metrics
tidb.tests.test_tidb ‑ test_tiflash_proxy_mock_metrics
tidb.tests.test_tidb ‑ test_tikv_mock_metrics
trino.tests.test_e2e ‑ test_e2e
unbound.tests.test_unbound ‑ test_basic_stats_1_4_22
unbound.tests.test_unbound ‑ test_basic_stats_1_9_2
unbound.tests.test_unbound ‑ test_extended_stats_1_4_22
unbound.tests.test_unbound ‑ test_extended_stats_1_9_2
unbound.tests.test_unbound ‑ test_hostname_with_port
unbound.tests.test_unbound ‑ test_hostname_without_port
unbound.tests.test_unbound ‑ test_multithread_stats
unbound.tests.test_unbound ‑ test_no_sudo
unbound.tests.test_unbound ‑ test_nonexistent_unbound_control
unbound.tests.test_unbound ‑ test_nonexistent_unbound_control_with_sudo
unbound.tests.test_unbound ‑ test_unbound_control_empty_output
unbound.tests.test_unbound ‑ test_unbound_control_exception
unbound.tests.test_unbound ‑ test_unbound_control_non_zero_return_code
unbound.tests.test_unbound ‑ test_unbound_on_root_path_but_not_current_users_path
unbound.tests.test_unbound ‑ test_wacky_output
unifi_console.tests.test_check ‑ test__initiate_api_connection
unifi_console.tests.test_check ‑ test__submit_healthy_metrics
unifi_console.tests.test_check ‑ test__submit_metrics[metric0-0]
unifi_console.tests.test_check ‑ test__submit_metrics[metric1-2]
unifi_console.tests.test_check ‑ test__submit_metrics[metric2-1]
unifi_console.tests.test_check ‑ test_check_status_fail
unifi_console.tests.test_check ‑ test_check_status_pass
unifi_console.tests.test_check ‑ test_get_clients_info_fails
unifi_console.tests.test_check ‑ test_get_devices_info_fails
unifi_console.tests.test_check ‑ test_metrics_submission
unifi_console.tests.test_unifi ‑ test__checkNewStyleAPI[200-True]
unifi_console.tests.test_unifi ‑ test__checkNewStyleAPI[400-False]
unifi_console.tests.test_unifi ‑ test__checkNewStyleAPI_raise
unifi_console.tests.test_unifi ‑ test__get_json
unifi_console.tests.test_unifi ‑ test__init
unifi_console.tests.test_unifi ‑ test__path[False-/api/login-/api/login]
unifi_console.tests.test_unifi ‑ test__path[True-/api/auth/login-/api/auth/login]
unifi_console.tests.test_unifi ‑ test__path[True-/api/login-/api/auth/login]
unifi_console.tests.test_unifi ‑ test__path[True-/proxy/network/test-/proxy/network/test]
unifi_console.tests.test_unifi ‑ test__path[True-/status-/proxy/network/status]
unifi_console.tests.test_unifi ‑ test_connection_failure
unifi_console.tests.test_unifi ‑ test_connection_success
unifi_console.tests.test_unifi ‑ test_get_clients_metrics
unifi_console.tests.test_unifi ‑ test_get_devices_metrics
unifi_console.tests.test_unifi ‑ test_smart_retry[exception0-2]
unifi_console.tests.test_unifi ‑ test_smart_retry[exception1-1]
unifi_console.tests.test_unifi ‑ test_smart_retry[exception2-1]
unifi_console.tests.test_unifi ‑ test_status
upsc.tests.test_upsc ‑ test_check
upsc.tests.test_upsc ‑ test_convert_and_filter_stats
upsc.tests.test_upsc ‑ test_list_ups_devices
upsc.tests.test_upsc ‑ test_load_from_config
upsc.tests.test_upsc ‑ test_query_ups_device
vespa.tests.test_vespa ‑ test_cannot_connect_is_critical
vespa.tests.test_vespa ‑ test_check_counters_are_reset_between_check_calls
vespa.tests.test_vespa ‑ test_check_metrics
vespa.tests.test_vespa ‑ test_down_service_does_not_raise
vespa.tests.test_vespa ‑ test_no_consumer_raises
vespa.tests.test_vespa ‑ test_no_services_object_in_json_yields_metrics_health_warning
vespa.tests.test_vespa ‑ test_service_reports_down
vespa.tests.test_vespa ‑ test_service_reports_unknown
wayfinder.tests.test_wayfinder ‑ test_config
wayfinder.tests.test_wayfinder ‑ test_mock_assert_metrics
wayfinder.tests.test_wayfinder ‑ test_service_check
zabbix.tests.test_integration ‑ test_e2e
zabbix.tests.test_zabbix ‑ test_empty_instance
zabbix.tests.test_zabbix ‑ test_missing_pass
zabbix.tests.test_zabbix ‑ test_missing_url
zenoh_router.tests.test_unit ‑ test_check
zenoh_router.tests.test_unit ‑ test_emits_critical_service_check_when_service_is_down