docker_logs source stopped collecting logs after restart of docker.service #19681

Closed
dgo- opened this issue Jan 22, 2024 · 1 comment
Labels: type: bug (A code related bug.)

Comments


dgo- commented Jan 22, 2024

A note for the community

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Problem

After a restart of docker.service on the server, Vector stops collecting logs from Docker containers.

Normal info logs:

Jan 22 11:45:31 1.docker.staging vector[2053868]: 2024-01-22T10:45:31.887824Z  INFO source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::internal_events::docker_logs: Stopped watching for container logs. container_id=6632c9fc1432046515db508b1082e13d22ae3e167bae87ed04d9bbd771c628a7
Jan 22 11:45:31 1.docker.staging vector[2053868]: 2024-01-22T10:45:31.887898Z  INFO source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::internal_events::docker_logs: Started watching for container logs. container_id=6632c9fc1432046515db508b1082e13d22ae3e167bae87ed04d9bbd771c628a7
Jan 22 11:45:33 1.docker.staging vector[2053868]: 2024-01-22T10:45:33.740472Z  INFO source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::internal_events::docker_logs: Stopped watching for container logs. container_id=25fac86600490d04f6b3ef2ab980b3e97765e8002f5b3aa56c5ef45b1414e0d0
Jan 22 11:45:33 1.docker.staging vector[2053868]: 2024-01-22T10:45:33.740543Z  INFO source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::internal_events::docker_logs: Started watching for container logs. container_id=25fac86600490d04f6b3ef2ab980b3e97765e8002f5b3aa56c5ef45b1414e0d0
Jan 22 11:45:39 1.docker.staging vector[2053868]: 2024-01-22T10:45:39.740111Z ERROR source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::internal_events::docker_logs: Error in communication with Docker daemon. error=IOError { err: Custom { kind: Other, error: "error reading a body from connection: unexpected EOF during chunk size line" } } error_type="connection_failed" stage="receiving" container_id=None internal_log_rate_limit=true

Jan 22 11:45:45 1.docker.staging vector[2053868]: 2024-01-22T10:45:45.694472Z ERROR source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::internal_events::docker_logs: Internal log [Error in communication with Docker daemon.] is being suppressed to avoid flooding.
Jan 22 11:45:45 1.docker.staging vector[2053868]: 2024-01-22T10:45:45.694501Z  INFO source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::internal_events::docker_logs: Stopped watching for container logs. container_id=6632c9fc1432046515db508b1082e13d22ae3e167bae87ed04d9bbd771c628a7
Jan 22 11:45:45 1.docker.staging vector[2053868]: 2024-01-22T10:45:45.695436Z  INFO source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::internal_events::docker_logs: Stopped watching for container logs. container_id=25fac86600490d04f6b3ef2ab980b3e97765e8002f5b3aa56c5ef45b1414e0d0
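
For illustration only: the docker_logs source consumes a streaming HTTP response from the Docker Engine API, and that stream is cut when the daemon restarts, so a resilient client has to reconnect once the daemon is back up. Below is a minimal sketch of that reconnect pattern using the Python Docker SDK as an assumed stand-in; it is not the code Vector uses, just the behavior the source appears to be missing here.

import time

import docker  # Python Docker SDK, used only to illustrate the reconnect pattern


def watch_events_forever():
    while True:
        try:
            client = docker.from_env()
            # client.events() streams /events from the daemon; the generator
            # ends or raises once the daemon restarts and the connection drops.
            for event in client.events(decode=True):
                print(event.get("Type"), event.get("Action"))
        except Exception as exc:
            # The daemon went away (e.g. after systemctl restart docker.service):
            # back off briefly and reconnect instead of giving up for good.
            print(f"event stream interrupted: {exc}; reconnecting")
            time.sleep(2)


watch_events_forever()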

Configuration

sources:
  # system logs by journald
  src_journald:
    type: "journald"
    exclude_units:
      - "vector.service"

  # docker container logs
  src_docker:
    type: docker_logs

# Transforms and sinks are excluded because they contain sensitive information.

Version

vector 0.35.0 (x86_64-unknown-linux-gnu e57c0c0 2024-01-08 14:42:10.103908779)

Debug Output

Jan 22 11:53:17 1.docker.staging vector[2093433]: 2024-01-22T10:53:17.278284Z DEBUG hyper::proto::h1::io: flushed 8305 bytes
Jan 22 11:53:17 1.docker.staging vector[2093433]: 2024-01-22T10:53:17.303514Z DEBUG hyper::proto::h1::io: parsed 4 headers
Jan 22 11:53:17 1.docker.staging vector[2093433]: 2024-01-22T10:53:17.303539Z DEBUG hyper::proto::h1::conn: incoming body is content-length (337 bytes)
Jan 22 11:53:17 1.docker.staging vector[2093433]: 2024-01-22T10:53:17.303562Z DEBUG hyper::proto::h1::conn: incoming body completed
Jan 22 11:53:17 1.docker.staging vector[2093433]: 2024-01-22T10:53:17.303604Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=80}:http: hyper::client::pool: pooling idle connection for ("http", 2.elasticsearch:9200)
Jan 22 11:53:17 1.docker.staging vector[2093433]: 2024-01-22T10:53:17.303632Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=80}:http: vector::internal_events::http_client: HTTP response. status=200 OK version=HTTP/1.1 headers={"x-el>
Jan 22 11:53:18 1.docker.staging vector[2093433]: 2024-01-22T10:53:18.257141Z DEBUG vector::internal_events::file::source: Files checkpointed. count=0 duration_ms=0
Jan 22 11:53:18 1.docker.staging vector[2093433]: 2024-01-22T10:53:18.279510Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}: tower::balance::p2c::service: updating from discover
Jan 22 11:53:18 1.docker.staging vector[2093433]: 2024-01-22T10:53:18.279544Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}: tower::buffer::worker: service.ready=true processing request
Jan 22 11:53:18 1.docker.staging vector[2093433]: 2024-01-22T10:53:18.279591Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=81}:http: vector::internal_events::http_client: Sending HTTP request. uri=http://1.elasticsearch:9200/>
Jan 22 11:53:18 1.docker.staging vector[2093433]: 2024-01-22T10:53:18.279617Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=81}:http: hyper::client::pool: reuse idle connection for ("http", 1.elasticsearch:9200)
Jan 22 11:53:18 1.docker.staging vector[2093433]: 2024-01-22T10:53:18.279693Z DEBUG hyper::proto::h1::io: flushed 6146 bytes
Jan 22 11:53:18 1.docker.staging vector[2093433]: 2024-01-22T10:53:18.291123Z DEBUG hyper::proto::h1::io: parsed 4 headers
Jan 22 11:53:18 1.docker.staging vector[2093433]: 2024-01-22T10:53:18.291141Z DEBUG hyper::proto::h1::conn: incoming body is content-length (313 bytes)
Jan 22 11:53:18 1.docker.staging vector[2093433]: 2024-01-22T10:53:18.291157Z DEBUG hyper::proto::h1::conn: incoming body completed
Jan 22 11:53:18 1.docker.staging vector[2093433]: 2024-01-22T10:53:18.291182Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=81}:http: hyper::client::pool: pooling idle connection for ("http", 1.elasticsearch:9200)
Jan 22 11:53:18 1.docker.staging vector[2093433]: 2024-01-22T10:53:18.291204Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=81}:http: vector::internal_events::http_client: HTTP response. status=200 OK version=HTTP/1.1 headers={"x-el>
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.133768Z DEBUG source{component_kind="source" component_id=src_pods component_type=file}:file_server: file_source::file_server: event_throughput=0.000/sec bytes_throughput=0.000/sec ratios={"discovery": 1.2169031e-5, "other": 3.249>
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.258339Z DEBUG vector::internal_events::file::source: Files checkpointed. count=0 duration_ms=0
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.280779Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}: tower::balance::p2c::service: updating from discover
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.280814Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}: tower::buffer::worker: service.ready=true processing request
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.280863Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=82}:http: vector::internal_events::http_client: Sending HTTP request. uri=http://2.elasticsearch:9200/>
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.280890Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=82}:http: hyper::client::pool: reuse idle connection for ("http", 2.elasticsearch:9200)
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.280973Z DEBUG hyper::proto::h1::io: flushed 6786 bytes
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.298011Z DEBUG hyper::proto::h1::io: parsed 4 headers
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.298031Z DEBUG hyper::proto::h1::conn: incoming body is content-length (316 bytes)
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.298046Z DEBUG hyper::proto::h1::conn: incoming body completed
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.298071Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=82}:http: hyper::client::pool: pooling idle connection for ("http", 2.elasticsearch:9200)
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.298230Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=82}:http: vector::internal_events::http_client: HTTP response. status=200 OK version=HTTP/1.1 headers={"x-el>
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.429118Z DEBUG hyper::proto::h1::conn: incoming body decode error: unexpected EOF during chunk size line
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.429225Z ERROR source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::internal_events::docker_logs: Error in communication with Docker daemon. error=IOError { err: Custom { kind: Other, error: ">
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.429253Z DEBUG source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::topology::builder: Source finished normally.
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.441953Z DEBUG transform{component_kind="transform" component_id=trans_remap_journald component_type=remap}: vector::utilization: utilization=0.00032005384566419215
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.442039Z DEBUG transform{component_kind="transform" component_id=trans_route_journald component_type=route}: vector::utilization: utilization=0.00016447879370967366
Jan 22 11:53:19 1.docker.staging vector[2093433]: 2024-01-22T10:53:19.442093Z DEBUG transform{component_kind="transform" component_id=trans_route_metrics component_type=route}: vector::utilization: utilization=0.00019945876138000252
Jan 22 11:53:20 1.docker.staging vector[2093433]: 2024-01-22T10:53:20.259681Z DEBUG vector::internal_events::file::source: Files checkpointed. count=0 duration_ms=0
Jan 22 11:53:20 1.docker.staging vector[2093433]: 2024-01-22T10:53:20.283190Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}: tower::balance::p2c::service: updating from discover
Jan 22 11:53:20 1.docker.staging vector[2093433]: 2024-01-22T10:53:20.283219Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}: tower::buffer::worker: service.ready=true processing request
Jan 22 11:53:20 1.docker.staging vector[2093433]: 2024-01-22T10:53:20.283295Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=83}:http: vector::internal_events::http_client: Sending HTTP request. uri=http://1.elasticsearch:9200/>
Jan 22 11:53:20 1.docker.staging vector[2093433]: 2024-01-22T10:53:20.297758Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=83}:http: vector::internal_events::http_client: HTTP response. status=200 OK version=HTTP/1.1 headers={"x-el>
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.182760Z DEBUG source{component_kind="source" component_id=src_pods component_type=file}:file_server: file_source::file_server: event_throughput=0.000/sec bytes_throughput=0.000/sec ratios={"discovery": 1.0435204e-5, "other": 3.877>
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.260173Z DEBUG vector::internal_events::file::source: Files checkpointed. count=0 duration_ms=0
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.284661Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}: tower::balance::p2c::service: updating from discover
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.284716Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}: tower::buffer::worker: service.ready=true processing request
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.284802Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=84}:http: vector::internal_events::http_client: Sending HTTP request. uri=http://1.elasticsearch:9200/>
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.284846Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=84}:http: hyper::client::pool: reuse idle connection for ("http", 1.elasticsearch:9200)
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.284935Z DEBUG hyper::proto::h1::io: flushed 18111 bytes
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.297790Z DEBUG hyper::proto::h1::io: parsed 4 headers
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.297806Z DEBUG hyper::proto::h1::conn: incoming body is content-length (480 bytes)
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.297821Z DEBUG hyper::proto::h1::conn: incoming body completed
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.297843Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=84}:http: hyper::client::pool: pooling idle connection for ("http", 1.elasticsearch:9200)
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.297862Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=84}:http: vector::internal_events::http_client: HTTP response. status=200 OK version=HTTP/1.1 headers={"x-el>
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.360218Z DEBUG hyper::proto::h1::io: parsed 7 headers
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.360236Z DEBUG hyper::proto::h1::conn: incoming body is content-length (98 bytes)
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.360254Z DEBUG hyper::proto::h1::conn: incoming body completed
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.360337Z ERROR source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::internal_events::docker_logs: Internal log [Error in communication with Docker daemon.] is being suppressed to avoid floodin>
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.360358Z  INFO source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::internal_events::docker_logs: Stopped watching for container logs. container_id=caf3f9b2abb503d7fa994443242298e5e2bd4d59a618>
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.360535Z DEBUG hyper::proto::h1::io: parsed 7 headers
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.360547Z DEBUG hyper::proto::h1::conn: incoming body is content-length (98 bytes)
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.360560Z DEBUG hyper::proto::h1::conn: incoming body completed
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.360596Z  INFO source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::internal_events::docker_logs: Stopped watching for container logs. container_id=eb474aca3302de26a675f79b4a724093107200506ca7>
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.360622Z DEBUG source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::topology::builder: Source pump finished normally.
Jan 22 11:53:21 1.docker.staging vector[2093433]: 2024-01-22T10:53:21.360643Z DEBUG source{component_kind="source" component_id=src_docker component_type=docker_logs}: vector::topology::builder: Source pump supervisor task finished normally.
Jan 22 11:53:22 1.docker.staging vector[2093433]: 2024-01-22T10:53:22.156171Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}: vector::utilization: utilization=0.00016935877375468607
Jan 22 11:53:22 1.docker.staging vector[2093433]: 2024-01-22T10:53:22.156218Z DEBUG sink{component_kind="sink" component_id=sink_metrics_exporter component_type=prometheus_exporter}: vector::utilization: utilization=0.0005702461767482766
Jan 22 11:53:22 1.docker.staging vector[2093433]: 2024-01-22T10:53:22.261494Z DEBUG vector::internal_events::file::source: Files checkpointed. count=0 duration_ms=0
Jan 22 11:53:22 1.docker.staging vector[2093433]: 2024-01-22T10:53:22.261612Z DEBUG transform{component_kind="transform" component_id=trans_remap_vector component_type=remap}: vector::utilization: utilization=0.00021420491758328953
Jan 22 11:53:22 1.docker.staging vector[2093433]: 2024-01-22T10:53:22.286408Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}: tower::balance::p2c::service: updating from discover
Jan 22 11:53:22 1.docker.staging vector[2093433]: 2024-01-22T10:53:22.286438Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}: tower::buffer::worker: service.ready=true processing request
Jan 22 11:53:22 1.docker.staging vector[2093433]: 2024-01-22T10:53:22.286483Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=85}:http: vector::internal_events::http_client: Sending HTTP request. uri=http://3.elasticsearch:9200/>
Jan 22 11:53:22 1.docker.staging vector[2093433]: 2024-01-22T10:53:22.286508Z DEBUG sink{component_kind="sink" component_id=sink_elasticsearch component_type=elasticsearch}:request{request_id=85}:http: hyper::client::pool: reuse idle connection for ("http", 3.elasticsearch:9200)

Example Data

No response

Additional Context

OS: Ubuntu 22.04.3 LTS

docker --version
Docker version 25.0.0, build e758fe5

We run Docker in swarm mode.

References

No response

dgo- added the type: bug label on Jan 22, 2024
jszwedko (Member) commented

Hi @dgo- !

Thanks for filing this. I think it is a duplicate of #16806, so I'll roll it into that, but feel free to follow that issue instead or leave additional details there.

jszwedko closed this as not planned on Jan 22, 2024