Errors in traces merely showing error=true #2402

Open
abernix opened this issue Jan 13, 2023 — with Slack · 11 comments
Labels
component/open-telemetry OTLP, Datadog, Prometheus, etc. and the integrations around it.

Comments

abernix (Member) commented Jan 13, 2023

In some cases, when errors are encountered, all that appears in Datadog's error view is an error=true property on the span, with no further detail about the error itself.

abernix changed the title from "Routertraces include" to "Errors in traces merely showing error=true" on Jan 13, 2023
Geal (Contributor) commented Jan 16, 2023

It looks like the issue here is simply that opentelemetry-datadog does not know how to send logs to Datadog: open-telemetry/opentelemetry-rust-contrib#6

Sending the spans to the Datadog agent using OTLP does work, though: the error events appear in the span's metadata.
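For illustration (not part of the original comment), here is a minimal sketch, assuming a tracing-opentelemetry layer is installed, of how an error recorded inside a span ends up as a span event rather than a standalone log. The span and field names are made up for the example:

```rust
use tracing::{error, info_span};

// Hypothetical handler: with an OpenTelemetry layer subscribed, the `error!`
// event below is attached to the enclosing span as a span event, which is
// what appears in the span's metadata when exported via OTLP.
fn handle_subgraph_response() {
    let span = info_span!("subgraph_request", error = true);
    let _guard = span.enter();
    // Becomes a span event carrying this message, not a separate log line.
    error!("subgraph returned a GraphQL error");
}

fn main() {
    // Without a subscriber installed this is a no-op; it only shows the shape.
    handle_subgraph_response();
}
```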

Meemaw (Contributor) commented Jan 24, 2023

I can confirm this behaviour when using the OTLP exporter to send to the Datadog agent's OpenTelemetry collector. All error spans just have error=true and no context about the error whatsoever.

Geal (Contributor) commented Jan 24, 2023

The OpenTelemetry events will appear under the events field of the span metadata:

[Screenshot from 2023-01-16 14-41-33: span metadata in Datadog showing the events field]

That is not a great experience, but for now we cannot do much about it: the opentelemetry-datadog crate does not support Datadog's logs API (which is separate from the traces API), and even Datadog's own agent claims to understand OTLP but will not send the events as logs.

mohitd11 commented

Is there any workaround for this?

Geal (Contributor) commented Feb 22, 2023

Not yet, unfortunately; I have not figured out how to correlate logs properly. One thing that could be tested: get the logs from stdout, process them through Vector to translate the OTel trace ids to Datadog trace ids, then send them to Datadog (see the sketch below).
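As a sketch of what that translation step would have to do (an assumption for illustration, not something tested in this thread): Datadog trace ids are unsigned 64-bit integers, and the usual mapping is to keep only the lower 64 bits of the 128-bit OTel trace id and attach the result to each log line (e.g. as dd.trace_id):

```rust
// Hypothetical helper: derive a Datadog trace id (u64) from a 32-character
// hex OTel trace id by keeping only the lower 64 bits.
fn datadog_trace_id(otel_trace_id_hex: &str) -> Option<u64> {
    if otel_trace_id_hex.len() != 32 {
        return None; // not a valid 128-bit OTel trace id
    }
    u64::from_str_radix(&otel_trace_id_hex[16..], 16).ok()
}

fn main() {
    // W3C example trace id; the printed value is what would go into dd.trace_id.
    println!("{:?}", datadog_trace_id("4bf92f3577b34da6a3ce929d0e0e4736"));
}
```

The same bit-slicing could live in a Vector transform; the Rust version is just the shortest way to show it.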

garypen assigned garypen and unassigned Geal on Mar 17, 2023
abernix assigned BrynCooke and unassigned garypen on Jun 5, 2023
chandrikas assigned BrynCooke and unassigned BrynCooke on Jun 5, 2023
chandrikas (Contributor) commented

@BrynCooke, can you please check this is still an issue?

BrynCooke (Contributor) commented

This is still an issue; however, we cannot do anything about it on our side and it needs to be fixed upstream.
It looks like Datadog is migrating its collector to use code from the official OTel Go codebase, which will hopefully make exporting to Datadog via OTLP an option.

abernix (Member, Author) commented Aug 14, 2023

It's possible that #3537 might fix this? Thoughts?

psbrandt commented Nov 23, 2023

I am not seeing my events in the span "info" section in Datadog. I used to see them, but don't anymore. Did something change? Or maybe it's a Datadog issue? Is anyone else having the same experience?

EDIT: Coming back to answer my own question. This is the (relatively new) code that is preventing events from showing up in Datadog:

// we ignore events
if !meta.is_span() {
    return false;
}
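To make the effect of that filter concrete, here is a small, self-contained sketch (an illustration only, not the router's actual layer setup) showing how a filter that only keeps spans silently drops every event:

```rust
use tracing_subscriber::{filter::filter_fn, prelude::*};

fn main() {
    // Keep spans, reject events — the same check as the snippet above.
    let spans_only = filter_fn(|meta| meta.is_span());

    tracing_subscriber::registry()
        .with(tracing_subscriber::fmt::layer().with_filter(spans_only))
        .init();

    let span = tracing::info_span!("request");
    let _guard = span.enter();
    // This event never reaches the layer, so an exporter behind a filter like
    // this would only ever see the bare span (e.g. error=true with no detail).
    tracing::error!("this error event is filtered out");
}
```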

BrynCooke (Contributor) commented Dec 4, 2023

It looks like #2999 has compounded this issue, filtering out all events from spans. Opened #4321.

Geal mentioned this issue Jan 16, 2024
Geal added a commit that referenced this issue Jan 18, 2024
Fix #4321 
Fix #3872 (the regression appears in 1.30, when #2999 was merged)

related: #2402

This fixes a regression introduced in #2999, where events were no longer sent with traces.
bnjjj added the component/open-telemetry OTLP, Datadog, Prometheus, etc. and the integrations around it. label Jan 24, 2024
Geal (Contributor) commented Jan 31, 2024

This should be revisited now that #4486 is merged; it should be fixed.
