[Feature]: opentelemetry-appender-tracing should respect information inside tracing::Span #1378
The logs should be correlated automatically. Are you not seeing that behavior? |
Yes - this should be correlated automatically, as long as the application is instrumented using OpenTelemetry tracing (either through tracing-opentelemetry or directly using the OpenTelemetry tracing API). It will not be if the application is using any other (non-OTel) tracing subscriber. |
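For concreteness, a sketch of the two instrumentation paths described above; crate and API names are assumptions based on recent opentelemetry-rust releases, so adjust to your versions:

```rust
// Path 1 (tracing-opentelemetry): install its layer alongside the bridge,
// so `tracing` spans become OTel spans the bridge can correlate against:
//   tracing_subscriber::registry()
//       .with(tracing_opentelemetry::layer().with_tracer(tracer))
//       .with(OpenTelemetryTracingBridge::new(&logger_provider))
//       .init();

// Path 2 (direct OTel tracing API): enter a span yourself, which attaches
// an active OTel Context that emitted log records get stamped with:
use opentelemetry::trace::{TraceContextExt, Tracer};

fn correlated_work() {
    let tracer = opentelemetry::global::tracer("example");
    tracer.in_span("parent", |cx| {
        // Events emitted here carry this span's trace_id/span_id.
        tracing::info!(trace_id = ?cx.span().span_context().trace_id(), "correlated");
    });
}
```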
I can confirm things are properly instrumented with |
Can you use the stdout exporter and share the output? We just want to confirm whether this issue is with OTLP or Loki itself. |
Or just the log output from the otel collector. |
This code somehow results in no output...

```rust
let exporter = opentelemetry_stdout::LogExporter::default();
let logs = LoggerProvider::builder()
    .with_simple_exporter(exporter)
    .build();
let logging = OpenTelemetryTracingBridge::new(&logs);
```
|
collector logs are scarce |
Follow this example as-is: https://github.com/open-telemetry/opentelemetry-rust/blob/main/opentelemetry-otlp/examples/basic-otlp/README.md#manual You should see logs correctly correlated to traces. |
Do you know what's going on here? Something about that is blocking all output on the tracing subscriber. |
I should have been more precise, apologies. You need the "logging" exporter in the collector to see the actual data it receives. |
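A minimal sketch of such a collector config (the exporter is called `logging` in older collector releases and `debug` in newer ones; the receiver setup here is an assumption):

```yaml
receivers:
  otlp:
    protocols:
      grpc:

exporters:
  # prints every received record to the collector's own log output
  logging:
    verbosity: detailed

service:
  pipelines:
    logs:
      receivers: [otlp]
      exporters: [logging]
```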
Start with the example from the repo. That should work as-is. And you can tweak it to match your setup. |
debug logs |
Looks like the trace ids are being omitted. |
@ewoolsey Could you start with the above example (which we know is working as expected), and progressively make changes to it to match your setup? That'll help us identify the root cause quickly. |
I just did, and that example doesn't emit a trace_id either, even when inside a span.

```rust
use opentelemetry::KeyValue;
use opentelemetry_appender_tracing::layer;
use opentelemetry_sdk::{
    logs::{Config, LoggerProvider},
    Resource,
};
use tracing::{error, span};
use tracing_core::Level;
use tracing_subscriber::prelude::*;

fn main() {
    let exporter = opentelemetry_stdout::LogExporter::default();
    let provider: LoggerProvider = LoggerProvider::builder()
        .with_config(
            Config::default().with_resource(Resource::new(vec![KeyValue::new(
                "service.name",
                "log-appender-tracing-example",
            )])),
        )
        .with_simple_exporter(exporter)
        .build();
    let layer = layer::OpenTelemetryTracingBridge::new(&provider);
    tracing_subscriber::registry().with(layer).init();

    let my_span = span!(Level::TRACE, "my_span");
    // `my_span` exists but has not been entered.
    // Enter `my_span`...
    let _enter = my_span.enter();

    error!(name: "my-event-name", target: "my-system", event_id = 20, user_name = "otel", user_email = "[email protected]");
    drop(provider);
}
```
|
I don't see the tracing-opentelemetry subscriber configured in the above example before emitting spans. |
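As a sketch, the example above could be extended along these lines so that `span!` produces OTel spans; the tracer setup here is an assumption, not taken from the original example:

```rust
// Additions to the example above (sketch; APIs vary across releases).
use opentelemetry::trace::TracerProvider as _; // for `.tracer()`

let tracer_provider = opentelemetry_sdk::trace::TracerProvider::builder()
    .with_simple_exporter(opentelemetry_stdout::SpanExporter::default())
    .build();
let tracer = tracer_provider.tracer("log-appender-tracing-example");

// Replaces the original `tracing_subscriber::registry().with(layer).init();`
tracing_subscriber::registry()
    // converts `tracing` spans into OTel spans and records their context
    .with(tracing_opentelemetry::layer().with_tracer(tracer))
    // the log bridge from the original example
    .with(layer)
    .init();
```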
ah right, I just blindly ran that example since you linked it. Anyhow, I found the root cause. This does not emit a trace id:

```rust
#[instrument]
pub async fn my_func() {
    trace!("fun log");
}
```

But this does:

```rust
#[instrument]
pub async fn my_func() {
    let tracer = opentelemetry::global::tracer("ex.com/basic");
    tracer.in_span("cool span", |_| {
        trace!("fun log");
    });
}
```

Should |
That is a question for the tracing-opentelemetry repo! It should have worked, but might need additional setup/config, which is better asked about in the tracing-opentelemetry repo. |
I can't have been the only person to run into this; it seems like a pretty common use case. I suppose I can open up an issue over there. Thanks so much for your help talking through this! Appreciate your time. I'll mark this issue as closed here. |
Sorry I think I need to reopen this. I'm pretty sure the issue lies with the |
Yeah I think that's an accurate description. |
Changed the title from "trace_id in opentelemetry-appender-tracing" to "opentelemetry-appender-tracing should respect information inside tracing::Span"
I changed the name of this issue to reflect what I think should be the expected behaviour. |
@lalitb I need to try and get this going for my work, would you be able to help me out with a quick workaround? Within the

```rust
impl<'a, S, P, L> Layer<S> for OpenTelemetryTracingBridge<P, L>
where
    for<'lookup> S: tracing_subscriber::registry::LookupSpan<'lookup>,
    S: tracing::Subscriber,
    P: LoggerProvider<Logger = L> + Send + Sync + 'static,
    L: Logger + Send + Sync + 'static,
{
    fn on_event(&self, event: &tracing::Event<'_>, ctx: tracing_subscriber::layer::Context<'_, S>) {
        let meta = event.metadata();
        let mut log_record: LogRecord = LogRecord::default();
        log_record.severity_number = Some(severity_of_level(meta.level()));
        log_record.severity_text = Some(meta.level().to_string().into());
        if let Some(current_span) = ctx.lookup_current() {
            if let Some(mut otel_data) = current_span.extensions_mut().remove::<OtelData>() {
                log_record.with_context(otel_data.parent_cx);
            }
        }
        // ...
```
|
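For what it's worth, a hedged variant of that lookup (the field names come from tracing-opentelemetry's `OtelData`; the exact `LogRecord` surface differs across opentelemetry-sdk versions, so treat this as a sketch rather than the library's method):

```rust
use opentelemetry::trace::TraceContextExt;
use tracing_opentelemetry::OtelData;

// Peek at the extensions instead of remove()-ing OtelData, so later
// events inside the same span can still find it.
if let Some(span_ref) = ctx.lookup_current() {
    let extensions = span_ref.extensions();
    if let Some(otel_data) = extensions.get::<OtelData>() {
        // The trace id either sits on the builder (for a sampled root
        // span) or is inherited from the parent context.
        let trace_id = otel_data
            .builder
            .trace_id
            .unwrap_or_else(|| otel_data.parent_cx.span().span_context().trace_id());
        let span_id = otel_data.builder.span_id;
        // ...stamp trace_id/span_id onto `log_record` here, using
        // whatever your opentelemetry-sdk version exposes for that.
    }
}
```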
@ewoolsey Also, please remember this will call both |
Oh that's amazing I didn't realize a PR was already being worked on! I'm sure that'll be exactly what I need. Thank you! |
Has there been any progress on this? Even just a workaround? I haven't been able to find any good examples for implementing this, and have tried five or so possible solutions to no avail. |
Is there an accepted way to correlate logs & traces when using

If I only use

If anyone has some working code they can share that correlates log events with traces I'd very much appreciate it. |
Ping to #1689 |
@ewoolsey did you manage to find a workaround for this, or fork the library somehow? Would love to piggyback on yours :-D |
I also stumbled over this issue and assumed that the tracing instrument macros work with the opentelemetry framework.

Before

After

'LogRecord' got trace and span ids now. If someone would like to test it, append this to the Cargo.toml file:

```toml
[dependencies]
# ...
opentelemetry-tracing = { version = "0.25" }

[patch.crates-io]
opentelemetry = { package = "opentelemetry", git = "https://github.com/jpramosi/opentelemetry-rust.git" }
opentelemetry-appender-tracing = { package = "opentelemetry-appender-tracing", git = "https://github.com/jpramosi/opentelemetry-rust.git" }
opentelemetry-http = { package = "opentelemetry-http", git = "https://github.com/jpramosi/opentelemetry-rust.git" }
opentelemetry-otlp = { package = "opentelemetry-otlp", git = "https://github.com/jpramosi/opentelemetry-rust.git" }
opentelemetry_sdk = { package = "opentelemetry_sdk", git = "https://github.com/jpramosi/opentelemetry-rust.git" }
opentelemetry-tracing = { package = "opentelemetry-tracing", git = "https://github.com/jpramosi/opentelemetry-rust.git" }
```

The fork includes tracing-opentelemetry but it was renamed to 'opentelemetry-tracing', so it needs to be refactored in your code or it won't work at all. A working example can be seen here. |
This is amazing! Thank you for making the changes and sharing your code. It's working for me. I got it working after doing the switch from |
It's great to be able to correlate logs with traces, as is possible with the Grafana Tempo+Loki combo. This could likely be exposed as some sort of configuration option.