Replies: 1 comment 1 reply
@zlepper Have you already tried compression? (See opentelemetry-rust/opentelemetry-otlp/src/exporter/tonic/mod.rs, lines 428 to 430 at 8b3fc06.)
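A hedged sketch of what enabling that could look like on the exporter side, assuming a recent `opentelemetry-otlp` builder API with the `grpc-tonic` and `gzip-tonic` features enabled (the exact builder names differ a bit between crate versions, and the endpoint below is just a placeholder):

```rust
// Sketch only: turning on gzip compression for the tonic-based OTLP metrics
// exporter. Assumes the `grpc-tonic` + `gzip-tonic` features of
// `opentelemetry-otlp`; builder names vary slightly across versions.
use opentelemetry_otlp::{Compression, MetricExporter, WithExportConfig, WithTonicConfig};

fn build_exporter() -> Result<MetricExporter, Box<dyn std::error::Error>> {
    let exporter = MetricExporter::builder()
        .with_tonic()
        .with_endpoint("http://localhost:4317") // placeholder collector endpoint
        .with_compression(Compression::Gzip)    // compress each export request body
        .build()?;
    Ok(exporter)
}
```

One caveat: depending on the server, gRPC message-size limits are often enforced on the decompressed payload, so compression shrinks what goes over the wire but may not remove the size ceiling entirely.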
We have made a little agent that we use to gather metrics from several systems that don't natively speak OTEL, which we then convert into OTEL metrics and send to the OTEL collector using the `TonicMetricsClient`. Currently we are encountering an issue where some of those batches get quite large (at the time of writing, 17 MB) for just a single resource with only 28 "metrics", but with an extremely large number of extra attributes and a lot of datapoints.
I have attempted to write some code that splits things apart a bit before they are sent off to the collector, but that doesn't work very well because I don't know the actual size of the final request: either I end up splitting things way too much (such as one metric per request), which is extremely inefficient for some of the other systems where we have a lot more metrics but the final batch is still considerably smaller, or I don't split enough and still hit the limit. A rough sketch of the kind of size-aware splitting I mean is below.
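This is not our actual code, just an illustration of the idea: work on the generated OTLP protobuf request type from the `opentelemetry-proto` crate (tonic codegen feature enabled) and use prost's `encoded_len()` to measure the real serialized size before sending. The 4 MB threshold and the per-resource granularity are assumptions for the example.

```rust
// Rough sketch, not production code. Assumes the payload is available as the
// generated OTLP request type from `opentelemetry-proto` (tonic codegen feature).
use opentelemetry_proto::tonic::collector::metrics::v1::ExportMetricsServiceRequest;
use prost::Message;

const MAX_ENCODED_BYTES: usize = 4 * 1024 * 1024; // illustrative: the common 4 MB gRPC limit

/// Greedily packs `resource_metrics` entries into requests whose encoded size
/// stays below the threshold. A single oversized entry is still sent on its own.
fn split_request(request: ExportMetricsServiceRequest) -> Vec<ExportMetricsServiceRequest> {
    let mut out = Vec::new();
    let mut current = ExportMetricsServiceRequest::default();

    for rm in request.resource_metrics {
        current.resource_metrics.push(rm);
        // `encoded_len()` reports the exact serialized size, so we can check the
        // real request size instead of guessing.
        if current.resource_metrics.len() > 1 && current.encoded_len() > MAX_ENCODED_BYTES {
            let last = current.resource_metrics.pop().expect("just pushed");
            out.push(std::mem::take(&mut current));
            current.resource_metrics.push(last);
        }
    }
    if !current.resource_metrics.is_empty() {
        out.push(current);
    }
    out
}
```

In our worst case (one resource with only 28 metrics but a lot of datapoints) the same check would have to be applied one level down, splitting the scope metrics or datapoints of a single entry, but the measuring part is the same.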
We have already increased the request limit in the collector once, and I'm worried that it's just going to be a game of "whack-a-mole", constantly increasing the limit, if we don't start doing something a bit smarter about the export size. I mean, those limits are there for a reason, right?
The error we get when the batch is too large is this:
So, to my actual question: would this be something that could be done in this library, which seems to have a bit more internal control over things, or should I keep trying to come up with something we can do on our "consumer" side?