Issues encountered: these weren't mentioned in the upgrade guide, so I'm documenting them here for the record.
@mhratson 👋🏻 Sorry about that. First and foremost, I opened an issue (#10128) to get the upgrade guide fixed to properly call out the changes to the batch size limits for the Datadog Logs sink, as well as other sinks.

For a small bit of context: the mismatch between the configurable maximum and what the Datadog documentation calls out is due to our internal mechanism for constructing batches. Our imposed limit is lower so that we avoid generating payloads that are too big and would be rejected by the Datadog API.

We actually have an issue open around this (#10020) because it can be confusing to users who think the batch size is equivalent to the payload size that Vector sends over the wire, when in reality the batch size is an internal Vector concern around memory usage. In practice, payloads are usually around the same size as the batches, but the current behavior can lead to undersized (most common) or oversized (rare but possible) payloads.
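For anyone adjusting these limits after upgrading, the batch settings live under the sink's `batch` table in the Vector config. A minimal sketch, where the sink name, inputs, and the specific values are illustrative assumptions rather than defaults:

```toml
# Hypothetical sink name and source; adjust to your own configuration.
[sinks.my_datadog_logs]
type = "datadog_logs"
inputs = ["my_source"]
default_api_key = "${DATADOG_API_KEY}"

# The batch limits below are an internal Vector concern (memory usage
# while assembling batches), not a hard guarantee of the payload size
# sent over the wire. Values here are illustrative; keep `max_bytes`
# at or below the maximum Vector allows for this sink.
batch.max_bytes = 1000000
batch.timeout_secs = 5
```

Because the batch size and the wire payload size are not the same thing, tuning `batch.max_bytes` close to the API's documented payload limit is what can produce the oversized-payload edge case described above.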