Replies: 1 comment
-
Kafka is designed for handling small events, not 200 MB files. While you could probably tweak the Kafka configuration to make it accept 200 MB records, you would lose a lot of efficiency and performance. One way to handle this is to upload the data to object storage such as S3 and send only a reference/link to it in the Kafka message.
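This "reference in the message" approach is often called the claim-check pattern. A minimal Python sketch is below; the bucket name, key prefix, and the injected S3/Kafka client objects (e.g. a `boto3` S3 client and a `confluent_kafka.Producer`) are illustrative assumptions, not something prescribed in this thread.

```python
import json
import uuid


def build_claim_check(bucket: str, key: str, size_bytes: int) -> bytes:
    """Build the small Kafka payload that references a large object in storage."""
    return json.dumps({
        "bucket": bucket,
        "key": key,
        "size_bytes": size_bytes,
    }).encode("utf-8")


def publish_large_payload(s3_client, producer, bucket: str, topic: str, data: bytes) -> str:
    """Upload a large payload to object storage, then send only a reference via Kafka."""
    # Hypothetical key layout; any unique key scheme works.
    key = f"payloads/{uuid.uuid4()}"
    # Store the large blob in S3 first...
    s3_client.put_object(Bucket=bucket, Key=key, Body=data)
    # ...then produce a tiny reference record, keeping Kafka messages small.
    producer.produce(topic, value=build_claim_check(bucket, key, len(data)))
    producer.flush()
    return key
```

Consumers read the reference record, then fetch the actual payload from object storage by bucket and key. This keeps record sizes small regardless of whether the source data is 20 KB or 200 MB.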
-
Hi all, we have a requirement where the incoming source data ranges from 20 KB to 200 MB, and we are using Kafka as the streaming data platform. I am not sure whether a topic, with or without partitions, can hold 200 MB records.
I would like to understand the recommendations on how to configure/manage topics so that there is no added latency and offsets are maintained properly. Basically, any guidelines for this kind of use case?
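For reference, record size in Kafka is capped by several settings that would all need to be raised to accept ~200 MB records. The values below are an illustrative sketch (200 MB expressed in bytes), not a recommendation; large records like this are generally discouraged:

```
# Broker-level (server.properties):
message.max.bytes=209715200
replica.fetch.max.bytes=209715200

# Topic-level override (kafka-topics.sh --config):
max.message.bytes=209715200

# Producer:
max.request.size=209715200

# Consumer:
max.partition.fetch.bytes=209715200
fetch.max.bytes=209715200
```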