Trino gets protobuf or avro data from redis #17913
Replies: 1 comment
Yes, both the Kafka and Redis connectors require you to provide a table description file to map the Avro bytes onto a table-like structure. The Avro payload alone is not sufficient: by itself it only deserializes into an object, and how that object maps onto a SQL table is not defined.
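For illustration, here is a minimal sketch of what such a table description file looks like, following the layout documented for the Redis connector (`key`/`value` sections). The table name, field names, and schema path are hypothetical, and whether the Redis connector actually honors `dataSchema` for Avro is exactly what this discussion is about:

```json
{
    "tableName": "users",
    "schemaName": "default",
    "key": {
        "dataFormat": "raw",
        "fields": [
            { "name": "redis_key", "type": "VARCHAR", "hidden": false }
        ]
    },
    "value": {
        "dataFormat": "avro",
        "dataSchema": "/etc/trino/avro/users.avsc",
        "fields": [
            { "name": "username", "mapping": "username", "type": "VARCHAR" }
        ]
    }
}
```

Each entry in `fields` maps a field of the decoded payload onto a SQL column, which is the "table-like structure" the connector needs beyond the raw Avro bytes.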
I use Python to establish a connection with Redis and write serialized Avro data into it, and I set the configuration file and the `dataSchema` path according to the format in the official documentation, but querying always fails with a "dataSchema is null" error.
I checked the source code and found that the Redis connector currently appears to support parsing Avro data in the specified format, but the corresponding interface defines no variable to receive the `dataSchema` parameter; the matching method and interface are defined in the Kafka connector's files. Does this mean that Trino does not yet support reading Avro data from Redis and parsing it automatically?
The figures below show the corresponding Redis configuration source code alongside the Kafka configuration source code.
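For context, the catalog configuration referred to above would typically look something like the following sketch. The node address, table name, and description directory are placeholders, assuming the property names documented for the Redis connector:

```properties
# etc/catalog/redis.properties (illustrative values)
connector.name=redis
redis.nodes=localhost:6379
redis.table-names=default.users
# Directory containing the JSON table description files
redis.table-description-dir=/etc/trino/redis
```

The "dataSchema is null" error reported above occurs even with this setup in place, since the schema path is specified inside the table description file rather than here.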