Avro custom decoding of UUID through kafka on consumer end

I’ve written a class to custom-encode UUID objects to bytes so they can be transported over Kafka with Avro.

To use this class, I put @AvroEncode(using = UUIDAsBytesEncoding.class) above the UUID field in my target object. (This is implemented by the Apache Avro reflect library.)
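For context, the annotated field in the target class looks roughly like the sketch below (based on the registered schema shown further down; the exact Request class layout is an assumption, and nullability annotations are omitted):

```java
import java.util.UUID;

import org.apache.avro.reflect.AvroEncode;

public class Request {
    private String password;
    private String email;

    // Tells Avro's reflect machinery to run this field through the custom encoder
    @AvroEncode(using = UUIDAsBytesEncoding.class)
    private UUID id;
}
```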

I’m having difficulty figuring out how to have my consumer automatically use the custom decoder. (or do I have to go in and manually decode it?).

Here is my UUIDAsBytesEncoding class, which extends CustomEncoding:

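A rough reconstruction of that class (the exact original is not shown here; the class name, schema prop, and byte layout are assumptions). Note the hand-written length in write(), which the answer below identifies as the actual problem:

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.UUID;

import org.apache.avro.Schema;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.Encoder;
import org.apache.avro.reflect.CustomEncoding;

public class UUIDAsBytesEncoding extends CustomEncoding<UUID> {

    public UUIDAsBytesEncoding() {
        // a plain bytes schema, tagged so the registered schema records the encoding
        schema = Schema.create(Schema.Type.BYTES);
        schema.addProp("CustomEncoding", "UUIDAsBytesEncoding");
    }

    @Override
    protected void write(Object datum, Encoder out) throws IOException {
        UUID uuid = (UUID) datum;
        ByteBuffer buffer = ByteBuffer.allocate(16);
        buffer.putLong(uuid.getMostSignificantBits());
        buffer.putLong(uuid.getLeastSignificantBits());
        // PROBLEM (see the answer below): writeBytes() already writes the length,
        // so writing it here as well produces an incorrect encoding.
        out.writeLong(16);
        out.writeBytes(buffer.array());
    }

    @Override
    protected UUID read(Object reuse, Decoder in) throws IOException {
        ByteBuffer buffer = in.readBytes(null);
        return new UUID(buffer.getLong(), buffer.getLong());
    }
}
```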

The write method and the encoding work well, but the read method never gets called during deserialization. How would I implement this in a consumer?

The schema on the registry looks like:

```json
{
  "type": "record",
  "name": "Request",
  "namespace": "xxxxxxx.xxx.xxx",
  "fields": [
    { "name": "password", "type": "string" },
    { "name": "email", "type": "string" },
    { "name": "id", "type": ["null", { "type": "bytes", "CustomEncoding": "UUIDAsBytesEncoding" }], "default": null }
  ]
}
```

If the consumer can’t automatically use that information to use the UUIDAsBytesEncoding read method, then how would I find the data marked with that tag in my consumer?

I am using the Confluent Schema Registry as well.

Any help would be appreciated!


Answer

I ended up finding the solution. The encoding was incorrect: the built-in writeBytes() method automatically writes the length for you, so the length should not be written manually as well.

Then, in the consumer, we must go through a GenericDatumWriter, write the record to a binary stream, and then read it back from that stream with a ReflectDatumReader. This automatically calls the UUIDAsBytesEncoding read() method and deserializes the UUID.

My consumer would look something like this (as part of a consumer group executor service walkthrough here):

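A sketch of the decoding step in such a consumer, assuming the Confluent KafkaAvroDeserializer has already handed us a GenericRecord (the helper class and method names here are illustrative, not taken from the original walkthrough):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.reflect.ReflectDatumReader;

public class RequestDecoder {

    // Re-serialize the GenericRecord to bytes, then read those bytes back with a
    // ReflectDatumReader so the @AvroEncode custom encodings are applied.
    public static Request decode(GenericRecord record) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(bytes, null);

        GenericDatumWriter<GenericRecord> writer =
                new GenericDatumWriter<>(record.getSchema());
        writer.write(record, encoder);
        encoder.flush();

        // Reading with the reflect reader is what triggers UUIDAsBytesEncoding.read()
        // for the annotated field (assumes the registered schema was generated from Request).
        ReflectDatumReader<Request> reader = new ReflectDatumReader<>(Request.class);
        BinaryDecoder decoder =
                DecoderFactory.get().binaryDecoder(bytes.toByteArray(), null);
        return reader.read(null, decoder);
    }
}
```

The round trip through GenericDatumWriter is what turns the registry-decoded GenericRecord back into a byte stream that the reflect reader can map onto the annotated Request class.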

Then the new UUIDAsBytesEncoding would look like:

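A sketch of the corrected class along those lines (the byte layout and schema prop are assumptions: the UUID's two longs written as a 16-byte bytes field, with writeBytes() handling the length):

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.UUID;

import org.apache.avro.Schema;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.Encoder;
import org.apache.avro.reflect.CustomEncoding;

public class UUIDAsBytesEncoding extends CustomEncoding<UUID> {

    public UUIDAsBytesEncoding() {
        schema = Schema.create(Schema.Type.BYTES);
        schema.addProp("CustomEncoding", "UUIDAsBytesEncoding");
    }

    @Override
    protected void write(Object datum, Encoder out) throws IOException {
        UUID uuid = (UUID) datum;
        ByteBuffer buffer = ByteBuffer.allocate(16);
        buffer.putLong(uuid.getMostSignificantBits());
        buffer.putLong(uuid.getLeastSignificantBits());
        // writeBytes() prefixes the length itself, so no manual length handling here
        out.writeBytes(buffer.array());
    }

    @Override
    protected UUID read(Object reuse, Decoder in) throws IOException {
        ByteBuffer buffer = in.readBytes(null);
        long mostSignificant = buffer.getLong();
        long leastSignificant = buffer.getLong();
        return new UUID(mostSignificant, leastSignificant);
    }
}
```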

This also serves as an example of how to implement a CustomEncoding class for Avro. The current version of Avro does not have a UUID serializer built in, so this is one solution to that problem.

User contributions licensed under: CC BY-SA