
Tag: confluent-schema-registry

Deserialization of JSON and Avro without a Schema Registry

I have been trying to implement Avro deserialization without the Confluent Schema Registry. Quick research shows that I can embed the schema in a header before sending the record to the topic. But the schema itself has to be serialized to bytes before being embedded in the header, which again makes this problematic. Is there any way to achieve this? What are
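One workable pattern here is to attach the schema’s JSON text as a Kafka record header and parse it back on the consumer side. Below is a minimal sketch of that approach, assuming the plain Apache Avro and Kafka clients libraries; the header key avro.schema and the byte-array key/value types are illustrative choices, not a standard:

import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SchemaInHeaderExample {

    // Producer side: serialize the record, then attach its schema JSON as a header.
    static ProducerRecord<byte[], byte[]> toRecord(String topic, GenericRecord record) throws Exception {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(record.getSchema()).write(record, encoder);
        encoder.flush();

        ProducerRecord<byte[], byte[]> pr = new ProducerRecord<>(topic, null, out.toByteArray());
        // The schema's JSON form, UTF-8 encoded; "avro.schema" is an arbitrary header key.
        pr.headers().add("avro.schema", record.getSchema().toString().getBytes(StandardCharsets.UTF_8));
        return pr;
    }

    // Consumer side: rebuild the schema from the header, then decode the payload with it.
    static GenericRecord fromRecord(ConsumerRecord<byte[], byte[]> cr) throws Exception {
        String schemaJson = new String(cr.headers().lastHeader("avro.schema").value(), StandardCharsets.UTF_8);
        Schema schema = new Schema.Parser().parse(schemaJson);
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(cr.value(), null);
        return new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
    }
}

The trade-off is that every record then carries the full schema JSON, which is exactly the per-message overhead a schema registry exists to avoid.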

How to use the Avro native decoder when using a PollableMessageSource input in Spring Cloud Stream?

I’m using a PollableMessageSource input to read from a Kafka topic. Messages on that topic are in Avro, and use-native-decoding was set to true when they were published. This is how I’m polling: pollableChannels is just an injected instance of this interface: After seeing that the TypeClassName is not being formed properly (its nested objects are set to null by
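For context, a polled consumer in this setup typically looks something like the following sketch. It assumes the Spring Cloud Stream Kafka binder with use-native-decoding=true and an Avro value deserializer configured on the binding, so the payload arrives already decoded; the class and field names are illustrative:

import org.apache.avro.generic.GenericRecord;
import org.springframework.cloud.stream.binder.PollableMessageSource;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class AvroPoller {

    // The bound polled input; the name pollableChannels mirrors the question.
    // The binding itself would be declared via spring.cloud.stream configuration.
    private final PollableMessageSource pollableChannels;

    public AvroPoller(PollableMessageSource pollableChannels) {
        this.pollableChannels = pollableChannels;
    }

    // Requires @EnableScheduling on a configuration class.
    @Scheduled(fixedDelay = 1000)
    public void poll() {
        // With use-native-decoding=true, the Kafka deserializer configured on the
        // binding (e.g. Confluent's KafkaAvroDeserializer) has already decoded the
        // payload before Spring sees it, so the message carries an Avro object
        // (here a GenericRecord; with specific records it would be the generated
        // class, the question's TypeClassName) rather than raw bytes.
        pollableChannels.poll(message -> {
            GenericRecord record = (GenericRecord) message.getPayload();
            System.out.println(record);
        });
    }
}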

Is there a need for a separate key schema (other than the schema for the fields) when writing an Avro producer?

I am writing Java Avro producer code for my project. I have registered an Avro schema for all the fields that need to be passed. My registered schema: { "name": "Claim", "type": "record", "namespace": "com.schema.avro", "fields": [ ] } I am using the "icn" field value as the key, but I don’t have a key schema registered separately. I am not
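A sketch of the usual resolution, under assumed Confluent serializer defaults: key and value serializers are configured independently, so a plain String key needs no registered key schema at all, and only the value subject gets registered. The topic name, the addresses, and the single icn field (the question elides the field list) are assumptions made so the example runs:

import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ClaimProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // illustrative address
        props.put("schema.registry.url", "http://localhost:8081"); // illustrative address
        // The key serializer is independent of the value serializer: a plain
        // String key needs no registered key schema.
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");

        // The question's schema with one illustrative "icn" field assumed.
        Schema schema = new Schema.Parser().parse(
                "{\"name\":\"Claim\",\"type\":\"record\",\"namespace\":\"com.schema.avro\","
              + "\"fields\":[{\"name\":\"icn\",\"type\":\"string\"}]}");

        GenericRecord claim = new GenericData.Record(schema);
        claim.put("icn", "ICN-12345"); // sample value

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // Reuse the icn field value as the record key. Because it is sent as a
            // String, only the value schema is registered (under the subject
            // "claims-value" with the default TopicNameStrategy).
            producer.send(new ProducerRecord<>("claims", claim.get("icn").toString(), claim));
        }
    }
}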

Flink Avro serialization shows a “not serializable” error when working with GenericRecords

I’m really having a hard time getting Flink to communicate properly with a running Kafka instance while using an Avro schema from the Confluent Schema Registry (for both key and value). After some thought and restructuring of my program, I was able to push my implementation this far: Producer Method GenericSerializer.java However, when I execute the job, it
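A common culprit behind that error is that org.apache.avro.Schema is not java.io.Serializable, so holding a Schema field in the serializer fails Flink’s closure check. Below is a minimal sketch of the workaround, assuming the older FlinkKafkaProducer/KafkaSerializationSchema API and writing plain Avro bytes (not the Confluent registry wire format): keep the schema as its JSON string, which is Serializable, and parse it lazily on the task manager.

import java.io.ByteArrayOutputStream;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema;
import org.apache.kafka.clients.producer.ProducerRecord;

public class GenericSerializer implements KafkaSerializationSchema<GenericRecord> {

    private final String topic;
    private final String schemaJson;  // Serializable representation of the schema
    private transient Schema schema;  // rebuilt lazily after the task is deserialized

    public GenericSerializer(String topic, Schema schema) {
        this.topic = topic;
        this.schemaJson = schema.toString();
    }

    @Override
    public ProducerRecord<byte[], byte[]> serialize(GenericRecord element, Long timestamp) {
        try {
            if (schema == null) {
                schema = new Schema.Parser().parse(schemaJson);
            }
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
            new GenericDatumWriter<GenericRecord>(schema).write(element, encoder);
            encoder.flush();
            return new ProducerRecord<>(topic, out.toByteArray());
        } catch (Exception e) {
            throw new RuntimeException("Failed to serialize record", e);
        }
    }
}

If the registry wire format (magic byte plus schema id) is needed, newer Flink versions also ship a ConfluentRegistryAvroSerializationSchema in the flink-avro-confluent-registry module, which may be a better fit than hand-rolling the serializer.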
