
Tag: avro

Avro schema file references in an Avro schema

I have nearly 100 avsc files, and most of them refer to another avsc file, usually as their type. To give an example: Item.avsc, ItemId.avsc, and Features.avsc all live in ./com/example/common. When I try to parse the schema of Item.avsc, it raises an error. I found a workaround to this problem by using a single instance of a parser
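A minimal sketch of that workaround, reusing the file names from the question (the parse order is an assumption): a single Schema.Parser remembers every named type it has parsed, so referenced schemas resolve as long as they are parsed before the schemas that use them.

import org.apache.avro.Schema;

import java.io.File;
import java.io.IOException;

public class SchemaLoader {
    public static void main(String[] args) throws IOException {
        // One shared parser: each parse() call registers the named types it
        // finds, so Item.avsc can refer to ItemId and Features by name.
        Schema.Parser parser = new Schema.Parser();
        parser.parse(new File("com/example/common/ItemId.avsc"));
        parser.parse(new File("com/example/common/Features.avsc"));
        Schema item = parser.parse(new File("com/example/common/Item.avsc"));
        System.out.println(item.toString(true));
    }
}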

AvroParquetOutputFormat – Unable to Write Arrays with Null Elements

I’m using v1.11.1 of the parquet-mr library as part of a Java application that takes Avro records and writes them into Parquet files using AvroParquetOutputFormat. Some of the Avro records have array-type fields that contain null elements. Here’s an example Avro schema: I’m trying to write the following record: I thought I could use the 3-level list
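The excerpt is cut off, but the usual suspect here is parquet-avro’s legacy two-level list encoding, which cannot represent null elements. A hedged sketch of switching a job to the three-level layout (assuming the array’s element type is a ["null", ...] union):

import org.apache.avro.Schema;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.parquet.avro.AvroParquetOutputFormat;
import org.apache.parquet.avro.AvroWriteSupport;

public class NullableListJob {
    public static Job configure(Configuration conf, Schema schema) throws Exception {
        // The legacy 2-level list layout cannot hold null elements; the
        // 3-level layout from the Parquet spec can.
        conf.setBoolean(AvroWriteSupport.WRITE_OLD_LIST_STRUCTURE, false);
        Job job = Job.getInstance(conf);
        AvroParquetOutputFormat.setSchema(job, schema);
        return job;
    }
}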

How to fix No processor claimed any of these annotations: org.apache.avro.specific.AvroGenerated?

I’m getting this error: No processor claimed any of these annotations: org.apache.avro.specific.AvroGenerated. How I got the error: I was trying to implement Avro serialization and deserialization, so I generated an Avro class from an .avsc file using the sbt-avro plugin. The generated class is annotated with @org.apache.avro.specific.AvroGenerated. I tried commenting the annotation out, but that didn’t solve the issue,
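The annotation itself is just a marker and needs no processor at compile time, so one commonly suggested fix is to turn annotation processing off with javac’s -proc:none flag (in sbt, roughly javacOptions ++= Seq("-proc:none")). A minimal sketch of the same idea through the JavaCompiler API, with a hypothetical source path:

import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class CompileWithoutProcessing {
    public static void main(String[] args) {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        // -proc:none disables annotation processing, so javac no longer
        // complains that no processor claimed @AvroGenerated.
        int exitCode = compiler.run(null, null, null,
                "-proc:none", "target/generated/Item.java"); // hypothetical path
        System.out.println(exitCode == 0 ? "compiled" : "failed");
    }
}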

Deserialization of JSON and Avro without Schema

I have been trying to implement Avro deserialization without the Confluent Schema Registry. A quick search shows that I can embed the schema in a header before sending the record to the topic, but the schema itself has to be serialized to bytes before it can be embedded in the header, which again makes this problematic. Is there any way to achieve this? What are
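One way to sidestep the registry, sketched below under the assumption that raw binary Avro plus a schema header is acceptable: a schema’s JSON form is just a UTF-8 string, so putting it in a record header needs no extra serialization machinery (the topic and header names are illustrative). A consumer can rebuild it with new Schema.Parser().parse(...) on the header value; Avro’s single-object encoding with schema fingerprints is a more compact alternative.

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class SchemaInHeader {
    public static ProducerRecord<String, byte[]> encode(GenericRecord record)
            throws IOException {
        Schema schema = record.getSchema();
        // Encode the record as raw binary Avro.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(record, encoder);
        encoder.flush();
        ProducerRecord<String, byte[]> pr =
                new ProducerRecord<>("my-topic", null, out.toByteArray());
        // The schema's JSON text goes into the header as plain UTF-8 bytes.
        pr.headers().add("avro-schema",
                schema.toString().getBytes(StandardCharsets.UTF_8));
        return pr;
    }
}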

How to use the Avro native decoder with a PollableMessageSource input in Spring Cloud Stream?

I’m using a PollableMessageSource input to read from a Kafka topic. Messages on that topic are in Avro, and use-native-decoding was set to true when they were published. This is how I’m polling: pollableChannels is just an injected instance of this interface: After seeing that the TypeClassName is not being formed properly (its nested objects are set to null by
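The two-argument poll overload on PollableMessageSource accepts a ParameterizedTypeReference that tells the binder what to convert the payload to; with native decoding enabled, the configured Kafka deserializer produces the object directly. A sketch, where TypeClassName stands for the question’s generated Avro class:

import org.springframework.cloud.stream.binder.PollableMessageSource;
import org.springframework.core.ParameterizedTypeReference;

public class AvroPoller {
    private final PollableMessageSource pollableChannels;

    public AvroPoller(PollableMessageSource pollableChannels) {
        this.pollableChannels = pollableChannels;
    }

    public boolean pollOnce() {
        // The type hint stops the binder from falling back to a converter
        // that leaves nested fields null.
        return pollableChannels.poll(
                message -> {
                    TypeClassName payload = (TypeClassName) message.getPayload();
                    // handle payload ...
                },
                new ParameterizedTypeReference<TypeClassName>() {});
    }
}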

Avro logicalType: String date conversion to epoch timestamp-millis

I have the schema below, and I intend to supply a date to it and have a conversion turn that date into epoch millis. //Conversion code But this code doesn’t work… I was expecting it to convert to timestamp-millis (I hardcoded 123L in the sample above). Any help? Answer: Referring back to How to define a LogicalType in Avro (Java), I managed to solve
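The linked answer aside, the conversion itself is plain java.time arithmetic: parse the incoming date string, pin it to a zone, and take toEpochMilli() instead of hardcoding 123L. A sketch with an assumed field name and date format:

import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecordBuilder;

import java.time.LocalDate;
import java.time.ZoneOffset;

public class DateToEpochMillis {
    public static void main(String[] args) {
        // A long field carrying the timestamp-millis logical type.
        Schema schema = SchemaBuilder.record("Event").fields()
                .name("createdAt")
                .type(LogicalTypes.timestampMillis()
                        .addToSchema(Schema.create(Schema.Type.LONG)))
                .noDefault()
                .endRecord();

        // ISO date string -> midnight UTC -> epoch millis.
        long epochMillis = LocalDate.parse("2021-03-01")
                .atStartOfDay(ZoneOffset.UTC)
                .toInstant()
                .toEpochMilli();

        GenericData.Record record = new GenericRecordBuilder(schema)
                .set("createdAt", epochMillis)
                .build();
        System.out.println(record);
    }
}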

Is there a need for a separate key schema (other than the schema for the fields) when writing an Avro producer?

I am writing Java Avro producer code for my project. I have registered an Avro schema for all the fields that need to be passed. My registered schema: { "name": "Claim", "type": "record", "namespace": "com.schema.avro", "fields": [ ] } I am using the "icn" field value as the key, but I don’t have a separately registered key schema. I am not
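If the key is just the "icn" string, no second Avro schema is required: register the value schema only and use a plain string serializer for the key. A sketch with assumed broker, registry, and topic names, and a hypothetical buildClaim() helper:

import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class ClaimProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        // A plain String key needs no Avro key schema; only the value is
        // Avro here, so only the value serializer talks to the registry.
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            GenericRecord claim = buildClaim(); // hypothetical helper
            String key = claim.get("icn").toString();
            producer.send(new ProducerRecord<>("claims", key, claim));
        }
    }

    private static GenericRecord buildClaim() {
        // Placeholder: build a record from the registered Claim schema.
        throw new UnsupportedOperationException("build from the Claim schema");
    }
}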

Flink Avro Serialization shows “not serializable” error when working with GenericRecords

I’m really having a hard time getting Flink to communicate properly with a running Kafka instance using an Avro schema from the Confluent Schema Registry (for both key and value). After a while of thinking and restructuring my program, I was able to push my implementation this far: Producer method GenericSerializer.java However, when I execute the job, it
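The excerpt stops before the stack trace, but a frequent cause of this error is holding a non-serializable field (in older Avro versions, Schema itself) inside the SerializationSchema, which Flink rejects when it ships the operator. A sketch of how a GenericSerializer could store the schema as a JSON string and rebuild it lazily (this restructuring is an assumption, not the asker’s code):

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.flink.api.common.serialization.SerializationSchema;

import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class GenericSerializer implements SerializationSchema<GenericRecord> {
    // Strings serialize cleanly; the Schema and writer are rebuilt per task.
    private final String schemaJson;
    private transient Schema schema;
    private transient GenericDatumWriter<GenericRecord> writer;

    public GenericSerializer(Schema schema) {
        this.schemaJson = schema.toString();
    }

    @Override
    public byte[] serialize(GenericRecord element) {
        if (writer == null) {
            schema = new Schema.Parser().parse(schemaJson);
            writer = new GenericDatumWriter<>(schema);
        }
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
            writer.write(element, encoder);
            encoder.flush();
            return out.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException("Avro serialization failed", e);
        }
    }
}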

Avro – java.io.IOException: Not a data file

I am using https://github.com/allegro/json-avro-converter to convert my JSON message into an Avro file. After calling the convertToAvro method I get a byte array: byte[] byteArrayJson. Then I am using the commons library from Apache: The file is created. When I try to convert it back to JSON, using: I have created a JUnit test and used the convertToJson method (from the previous
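The excerpt loses the exception context, but the message itself is telling: DataFileReader and friends require the Avro object container format, while convertToAvro returns raw binary-encoded Avro with no container header, so reading the written file back as a "data file" fails. Two hedged options, assuming a GenericRecord payload:

import org.apache.avro.Schema;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;

import java.io.File;
import java.io.IOException;

public class RawAvroBytes {
    // Option 1: decode the raw bytes directly, no container file involved.
    public static GenericRecord decode(byte[] avroBytes, Schema schema)
            throws IOException {
        Decoder decoder = DecoderFactory.get().binaryDecoder(avroBytes, null);
        return new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
    }

    // Option 2: write a real container file that DataFileReader can open.
    public static void writeContainerFile(GenericRecord record, Schema schema,
                                          File target) throws IOException {
        try (DataFileWriter<GenericRecord> writer =
                     new DataFileWriter<>(new GenericDatumWriter<>(schema))) {
            writer.create(schema, target);
            writer.append(record);
        }
    }
}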
