I need to parse messages from Confluent Kafka stored in Avro. Without a filter I was able to write back into Kafka, but when I apply a filter I get a NullPointerException. Here is the schema: Answer When you have a null value in your topic for col3 and the filter expression dereferences that field, the null is what triggers the NullPointerException.
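The question does not include the actual filter code, so here is a minimal plain-Java sketch (not the original framework's API) of why filtering on a nullable Avro field throws, and how a null-safe guard avoids it. The record shape and the `col3` field name follow the question; everything else is illustrative.

```java
import java.util.HashMap;
import java.util.Map;

public class NullSafeFilter {
    // Returns true only when col3 is present and equals the wanted value.
    // A naive record.get("col3").toString().equals(wanted) throws a
    // NullPointerException as soon as col3 is null (legal for optional
    // Avro fields), which matches the symptom in the question.
    static boolean matches(Map<String, Object> record, String wanted) {
        Object col3 = record.get("col3");           // may be null
        return col3 != null && wanted.equals(col3.toString());
    }

    public static void main(String[] args) {
        Map<String, Object> withValue = new HashMap<>();
        withValue.put("col3", "abc");
        Map<String, Object> withNull = new HashMap<>();
        withNull.put("col3", null);

        System.out.println(matches(withValue, "abc")); // true
        System.out.println(matches(withNull, "abc"));  // false, no NPE
    }
}
```

The same idea applies in any streaming filter: test the field for null before calling methods on it, or coalesce it to a default first.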
Tag: confluent-platform
KsqlClientException: Received 401 response from server: Unauthorized. Error code: 40100
Trying to connect to Confluent-hosted ksqlDB, and I get an exception during client.listStreams().get(). What am I missing here? Answer Did you check whether the API_KEY/API_SECRET on your Confluent Cloud cluster is granted access to ksqlDB?
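A 401 means the server rejected the credentials sent with the request. The ksqlDB Java client sends the API key/secret as HTTP Basic auth (set via `setBasicAuthCredentials` on its client options), so a quick sanity check is to confirm the key/secret pair you configured actually produces the header you expect. A stdlib-only sketch of how that header is built (the key and secret below are placeholders):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuthHeader {
    // Builds the value of the HTTP Authorization header that Basic auth
    // sends on every request: "Basic " + base64(key + ":" + secret).
    static String authorizationHeader(String apiKey, String apiSecret) {
        String credentials = apiKey + ":" + apiSecret;
        return "Basic " + Base64.getEncoder()
                .encodeToString(credentials.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        // Placeholder credentials -- substitute your Confluent Cloud values.
        System.out.println(authorizationHeader("MY_API_KEY", "MY_API_SECRET"));
    }
}
```

If the header is correct and you still get 401, the key is likely valid for the Kafka cluster but not authorized for the ksqlDB application, which is exactly what the answer above suggests checking.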
Strategy to choose when doing Serialization and Deserialization using spring-kafka library
I need to serialize and deserialize any type of Java object in my project: it might be an Integer, a String, a generic <T>, a User, or an Account. Since there may be more than one type, I am not sure which strategy to use when configuring a Kafka producer and consumer. There are JsonSerializer and JsonDeserializer, StringSerializer and StringDeserializer, and many more types. I have read
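One common approach when the value type varies is spring-kafka's JSON (de)serializers, which record the payload's type in a message header on produce and use it on consume. A sketch of the Spring Boot configuration, assuming Boot auto-configuration; `com.example.model` is a placeholder for your own package:

```properties
# Producer: serialize any value object as JSON (type info goes into headers)
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer

# Consumer: deserialize back using the type headers written by JsonSerializer
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer

# Only deserialize into classes from packages you trust (placeholder package)
spring.kafka.consumer.properties.spring.json.trusted.packages=com.example.model
```

The trusted-packages setting matters because the deserializer instantiates whatever class the header names; restricting it to your own model package avoids deserializing arbitrary classes.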
Kafka Dependencies – ccs vs ce
To develop my Kafka connector I need to add a connect-api dependency. Which one should I use? For example, the MongoDB connector uses connect-api from Maven Central, but links from the dev guide go to https://packages.confluent.io/maven/org/apache/kafka/connect-api/5.5.0-ccs/, and besides 5.5.0-ccs there is also a 5.5.0-ce version. So, at this moment the latest versions are: 2.5.0 from Maven Central, 5.5.0-ccs from packages.confluent.io/maven, and 5.5.0-ce from packages.confluent.io/maven. What is the difference, and which should I use?
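My understanding is that the -ccs and -ce suffixes mark Confluent's own builds of the Kafka artifacts (community-licensed and enterprise, respectively), while the unsuffixed version is the Apache Kafka release they are based on (5.5.0-ccs corresponds to Kafka 2.5.0). For a connector intended to run on any Connect runtime, the plain Maven Central artifact with `provided` scope is the usual choice, since the runtime supplies these classes at run time. A sketch of the pom.xml fragment:

```xml
<!-- Apache Kafka connect-api from Maven Central; the Connect runtime
     provides these classes, so the dependency is compile-time only. -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>connect-api</artifactId>
    <version>2.5.0</version>
    <scope>provided</scope>
</dependency>
```

If you specifically target Confluent Platform and pull other artifacts from packages.confluent.io, matching its -ccs build of the same version keeps the classpath consistent.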