I’ve set up Kafka on my Google Cloud instance and was able to run the console producer and consumer commands successfully. Now I’m trying to run this simple Kafka producer code in Java, but I’m not able to execute it after compilation. Compilation works perfectly and SimpleProducer.class is generated. However, when I try the execution command,
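For reference, a minimal sketch of the kind of producer the question describes; the broker address and topic name are assumptions, not taken from the original code:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // Send a single record to an assumed topic name and flush before exiting
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("test-topic", "key", "hello"));
            producer.flush();
        }
    }
}
```

Running a class like this requires the Kafka client jars on the runtime classpath as well as the compile classpath, which is a common source of failures at the execution step.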
Error while uploading to S3 from Hadoop using s3a
I am running a Java application on Kubernetes which uploads multiple files from the local container to an S3 bucket using s3a, but I am getting the exception below in the logs and the files are not being uploaded to S3; only partial files reach S3. Answer: This looks like a bug; the problems are caused by AWS SDK classes pulled in from localstack-utils-fat.jar.
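For context, a hedged sketch of the kind of s3a upload the question describes, using the Hadoop FileSystem API; the bucket name, paths, and credential settings are placeholders, not taken from the original code:

```java
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class S3aUpload {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Credentials are placeholders; in practice they usually come from a credential provider
        conf.set("fs.s3a.access.key", "ACCESS_KEY");
        conf.set("fs.s3a.secret.key", "SECRET_KEY");

        // Copy a local file into an assumed bucket/prefix via the s3a filesystem
        try (FileSystem fs = FileSystem.get(new URI("s3a://my-bucket"), conf)) {
            fs.copyFromLocalFile(new Path("/tmp/local-file.txt"),
                                 new Path("s3a://my-bucket/uploads/local-file.txt"));
        }
    }
}
```

As the answer suggests, a mismatch between the AWS SDK classes bundled in localstack-utils-fat.jar and the ones hadoop-aws expects on the classpath can break uploads like this even when the code itself is correct.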
Kafka Streams: SerializationException: Size of data received by LongDeserializer is not 8
I have a small app to count the number of colors using Apache Kafka. The topics are created and the producers/consumers are started from the terminal: I provided the following inputs in the terminal: I received this error in the consumer terminal: What’s the issue here? I did some brief research and found similar questions asked by other people, but,
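This error usually means the bytes being read were not produced by a LongSerializer (which always writes 8 bytes), for example when String-encoded data is read with a LongDeserializer. A hedged sketch of a color-count topology with explicit serdes (topic names are assumptions): the count values are Longs, so any console consumer reading the output topic needs org.apache.kafka.common.serialization.LongDeserializer for values, while the input stays String/String.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class ColorCount {
    public static void build(StreamsBuilder builder) {
        // Input topic carries String keys/values, so consume with String serdes
        KStream<String, String> colors =
                builder.stream("colors-input", Consumed.with(Serdes.String(), Serdes.String()));

        // Count occurrences per color; the resulting values are Longs
        KTable<String, Long> counts = colors
                .groupBy((key, color) -> color)
                .count();

        // Write Long values, so readers of this topic must use a LongDeserializer
        counts.toStream().to("colors-count", Produced.with(Serdes.String(), Serdes.Long()));
    }
}
```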
Spring Kafka Template – Connect to Kafka Topic on Spring Boot Startup
I have implemented a basic Spring Boot application which uses Spring Kafka. I want my producer to connect to the Kafka topic before the first .send() is called, but I can’t find a way to do so. Is that possible? The logs show that KafkaTemplate only connects to the Kafka topic after I trigger the .send() method at 16:12:44: Answer
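One common way to force the connection at startup (a hedged sketch, not necessarily the original answer) is to request topic metadata once the application is ready, e.g. via kafkaTemplate.partitionsFor(...), which makes the underlying producer connect before the first real .send(). The topic name below is an assumption:

```java
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.event.EventListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class KafkaWarmUp {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public KafkaWarmUp(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @EventListener(ApplicationReadyEvent.class)
    public void connectEarly() {
        // Fetching partition metadata forces the producer to connect to the cluster
        kafkaTemplate.partitionsFor("my-topic"); // assumed topic name
    }
}
```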
Kafka Connect Error : java.lang.NoClassDefFoundError: org/apache/http/conn/HttpClientConnectionManager
I’m using Docker with Kafka and ClickHouse. I want to connect a ksqlDB table and ClickHouse using Kafka Connect, so I referred to this document and modified my docker-compose. Here is my docker-compose, and this is Dockerfile-kafka-connect. I typed this command in ksqlDB (the ‘S3_FINAL’ topic has AVRO format for both the multi-key and the values), and it doesn’t work. So I read
How to split a message into more messages
If I have a KStream with KVs like <String, List<myobject>> that I obtained from a KStream groupBy + aggregate, is there a way to split each value of the List<> to get individual messages like <String, myobject>? I was hoping for something like flattening a list that would return individual messages with the same key, but I couldn’t
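Kafka Streams’ flatMapValues does exactly this kind of flattening: each element of the List becomes its own record with the original key preserved. A minimal sketch, with the value type standing in as a placeholder for myobject:

```java
import java.util.List;
import org.apache.kafka.streams.kstream.KStream;

public class Flatten {
    // Placeholder value type standing in for "myobject"
    public static class MyObject { }

    public static KStream<String, MyObject> flatten(KStream<String, List<MyObject>> grouped) {
        // Each element of the List becomes a separate <String, MyObject> record,
        // keeping the original key
        return grouped.flatMapValues(list -> list);
    }
}
```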
NoClassDefFoundError: scala/collection/convert/AsJavaExtensions
I am creating an EmbeddedKafkaCluster in a Java test but getting the following exception, even though I have added the kafka_2.12 dependencies, which include the Scala dependencies. Java version: 11. Added the following dependencies. Exception: Answer: Upgrading to kafka_2.13 solved the issue.
Kafka – Re-process messages in topics
I have a Kafka cluster. At the moment, just for testing, there is only one topic, and one consumer is taking messages from that topic, processing them, and storing them in the database. But if I have any problem storing in the database and an exception is thrown, for example a PersistenceException, then the message flow
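A hedged sketch of one way to re-process a record when the database write fails: disable auto-commit, commit only after a successful write, and on an exception seek back to the failed offset so the same message is redelivered on the next poll. The topic name and the saveToDatabase call are assumptions, and the sketch assumes a single-partition topic as in the test setup described:

```java
import java.time.Duration;
import java.util.Collections;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class RetryingConsumer {

    // Assumes enable.auto.commit=false on the consumer
    public static void run(KafkaConsumer<String, String> consumer) {
        consumer.subscribe(Collections.singletonList("my-topic")); // assumed topic name
        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
            for (ConsumerRecord<String, String> record : records) {
                TopicPartition tp = new TopicPartition(record.topic(), record.partition());
                try {
                    saveToDatabase(record.value()); // hypothetical persistence call
                    // Commit exactly this record's offset once it is safely stored
                    consumer.commitSync(Collections.singletonMap(
                            tp, new OffsetAndMetadata(record.offset() + 1)));
                } catch (Exception e) {
                    // Rewind so the same message is delivered again on the next poll
                    consumer.seek(tp, record.offset());
                    break;
                }
            }
        }
    }

    private static void saveToDatabase(String value) { /* hypothetical */ }
}
```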
How to setup Kafka Idempotent Producer in Spring Boot?
We would like to store data in Kafka using exactly-once semantics in order to avoid message duplication. Producer with the following properties: Kafka topic description: Integration test: Our expectation is that only one message should be stored in Kafka, but the actual result is 3 messages. How can I make exactly-once semantics work with Kafka? What is missing in my configuration?
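For reference, a hedged sketch of the plain Kafka client settings that enable an idempotent producer (the broker address is an assumption; in Spring Boot these keys can be passed through spring.kafka.producer.properties.*). Note that enable.idempotence only prevents duplicates caused by the producer's internal retries of the same record; it does not deduplicate separate send() calls made by the application, which is a common reason a test like this still sees 3 messages:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

public class IdempotentProducerConfig {
    public static KafkaProducer<String, String> create() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Idempotence requires acks=all and retries enabled; it deduplicates broker-side
        // retries of the same record, not repeated send() calls from the application
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.RETRIES_CONFIG, Integer.MAX_VALUE);

        return new KafkaProducer<>(props);
    }
}
```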
How to split a string into different Kafka topics based on some conditions
I am trying to split a string into different Kafka topics based on conditions. Here is the topology: split the string into words, and match every word against the conditions (here a set of good words and a set of bad words). If at least one word from the bad-words set is found in the string, it is sent to the Bad-string topic; otherwise it will
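A hedged sketch of this kind of topology using the Kafka Streams split()/branch() API; the input topic, the word sets, and the good-side topic name are assumptions, while Bad-string is the topic named in the question. Records containing at least one bad word go to Bad-string, everything else to the default branch:

```java
import java.util.Arrays;
import java.util.Set;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Branched;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class GoodBadSplit {
    private static final Set<String> BAD_WORDS = Set.of("bad", "worse"); // assumed word set

    public static void build(StreamsBuilder builder) {
        KStream<String, String> input =
                builder.stream("input-strings", Consumed.with(Serdes.String(), Serdes.String()));

        // A string is "bad" if any of its words appears in the bad-words set
        input.split()
             .branch((key, value) -> Arrays.stream(value.split("\\s+"))
                                           .anyMatch(BAD_WORDS::contains),
                     Branched.withConsumer(bad ->
                             bad.to("Bad-string", Produced.with(Serdes.String(), Serdes.String()))))
             .defaultBranch(Branched.withConsumer(good ->
                     good.to("Good-string", Produced.with(Serdes.String(), Serdes.String()))));
    }
}
```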