Tag: apache-kafka

Executing a simple Kafka producer program in Java

I’ve set up Kafka on my Google Cloud instance and was able to run the producer and consumer console commands successfully. Now I’m trying to run this simple Kafka producer code in Java, but I can’t get it to execute after compilation. Compilation works perfectly and the SimpleProducer.class is generated. However, when I try the execution command,
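For context, a minimal sketch of what such a SimpleProducer might look like is below; the broker address and topic name are placeholders, and it assumes the kafka-clients jar is available.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address is a placeholder; use the advertised listener of your instance
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            // Topic name "test" is a placeholder
            producer.send(new ProducerRecord<>("test", "key", "hello kafka"));
        }
    }
}
```

Note that the same kafka-clients jar (and its dependencies) used for compilation typically has to be on the runtime classpath as well, otherwise the JVM fails at class-loading time even though compilation succeeded.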

Error while uploading to S3 from Hadoop using s3a

I am running a Java application on Kubernetes which uploads multiple files from the local container to an S3 bucket using s3a, but I am getting the below exception in the logs and the files are not getting uploaded to S3; only partial files are uploaded. Answer: Looks like this is a bug, caused by AWS SDK classes from the localstack-utils-fat.jar
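The excerpt does not show the upload code, but a typical s3a upload from Java goes through Hadoop’s FileSystem API roughly like this sketch; the bucket name and paths are placeholders, and credentials are assumed to come from the environment.

```java
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class S3AUploadSketch {
    public static void main(String[] args) throws Exception {
        // Credentials are assumed to come from the environment (IAM role, env vars, etc.)
        Configuration conf = new Configuration();

        // Bucket name and paths below are placeholders
        try (FileSystem fs = FileSystem.get(URI.create("s3a://my-bucket"), conf)) {
            fs.copyFromLocalFile(new Path("/tmp/local-file.csv"),
                                 new Path("s3a://my-bucket/uploads/local-file.csv"));
        }
    }
}
```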

Kafka Connect Error: java.lang.NoClassDefFoundError: org/apache/http/conn/HttpClientConnectionManager

I’m using Docker with Kafka and ClickHouse. I want to connect a KsqlDB table and ClickHouse using Kafka Connect, so I referred to this document and modified my docker-compose. Here is my docker-compose, and this is my Dockerfile-kafka-connect. Then I typed this command in KsqlDB. (The ‘S3_FINAL’ topic has AVRO format for both the multi-key and the values.) And it doesn’t work. So I read
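The error itself says the Connect worker cannot find org.apache.http.conn.HttpClientConnectionManager at runtime, i.e. the Apache HttpClient jar is not visible on the worker’s classpath or plugin path. A small, hedged way to check whether a given JVM can see that class (the class name is taken straight from the error) is a sketch like this:

```java
public class HttpClientPresenceCheck {
    public static void main(String[] args) {
        try {
            // The class reported missing by Kafka Connect in the question
            Class.forName("org.apache.http.conn.HttpClientConnectionManager");
            System.out.println("httpclient is on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("httpclient is NOT on the classpath; "
                    + "the Apache HttpClient jar would need to be added to the worker image or plugin directory");
        }
    }
}
```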

How to split a message into multiple messages

If I have a KStream with key/value pairs like <String, List<myobject>> that I obtained from a KStream groupBy + aggregate, is there a way to split each value of the List<> to get individual messages like <String, myobject>? I was hoping for something like flattening a list that would return individual messages with the same key, but I couldn’t
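Kafka Streams does have an operator for exactly this shape of problem: flatMapValues, which emits zero or more output values per input record while keeping the key. A minimal sketch, assuming the aggregate produced a KTable<String, List<MyObject>> (MyObject standing in for the question’s myobject type):

```java
import java.util.List;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

public class FlattenSketch {
    // MyObject stands in for the question's value type
    public static class MyObject { }

    public static KStream<String, MyObject> flatten(KTable<String, List<MyObject>> aggregated) {
        // toStream() converts the aggregate back to a stream of change records,
        // then flatMapValues emits one record per list element, keeping the key unchanged
        return aggregated.toStream()
                         .flatMapValues(list -> list);
    }
}
```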

Kafka – Re-process messages in topics

I have a Kafka cluster. At the moment, just for testing, there is only one topic, and a single consumer is taking messages from that topic, processing them, and storing them in the database. But if I have any problem storing to the database and an exception is thrown, for example a PersistenceException, then the message flow
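One common pattern for re-processing a failed message is to turn off auto-commit, commit the offset only after the database write succeeds, and seek back to the failed offset when the write throws. A rough sketch under those assumptions follows; the broker address, group id, topic name, and saveToDatabase are placeholders, and it is simplified for the single-topic, single-consumer setup described.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class RetryingConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("group.id", "db-writer");               // placeholder group id
        props.put("enable.auto.commit", "false");         // commit only after a successful write
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic")); // placeholder topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    TopicPartition tp = new TopicPartition(record.topic(), record.partition());
                    try {
                        saveToDatabase(record.value()); // hypothetical persistence call
                        // advance the committed offset only for the record just processed
                        consumer.commitSync(Collections.singletonMap(
                                tp, new OffsetAndMetadata(record.offset() + 1)));
                    } catch (RuntimeException e) { // e.g. a PersistenceException
                        // rewind so the next poll re-delivers the failed record
                        consumer.seek(tp, record.offset());
                        break;
                    }
                }
            }
        }
    }

    private static void saveToDatabase(String value) {
        // placeholder for the real persistence logic
    }
}
```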

How to set up a Kafka Idempotent Producer in Spring Boot?

We would like to store data in Kafka using exactly-once semantics in order to avoid message duplication. The producer has the following properties: Kafka topic description: Integration test: Our expectation is that only one message should be stored in Kafka, but the actual result is 3 messages. How can I make exactly-once semantics work with Kafka? What is missing in my configuration?
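For reference, an idempotent producer needs only a handful of properties; a minimal Spring Boot sketch using spring-kafka might look like the following (the bootstrap address is a placeholder).

```java
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class IdempotentProducerConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // Idempotence requires acks=all and a bounded number of in-flight requests
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION, 5);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate(ProducerFactory<String, String> pf) {
        return new KafkaTemplate<>(pf);
    }
}
```

It is worth noting that enable.idempotence only deduplicates broker-side retries of a single send; if the application itself calls send() three times with the same payload, three records are stored. Exactly-once across application-level retries needs Kafka transactions or consumer-side deduplication.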
