Kafka · DevOps · ~20 mins

Idempotent producer in Kafka - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️ Badge: Kafka Idempotent Producer Mastery
Predict Output · intermediate
What is the output when sending messages with the idempotent producer enabled?
Consider the following Kafka producer configuration and code snippet. What will be the output in the Kafka topic after running this code?
Java
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("enable.idempotence", "true");

KafkaProducer<String, String> producer = new KafkaProducer<>(props);

for (int i = 0; i < 3; i++) {
    producer.send(new ProducerRecord<>("my-topic", "key1", "message"));
}
producer.flush();
producer.close();
A. The topic will contain 3 identical messages with key 'key1' and value 'message'.
B. The topic will contain 1 message with key 'key1' and value 'message' because idempotence prevents duplicates.
C. The topic will contain no messages because the producer is idempotent and suppresses all sends.
D. The code will throw a runtime exception because sending multiple messages with the same key is not allowed.
💡 Hint
The idempotent producer prevents duplicates caused by internal retries, not deliberate repeated sends in the same session.
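To make the hint concrete, here is a minimal sketch (class name and broker address are assumptions, not part of the quiz): the broker deduplicates only internal retries of the same batch, tracked per producer ID and sequence number, so three separate send() calls produce three distinct records.

```java
import java.util.Properties;

public class IdempotentSketch {
    // Sketch of the quiz configuration; "localhost:9092" is assumed.
    static Properties producerConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("enable.idempotence", "true");
        return props;
    }

    public static void main(String[] args) {
        Properties props = producerConfig();
        // With enable.idempotence=true the broker rejects only resends
        // carrying an already-seen (producer ID, sequence number) pair.
        // Each explicit send() below gets a fresh sequence number, so
        // all three records land in the topic:
        //
        //   KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        //   for (int i = 0; i < 3; i++) {
        //       producer.send(new ProducerRecord<>("my-topic", "key1", "message"));
        //   }
        System.out.println(props.getProperty("enable.idempotence")); // prints "true"
    }
}
```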
Predict Output · intermediate
What happens if 'enable.idempotence' is set to false but 'acks' is set to 'all'?
Given the following Kafka producer configuration, what will happen when sending messages?
Java
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("enable.idempotence", "false");
props.put("acks", "all");

KafkaProducer<String, String> producer = new KafkaProducer<>(props);
producer.send(new ProducerRecord<>("my-topic", "key", "value"));
producer.flush();
producer.close();
A. The producer sends messages but with a warning logged about inconsistent configuration.
B. The producer throws a ConfigException because 'acks=all' requires 'enable.idempotence=true'.
C. The producer silently enables idempotence despite the configuration to ensure data safety.
D. The producer sends messages successfully with full acknowledgment but without idempotence guarantees.
💡 Hint
Idempotence and acks are related, but the dependency runs in only one direction in Kafka configuration.
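A config sketch that restates the hint (class name and broker address are assumptions): acks=all is a standalone durability setting and is valid without idempotence; it is enable.idempotence=true that requires acks=all, not the reverse.

```java
import java.util.Properties;

public class AcksWithoutIdempotence {
    // Sketch of the quiz configuration; "localhost:9092" is assumed.
    static Properties producerConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("enable.idempotence", "false");
        props.put("acks", "all");
        return props;
    }

    public static void main(String[] args) {
        Properties props = producerConfig();
        // acks=all only controls how many in-sync replicas must
        // acknowledge a write before it succeeds. It does not depend
        // on idempotence, so this configuration is accepted and sends
        // proceed with full acknowledgment but no duplicate protection.
        System.out.println(props.getProperty("acks")); // prints "all"
    }
}
```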
🔧 Debug · advanced
Why does this idempotent producer code cause duplicate messages after a network failure?
Examine the code below. Despite enabling idempotence, duplicate messages appear in the topic after a network failure. What is the cause?
Java
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("enable.idempotence", "true");

KafkaProducer<String, String> producer = new KafkaProducer<>(props);

for (int i = 0; i < 5; i++) {
    producer.send(new ProducerRecord<>("my-topic", "key", "message"));
    if (i == 2) {
        // Simulate network failure by closing producer abruptly
        producer.close();
        producer = new KafkaProducer<>(props);
    }
}
producer.flush();
producer.close();
A. Idempotence only works within a single producer instance; closing and reopening resets sequence numbers causing duplicates.
B. The producer configuration is missing 'acks=all', so duplicates occur despite idempotence.
C. The topic is configured with multiple partitions, so duplicates are expected with idempotence enabled.
D. The key serializer is incorrect, causing messages to be treated as different keys and duplicated.
💡 Hint
Idempotence tracks sequence numbers per producer instance to avoid duplicates.
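One way to keep a stable identity across producer restarts, sketched under the assumption that transactions are an option (the transactional.id value is hypothetical): a producer with a transactional.id is re-identified and fenced by the broker across sessions, whereas plain idempotence resets with each new producer ID.

```java
import java.util.Properties;

public class TransactionalIdSketch {
    static Properties producerConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker
        props.put("enable.idempotence", "true");
        // A stable transactional.id lets the broker recognize this
        // logical producer across restarts and fence stale instances.
        // Without it, each new KafkaProducer instance gets a fresh
        // producer ID, and sequence tracking starts over.
        props.put("transactional.id", "order-service-1"); // hypothetical id
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerConfig().getProperty("transactional.id")); // prints "order-service-1"
    }
}
```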
📝 Syntax · advanced
Which Kafka producer configuration snippet correctly enables idempotence with safe retries?
Select the configuration snippet that correctly enables idempotence and configures retries safely.
A. props.put("enable.idempotence", true); props.put("retries", -1); props.put("acks", "1");
B. props.put("enable.idempotence", "true"); props.put("retries", Integer.MAX_VALUE); props.put("acks", "all");
C. props.put("enable.idempotence", "false"); props.put("retries", 5); props.put("acks", "all");
D. props.put("enable.idempotence", "true"); props.put("retries", 0); props.put("acks", "all");
💡 Hint
Idempotence requires 'acks=all' and retries enabled to be effective.
🚀 Application · expert
How do you ensure exactly-once semantics with a Kafka producer when idempotence is enabled?
You want to guarantee that messages are delivered exactly once to a Kafka topic using an idempotent producer. Which approach below correctly achieves this?
A. Enable idempotence and set 'acks' to '1' to reduce latency and rely on retries for duplicates.
B. Disable idempotence but use manual offset commits in the consumer to avoid duplicates.
C. Enable idempotence, set 'acks' to 'all', configure infinite retries, and use transactions to commit messages atomically.
D. Enable idempotence and set 'retries' to 0 to avoid resending messages and duplicates.
💡 Hint
Exactly-once semantics require an idempotent producer with proper acknowledgments, plus transactions for atomic writes.
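The full recipe can be sketched as follows (broker address and transactional.id are assumptions for illustration; the transactional call sequence is shown in comments because it needs a live broker):

```java
import java.util.Properties;

public class ExactlyOnceSketch {
    // Exactly-once producer config sketch: idempotence on, acks=all,
    // effectively infinite retries, and a transactional.id.
    static Properties producerConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("enable.idempotence", "true");
        props.put("acks", "all");
        props.put("retries", Integer.toString(Integer.MAX_VALUE));
        props.put("transactional.id", "payments-tx-1"); // hypothetical id
        return props;
    }

    public static void main(String[] args) {
        Properties props = producerConfig();
        // Against a live broker the sends would be wrapped in a
        // transaction so the batch commits atomically:
        //
        //   KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        //   producer.initTransactions();
        //   producer.beginTransaction();
        //   producer.send(new ProducerRecord<>("my-topic", "key", "value"));
        //   producer.commitTransaction();  // or abortTransaction() on failure
        //   producer.close();
        System.out.println(props.getProperty("acks")); // prints "all"
    }
}
```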