Kafka · DevOps · ~20 mins

Producer retries and idempotency in Kafka - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output
intermediate
What is the output when a Kafka producer with idempotency enabled retries sending a message?
Consider a Kafka producer configured with enable.idempotence=true and retries=3. If the producer sends a message and a transient network error occurs causing a retry, what will be the effect on the topic?
Java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("enable.idempotence", "true");
props.put("retries", 3);

KafkaProducer<String, String> producer = new KafkaProducer<>(props);

ProducerRecord<String, String> record = new ProducerRecord<>("my-topic", "key1", "value1");
producer.send(record);
producer.close();
A. The message will be lost if a retry happens, because idempotency disables retries.
B. The message will be delivered exactly once to the topic, even if retries occur.
C. The message may be duplicated in the topic, because retries resend the message multiple times.
D. The producer will throw an exception and stop sending messages on the first retry.
💡 Hint
Idempotency ensures no duplicate messages even if retries happen.
🧠 Conceptual
intermediate
Why is idempotency important when enabling retries in Kafka producers?
Select the best explanation for why enabling idempotency is critical when a Kafka producer is configured to retry sending messages.
A. Idempotency prevents duplicate messages caused by retries, ensuring each message is written only once.
B. Idempotency increases throughput by batching messages during retries.
C. Idempotency disables retries to avoid message duplication.
D. Idempotency encrypts messages to secure them during retries.
💡 Hint
Think about what happens if the same message is sent multiple times.
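For readers who want to experiment after answering: a minimal sketch of the retry-safe configuration this question describes. The property names are the standard Kafka client keys; the broker address is a placeholder, and no broker connection is made here.

```java
import java.util.Properties;

public class RetrySafeConfig {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("enable.idempotence", "true");  // broker deduplicates retried batches
        props.put("retries", "3");                // retries are safe when idempotence is on
        props.put("acks", "all");                 // idempotence requires acks=all
        System.out.println("idempotent=" + props.getProperty("enable.idempotence"));
    }
}
```

With `enable.idempotence=true`, raising `retries` does not risk duplicates, which is the point the question is probing.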
🔧 Debug
advanced
Identify the error in this Kafka producer configuration for retries and idempotency
Given the following Kafka producer configuration, what error will occur when sending messages with retries and idempotency enabled?
Java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("enable.idempotence", "true");
props.put("retries", -1);

KafkaProducer<String, String> producer = new KafkaProducer<>(props);

ProducerRecord<String, String> record = new ProducerRecord<>("my-topic", "key1", "value1");
producer.send(record);
producer.close();
A. The producer will disable idempotency automatically due to the invalid retries value.
B. The producer will ignore the retries setting and send each message only once.
C. The producer will retry forever without error.
D. The producer will throw a ConfigException, because retries must be at least 0.
💡 Hint
Check the allowed values for the retries configuration.
Predict Output
advanced
What happens if a Kafka producer with idempotency enabled sends messages with different keys but same value during retries?
Consider a Kafka producer with enable.idempotence=true sending two messages with the same value but different keys. If a retry occurs for one message, what will be the result in the topic?
Java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("enable.idempotence", "true");
props.put("retries", 3);

KafkaProducer<String, String> producer = new KafkaProducer<>(props);

producer.send(new ProducerRecord<>("my-topic", "key1", "value"));
producer.send(new ProducerRecord<>("my-topic", "key2", "value"));
producer.close();
A. Both messages will be delivered exactly once, even if retries happen, because deduplication does not depend on message content.
B. The messages with the same value will be deduplicated and only one will appear in the topic.
C. The producer will fail to send the second message due to the duplicate value.
D. Retries will cause duplicate messages for both keys, because idempotency only works per value.
💡 Hint
Idempotency works per producer sequence, not by message content.
🧠 Conceptual
expert
How does Kafka ensure exactly-once delivery semantics with producer retries and idempotency enabled?
Which mechanism best describes how Kafka achieves exactly-once delivery when a producer has retries and idempotency enabled?
A. Kafka stores all messages in a temporary buffer and only commits them after all retries succeed.
B. Kafka disables retries and uses transactional commits to guarantee exactly-once delivery.
C. Kafka assigns a unique producer ID and per-partition sequence numbers to batches, allowing the broker to detect and discard duplicates during retries.
D. Kafka encrypts messages with unique keys to prevent duplicates during retries.
💡 Hint
Think about how the broker can recognize duplicate messages from the same producer.
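The broker-side idea behind that last hint can be sketched as a toy simulation. This is not Kafka's actual implementation; it only illustrates the principle: track the highest sequence number seen per producer ID and discard any re-sent batch whose sequence has already been appended.

```java
import java.util.HashMap;
import java.util.Map;

// Toy illustration (not Kafka's real code): a "broker" that remembers the
// last sequence number appended per producer ID, so a retried batch with an
// already-seen sequence is recognized as a duplicate and discarded.
public class DedupSketch {
    private final Map<Long, Integer> lastSeqByProducer = new HashMap<>();
    private int appended = 0;

    // Returns true if the record is appended, false if it is a duplicate.
    boolean append(long producerId, int sequence) {
        Integer last = lastSeqByProducer.get(producerId);
        if (last != null && sequence <= last) {
            return false;              // duplicate caused by a retry: discard
        }
        lastSeqByProducer.put(producerId, sequence);
        appended++;
        return true;
    }

    public static void main(String[] args) {
        DedupSketch broker = new DedupSketch();
        broker.append(42L, 0);          // first send: appended
        broker.append(42L, 0);          // retry of the same batch: discarded
        broker.append(42L, 1);          // next batch: appended
        System.out.println("records in log: " + broker.appended);
    }
}
```

Note that deduplication keys on producer ID and sequence number, never on message keys or values, which is why two records with identical values from the same producer are both kept.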