Hard · Conceptual · Q10 of 15
Kafka - Performance Tuning
Why does increasing batch.size sometimes reduce compression efficiency in Kafka producers?
A. Because batch.size is unrelated to compression.
B. Because batch.size controls compression codec selection.
C. Because larger batch.size disables compression.
D. Because larger batches may contain more diverse data, reducing compression ratio.
Step-by-Step Solution
  1. Step 1: Understand the effect of batch size on data similarity

    Larger batches may combine more varied messages, which can reduce the effectiveness of compression algorithms that rely on data similarity.
  2. Step 2: Clarify batch.size role

    batch.size does not select a codec or disable compression; it only caps the byte size of each per-partition batch. The codec is chosen separately via compression.type.
  3. Final Answer:

    Because larger batches may contain more diverse data, reducing compression ratio. -> Option D
  4. Quick Check:

    More diverse data in a batch = lower compression ratio ✓
Quick Trick: A bigger batch can mean less similar data, hurting compression ✓
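The effect described in Step 1 can be sketched without a Kafka cluster at all. The snippet below uses Python's gzip module as a stand-in for a producer compression codec and compares two same-sized batches: one of near-identical records and one of random (maximally diverse) bytes. The record contents are invented for illustration.

```python
import gzip
import random

random.seed(0)

# Homogeneous batch: the same small record repeated many times,
# giving the compressor lots of redundancy to exploit.
homogeneous = b'{"sensor":"temp-1","value":20}' * 1000

# Diverse batch: same total size, but random bytes, so there is
# almost no repetition for the compressor to find.
diverse = bytes(random.getrandbits(8) for _ in range(len(homogeneous)))

ratio_homog = len(gzip.compress(homogeneous)) / len(homogeneous)
ratio_diverse = len(gzip.compress(diverse)) / len(diverse)

print(f"homogeneous batch ratio: {ratio_homog:.3f}")  # far below 1.0
print(f"diverse batch ratio:     {ratio_diverse:.3f}")  # near (or above) 1.0
```

A larger batch.size does not literally make records random, but if it causes a batch to mix records from many unrelated keys or message types, the batch's contents drift toward the "diverse" end of this spectrum and the achieved compression ratio degrades.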
Common Mistakes:
  • Thinking batch.size selects compression codec
  • Assuming batch.size disables compression
  • Ignoring data diversity impact on compression
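To make the first two mistakes concrete, here is a minimal producer-tuning sketch using kafka-python's parameter names (batch_size, linger_ms, compression_type); the broker address and values are placeholders, not recommendations. Note that the codec and the batch cap are independent settings.

```python
# Hypothetical kafka-python producer configuration (illustrative values).
# batch_size only caps the bytes per partition batch; the codec is
# chosen by compression_type and is never affected by batch_size.
producer_config = {
    "bootstrap_servers": "localhost:9092",  # placeholder broker address
    "compression_type": "lz4",   # codec selection happens here
    "batch_size": 32_768,        # max bytes per batch (default is 16384)
    "linger_ms": 10,             # wait up to 10 ms to fill a batch
}

print(producer_config["batch_size"])
```

When tuning, batch_size and linger_ms control how full batches get before sending, while compression_type decides how those batches are compressed; confusing the two is exactly the trap in options B and C.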
