
What Is a Message Broker in Kafka: A Simple Explanation and Example

Apache Kafka acts as a message broker by receiving, storing, and forwarding messages between producers and consumers. It ensures reliable, scalable, and fast communication by organizing messages into topics and partitions.
⚙️ How It Works

Think of Kafka as a post office for data. Producers send messages like letters to Kafka, which sorts and stores them in topics, similar to mailboxes. Consumers then pick up these messages at their own pace.

This system allows many producers and consumers to communicate without knowing about each other directly. Kafka keeps messages durable and ordered within each partition, so consumers receive them reliably even if they are offline for a while.
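The post-office idea above can be sketched in a few lines of Python. This is a conceptual illustration only, not Kafka's API: `MiniBroker`, `produce`, and `consume` are made-up names, and the "log" is just an in-memory list standing in for a topic's append-only storage.

```python
# Conceptual sketch of a broker: an append-only log per topic,
# plus a separate read position (offset) per consumer.
class MiniBroker:
    def __init__(self):
        self.topics = {}    # topic name -> list of messages (the "log")
        self.offsets = {}   # (topic, consumer) -> index of next unread message

    def produce(self, topic, message):
        """Append a message to the topic's log."""
        self.topics.setdefault(topic, []).append(message)

    def consume(self, topic, consumer):
        """Return every message this consumer has not seen yet."""
        log = self.topics.get(topic, [])
        start = self.offsets.get((topic, consumer), 0)
        self.offsets[(topic, consumer)] = len(log)
        return log[start:]

broker = MiniBroker()
broker.produce("orders", "order-1")
broker.produce("orders", "order-2")

# A consumer that shows up late still gets every stored message, in order.
print(broker.consume("orders", "billing"))  # ['order-1', 'order-2']

broker.produce("orders", "order-3")
print(broker.consume("orders", "billing"))  # ['order-3']
```

Because the broker stores the log and each consumer only tracks its own offset, producers never need to know who is reading or whether they are currently online.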

💻 Example

This example shows a simple Kafka producer sending a message and a consumer receiving it using the Kafka console tools.

```bash
# Start a console producer and type a message
# (--bootstrap-server replaces the deprecated --broker-list flag)
bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic test-topic
Hello Kafka

# Read one message back from the beginning of the topic
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test-topic --from-beginning --max-messages 1
```

Output:

```
Hello Kafka
```
🎯 When to Use

Use Kafka as a message broker when you need to connect many applications or services that produce and consume data independently. It is great for real-time data streaming, event tracking, log aggregation, and building scalable data pipelines.

For example, an online store can use Kafka to send order events from the website to inventory, shipping, and billing systems without delay or data loss.
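The online-store scenario can be made concrete with a small sketch. Again, this is an illustration of the decoupling idea, not real Kafka code: `publish` and `poll` are hypothetical helpers, and each downstream system simply keeps its own position in the shared log.

```python
# One order event, three independent readers, each with its own position.
orders_log = []  # stands in for the "orders" topic
positions = {"inventory": 0, "shipping": 0, "billing": 0}

def publish(event):
    """Producer side: append an event to the shared log."""
    orders_log.append(event)

def poll(consumer):
    """Consumer side: return new events and advance this consumer's position."""
    start = positions[consumer]
    positions[consumer] = len(orders_log)
    return orders_log[start:]

publish({"order_id": 42, "item": "book"})

# Billing reads immediately; shipping can read the same event later,
# without blocking or affecting anyone else.
print(poll("billing"))   # [{'order_id': 42, 'item': 'book'}]
print(poll("shipping"))  # same event, delivered independently
```

If shipping is offline for an hour, billing and inventory are unaffected, and shipping catches up from its own position when it returns. That is the core benefit over point-to-point integrations.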

Key Points

  • Message broker: Kafka routes messages between producers and consumers.
  • Topics: Messages are organized in named categories called topics.
  • Durability: Kafka stores messages on disk for reliability.
  • Scalability: Kafka handles large data volumes with partitions and clusters.
  • Decoupling: Producers and consumers work independently without direct connection.
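The partitioning point above hinges on one idea: a message's key decides its partition, so all messages with the same key stay in order. A simplified sketch (Kafka's default partitioner actually uses murmur2 hashing; `crc32` is used here only to keep the example deterministic):

```python
import zlib

NUM_PARTITIONS = 3  # illustrative partition count

def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Map a message key to a partition. The same key always maps to the
    same partition, so messages for one key preserve their order."""
    return zlib.crc32(key.encode()) % num_partitions

# All events for the same order land in the same partition, in order.
assert partition_for("order-42") == partition_for("order-42")
```

Adding partitions lets Kafka spread a topic across many brokers and consumers, which is how it scales while still keeping per-key ordering.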

Key Takeaways

  • Kafka is a message broker that reliably routes messages between producers and consumers.
  • It organizes messages into topics and stores them durably for scalable communication.
  • Kafka decouples data producers and consumers, allowing independent operation.
  • Use Kafka for real-time data streaming, event processing, and building data pipelines.