Kafka · DevOps · ~30 mins

Event Streaming Concept in Kafka - Mini Project: Build & Apply

Event Streaming Concept with Kafka
📖 Scenario: You are working for a small online store that wants to track customer orders in real-time. They want to send order events to a system where other parts of the store can react quickly, like updating inventory or sending confirmation emails.
🎯 Goal: Build a simple Kafka event streaming setup where you create a topic, send order events, and consume them to see the live stream of orders.
📋 What You'll Learn
Create a Kafka topic named orders
Produce three order events with exact keys and values
Consume the events from the orders topic
Print the consumed events to show the live stream
💡 Why This Matters
🌍 Real World
Many companies use Kafka to stream live data like orders, logs, or sensor readings so different parts of their system can react quickly.
💼 Career
Understanding Kafka event streaming is important for roles in data engineering, backend development, and real-time analytics.
Step 1: Create the Kafka topic orders
Write the Kafka command to create a topic called orders with 1 partition and a replication factor of 1.
Hint: Use the kafka-topics command with --create and specify the topic name, partitions, and replication factor.
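A minimal sketch of Step 1, assuming a single-broker Kafka cluster reachable at localhost:9092 and the Apache distribution's CLI scripts on your PATH (some packages name the script kafka-topics without the .sh suffix):

```shell
# Create the "orders" topic with 1 partition and replication factor 1
# (assumes a broker running at localhost:9092)
kafka-topics.sh --create \
  --topic orders \
  --partitions 1 \
  --replication-factor 1 \
  --bootstrap-server localhost:9092
```

You can confirm the topic exists afterwards with kafka-topics.sh --describe --topic orders --bootstrap-server localhost:9092.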

Step 2: Produce three order events to the orders topic
Use the Kafka console producer to send these exact three events to the orders topic with keys and values separated by a comma:
order1,{'item':'book','quantity':1}
order2,{'item':'pen','quantity':3}
order3,{'item':'notebook','quantity':2}
Hint: Use kafka-console-producer with --property parse.key=true and --property key.separator=, to send key-value pairs.
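One way Step 2 could look, assuming the same broker at localhost:9092 and the orders topic from Step 1; the producer reads key,value pairs from stdin, one per line:

```shell
# Start a console producer that splits each input line into a key and
# a value at the first comma
kafka-console-producer.sh \
  --topic orders \
  --bootstrap-server localhost:9092 \
  --property parse.key=true \
  --property key.separator=,

# Then type the three events exactly as given, one per line:
#   order1,{'item':'book','quantity':1}
#   order2,{'item':'pen','quantity':3}
#   order3,{'item':'notebook','quantity':2}
```

Press Ctrl+D (or Ctrl+C) to exit the producer when you are done.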

Step 3: Consume the events from the orders topic
Use the Kafka console consumer to read messages from the orders topic starting from the beginning. Include the message keys in the output.
Hint: Use kafka-console-consumer with --from-beginning and --property print.key=true to see keys and values.
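A sketch of the Step 3 command, under the same localhost:9092 assumption:

```shell
# Read the orders topic from the earliest offset, printing each
# message key alongside its value
kafka-console-consumer.sh \
  --topic orders \
  --bootstrap-server localhost:9092 \
  --from-beginning \
  --property print.key=true
```

The console consumer keeps waiting for new messages; stop it with Ctrl+C once the three events have appeared.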

Step 4: Print the consumed events to show the live stream
Run the Kafka console consumer command you wrote in Step 3 and show the output exactly as it appears, including keys and values.
Hint: The output shows each key and its JSON value separated by a tab.
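Assuming the three events from Step 2 were produced successfully, the consumer in Step 3 would print each key and its value separated by a tab, roughly like this:

```
order1	{'item':'book','quantity':1}
order2	{'item':'pen','quantity':3}
order3	{'item':'notebook','quantity':2}
```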