Kafka · DevOps · ~10 mins

Event streaming concept in Kafka - Step-by-Step Execution

Process Flow - Event streaming concept
Event Produced
Event Sent to Kafka Topic
Kafka Stores Event in Partition
Consumer Reads Event from Partition
Consumer Processes Event
Acknowledgment Sent
Next Event Processed
Events are created and sent to Kafka topics, where they are stored in partitions; consumers then read and process them in order.
Execution Sample
Kafka
producer.send('orders', {'order_id': 1, 'item': 'book'})
consumer.poll()
process(event)
consumer.commit()
A producer sends an order event to the 'orders' topic; the consumer polls, processes, and commits the event.
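The four calls in the sample can be traced end-to-end with an in-memory stand-in for a single-partition topic. This is a minimal sketch of the concept, not the real Kafka client API: the Topic class and its send/poll/commit methods are simplified illustrations.

```python
# Minimal in-memory sketch of the produce -> poll -> process -> commit cycle.
# 'Topic' and its methods are simplified stand-ins, not a real Kafka client API.

class Topic:
    """A single-partition topic: an append-only log plus a committed offset."""
    def __init__(self):
        self.log = []            # events stored in arrival order
        self.committed = 0       # next offset the consumer should read

    def send(self, event):
        self.log.append(event)   # producer appends; the event gets the next offset

    def poll(self):
        # Return the event at the committed offset, or None if caught up.
        if self.committed < len(self.log):
            return self.log[self.committed]
        return None

    def commit(self):
        self.committed += 1      # acknowledge: don't re-read this event

orders = Topic()
orders.send({'order_id': 1, 'item': 'book'})   # step 1: event produced

processed = []
event = orders.poll()                          # step 2: consumer reads offset 0
if event is not None:
    processed.append(event)                    # step 3: consumer processes
    orders.commit()                            # step 4: offset 1 committed

print(orders.committed)   # 1
print(orders.poll())      # None: no new events, consumer waits (step 5)
```

Note that the event is still in the log after the commit; only the consumer's offset moved.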
Process Table
Step | Action | Event Data | Kafka State | Consumer State | Output
1 | Producer sends event | {order_id:1, item:'book'} | Event stored in 'orders' partition 0 | No events read yet | Event queued
2 | Consumer polls topic | N/A | Event still stored | Reads event from partition 0, offset 0 | Event received
3 | Consumer processes event | {order_id:1, item:'book'} | Event stored | Processing event | Order processed
4 | Consumer commits offset | N/A | Event stored | Offset 1 committed | Ready for next event
5 | No new events | N/A | No new events | No events to read | Waiting for events
💡 With no new events to process, the consumer waits for more.
Status Tracker
Variable | Start | After Step 1 | After Step 2 | After Step 3 | After Step 4 | Final
Kafka Topic 'orders' | empty | 1 event stored | 1 event stored | 1 event stored | 1 event stored | 1 event stored
Consumer Offset | 0 | 0 | 0 | 0 | 1 | 1
Event in Consumer | none | none | event(order_id=1) | event(order_id=1) | none | none
Key Moments - 3 Insights
Why does the consumer commit the offset after processing?
Committing the offset (see step 4) tells Kafka the event was processed successfully, so the consumer won't re-read it.
What happens if the consumer polls but no new events exist?
As shown in step 5, the consumer waits without processing until new events arrive.
Is the event removed from Kafka after the consumer reads it?
No, Kafka keeps events for a set time or size limit; consumers track offsets to know what to read next.
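This last insight, events persisting after a read while each consumer tracks its own offset, can be sketched with two independent consumers over one shared log. The consumer names and helper functions here are illustrative, not a real Kafka API:

```python
# Sketch: events persist after being read; two consumers keep separate offsets
# into the same log. Names ('billing', 'shipping') are illustrative only.

log = [{'order_id': 1}, {'order_id': 2}]   # the 'orders' partition
offsets = {'billing': 0, 'shipping': 0}    # one committed offset per consumer

def poll(consumer):
    off = offsets[consumer]
    return log[off] if off < len(log) else None

def commit(consumer):
    offsets[consumer] += 1

# 'billing' reads the whole log; nothing is deleted on read.
while poll('billing') is not None:
    commit('billing')

print(offsets)    # {'billing': 2, 'shipping': 0}
print(len(log))   # 2: both events still retained for 'shipping' to read
```

Because reads never remove events, 'shipping' can later consume the same two events from offset 0, entirely independently of 'billing'.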
Visual Quiz - 3 Questions
Test your understanding
Looking at the Status Tracker, what is the consumer offset after step 3?
A. 1
B. 0
C. Not set
D. Undefined
💡 Hint
Check the 'Consumer Offset' row of the Status Tracker after step 3.
At which step does the consumer acknowledge it processed the event?
A. Step 4
B. Step 2
C. Step 3
D. Step 5
💡 Hint
Look for the committed offset in step 4 of the Process Table.
If a new event arrives after step 5, what changes in the consumer state?
A. Consumer offset resets to 0
B. Kafka deletes old events
C. Consumer reads the new event from the next offset
D. Consumer stops polling
💡 Hint
Refer to how the consumer reads events from partitions in Process Table steps 2 and 5.
Concept Snapshot
Event streaming means sending data as events to a Kafka topic.
Kafka stores events in partitions.
Consumers read events in order using offsets.
Consumers commit offsets after processing.
Events remain in Kafka until retention expires.
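The retention point can be sketched as a time-based trim over the log. This is a simplified illustration; in real Kafka, retention is a per-topic broker setting (for example, time- or size-based limits), and the retention window and timestamps below are made up for the example:

```python
import time

# Sketch of time-based retention: events older than the retention window are
# trimmed from the log. Window and timestamps are illustrative values only.

RETENTION_S = 60   # assumed 60-second retention window

now = time.time()
log = [
    (now - 120, {'order_id': 1}),  # 2 minutes old: past retention
    (now - 10,  {'order_id': 2}),  # 10 seconds old: still retained
]

retained = [(ts, ev) for ts, ev in log if now - ts <= RETENTION_S]
print(len(retained))   # 1: only the recent event survives the trim
```

Retention runs independently of consumption: an event expires when its age exceeds the window, whether or not any consumer has read it.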
Full Transcript
Event streaming with Kafka works by producers sending events to topics. Kafka stores these events in partitions. Consumers poll these partitions to read events in order. After processing, consumers commit their offsets to mark progress. Kafka keeps events for a retention period, allowing multiple consumers to read independently. This flow ensures reliable, ordered event processing.