Kafka · DevOps · ~10 mins

KStream and KTable concepts in Kafka - Step-by-Step Execution

Process Flow - KStream and KTable concepts
Input Data → Process each event → Output transformed → Downstream apps
Data flows into Kafka as events. KStream processes each event individually as a stream. KTable keeps the latest state per key, updating as new events arrive.
Execution Sample (Kafka Streams, Java)
StreamsBuilder builder = new StreamsBuilder();

// Read the same topic two ways: as an event stream and as a changelog-style table
KStream<String, String> stream = builder.stream("input-topic");
KTable<String, String> table = builder.table("input-topic");

// KStream: print every event as it arrives (stateless)
stream.foreach((k, v) -> System.out.println("Stream event: " + k + ":" + v));
// KTable: print the latest state per key whenever it changes
table.toStream().foreach((k, v) -> System.out.println("Table state: " + k + ":" + v));
This code reads the same Kafka topic as a stream and as a table, printing each event and the latest state per key.
Process Table
Step | Input Event | KStream Action | KTable Action | Output
1 | key1: value1 | Process event key1:value1 | Update key1 state to value1 | Stream event: key1:value1; Table state: key1:value1
2 | key2: value2 | Process event key2:value2 | Update key2 state to value2 | Stream event: key2:value2; Table state: key2:value2
3 | key1: value3 | Process event key1:value3 | Update key1 state to value3 (overwrite) | Stream event: key1:value3; Table state: key1:value3
4 | key3: value4 | Process event key3:value4 | Update key3 state to value4 | Stream event: key3:value4; Table state: key3:value4
5 | No more events | Stop processing | Stop updating | End of stream and table output
💡 No more input events; processing stops.
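The five steps above can be replayed in plain Java without a Kafka broker, using a map as a stand-in for the KTable's per-key state. The class and method names here (ProcessTableTrace, replay) are illustrative, not Kafka Streams API; this is a minimal sketch of the table's upsert semantics:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ProcessTableTrace {
    // Replay the events from the process table; returns the final table state
    static Map<String, String> replay(List<String[]> events) {
        Map<String, String> table = new LinkedHashMap<>(); // KTable analogue
        for (String[] e : events) {
            // KStream analogue: every event is processed individually
            System.out.println("Stream event: " + e[0] + ":" + e[1]);
            // KTable analogue: the new value overwrites any previous value for the key
            table.put(e[0], e[1]);
            System.out.println("Table state: " + e[0] + ":" + table.get(e[0]));
        }
        return table;
    }

    public static void main(String[] args) {
        Map<String, String> finalState = replay(List.of(
                new String[]{"key1", "value1"},
                new String[]{"key2", "value2"},
                new String[]{"key1", "value3"},
                new String[]{"key3", "value4"}));
        System.out.println("Final table: " + finalState);
    }
}
```

The printed lines match the Output column of the process table, and the final map matches the Final column of the status tracker.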
Status Tracker
Variable | Start | After 1 | After 2 | After 3 | After 4 | Final
KStream current event | null | key1:value1 | key2:value2 | key1:value3 | key3:value4 | null
KTable state key1 | null | value1 | value1 | value3 | value3 | value3
KTable state key2 | null | null | value2 | value2 | value2 | value2
KTable state key3 | null | null | null | null | value4 | value4
Key Moments - 2 Insights
Why does KTable overwrite the value for key1 at step 3?
Because a KTable represents the latest state per key, it replaces the stored value for key1 (value1) with value3 when a new event for key1 arrives, as shown in row 3 of the process table.
Does KStream keep previous events after processing?
No. A KStream processes each event as it arrives and does not store state; each event is handled independently, as the 'KStream Action' column of the process table shows.
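This stateless-vs-stateful contrast can be sketched in plain Java: feeding the same four events to a stream-style log and a table-style map shows the log keeping all events (including both key1 events) while the map holds only the latest value per key. The class name StatelessVsStateful is illustrative, not Kafka API:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class StatelessVsStateful {
    static List<String> streamLog = new ArrayList<>();       // KStream analogue: every event survives
    static Map<String, String> tableState = new HashMap<>(); // KTable analogue: latest value per key

    static void process(String key, String value) {
        streamLog.add(key + ":" + value); // stateless: just record the event
        tableState.put(key, value);       // stateful: overwrite the previous value for this key
    }

    public static void main(String[] args) {
        process("key1", "value1");
        process("key2", "value2");
        process("key1", "value3"); // second event for key1
        process("key3", "value4");
        System.out.println("Stream saw " + streamLog.size() + " events"); // 4: both key1 events appear
        System.out.println("Table holds " + tableState.size() + " keys"); // 3: key1's first value was replaced
    }
}
```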
Visual Quiz - 3 Questions
Test your understanding
Look at the process table at step 2. What is the KTable state for key2?
A. null
B. value2
C. value1
D. value3
💡 Hint
Check the 'KTable Action' and 'Output' columns at step 2 of the process table.
At which step does the KTable update the state for key1 to value3?
A. Step 1
B. Step 2
C. Step 3
D. Step 4
💡 Hint
Look at the 'KTable Action' column for key1 updates in the process table.
If a new event with key2:value5 arrives after step 4, what will happen to the KTable state for key2?
A. It updates to value5
B. It stays as value2
C. It deletes key2
D. It duplicates value2
💡 Hint
A KTable always keeps the latest state per key, updating on new events, as shown in the status tracker.
Concept Snapshot
KStream: Processes each event as a stream, no state kept.
KTable: Represents latest state per key, updates on new events.
Both read from Kafka topics but differ in state handling.
Use KStream for event processing, KTable for stateful views.
KTable updates overwrite previous values for the same key.
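The last bullet — overwriting previous values for the same key — means a KTable view can be thought of as a fold over the event stream, where the merge rule for duplicate keys is "keep the newer value". A minimal sketch in plain Java (the class and method names are illustrative, not Kafka Streams API):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class LatestPerKey {
    // A KTable-style view is a reduction over the stream: last write wins per key
    static Map<String, String> latest(List<String[]> events) {
        return events.stream().collect(Collectors.toMap(
                e -> e[0],               // key
                e -> e[1],               // value
                (older, newer) -> newer, // on duplicate keys, keep the newer value
                LinkedHashMap::new));
    }

    public static void main(String[] args) {
        Map<String, String> view = latest(List.of(
                new String[]{"key1", "value1"},
                new String[]{"key2", "value2"},
                new String[]{"key1", "value3"},
                new String[]{"key3", "value4"}));
        System.out.println(view);
    }
}
```

The merge function `(older, newer) -> newer` is exactly the overwrite rule the snapshot describes: key1 ends at value3 even though value1 arrived first.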
Full Transcript
This visual execution shows how Kafka Streams processes data as a KStream and a KTable. The KStream handles each event individually, printing each as it arrives. The KTable keeps the latest value per key, updating its state when new events with the same key come in. The process table traces each input event, showing KStream processing and KTable state updates, and the status tracker shows how KTable state changes per key over time. The key moments clarify why a KTable overwrites values and how a KStream differs by not storing state, and the quiz tests understanding of state updates and event-processing steps. This helps beginners see the difference between streaming events and maintaining state in Kafka Streams.