Kafka · DevOps · ~30 mins

Why advanced patterns handle complex flows in Kafka - See It in Action

📖 Scenario: You work at a company that processes orders with Apache Kafka. Orders arrive from different sources and must be routed, filtered, and enriched before final processing. Plain Kafka topics and consumers are not enough to handle this complex flow.
🎯 Goal: Build a Kafka Streams application that uses advanced patterns like branching, filtering, and joining streams to handle complex order processing flows.
📋 What You'll Learn
Create a Kafka Streams topology with multiple branches
Filter orders based on status
Join order streams with customer data
Output the processed orders to a final topic
💡 Why This Matters
🌍 Real World
Companies use Kafka Streams to process and enrich real-time data from multiple sources, enabling complex workflows like order processing, fraud detection, and monitoring.
💼 Career
Understanding advanced Kafka patterns is essential for roles like DevOps engineers, data engineers, and backend developers working on scalable event-driven systems.
1
Create initial Kafka Streams topology with input topic
Create a Kafka Streams StreamsBuilder instance called builder and define a stream called ordersStream from the input topic orders.
Kafka
Need a hint?

Use new StreamsBuilder() to create the builder. Use builder.stream("orders") to create the stream.
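A minimal sketch of this step in Java. The `String` key/value types are an assumption for illustration; a real application would configure serdes to match its record format:

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;

// Entry point for defining the processing topology.
StreamsBuilder builder = new StreamsBuilder();

// Read raw orders from the "orders" input topic.
// String key/value types are assumed here for simplicity.
KStream<String, String> ordersStream = builder.stream("orders");
```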

2
Add filtering configuration for order status
Create a String variable called statusFilter and set it to "COMPLETED". This will be used to filter orders by their status.
Kafka
Need a hint?

Declare a String variable named statusFilter and assign it the value "COMPLETED".
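In code this is a one-line constant; pulling the status into a named variable keeps it in one place if the filter criterion later changes:

```java
// Orders whose value contains this status will be kept downstream.
String statusFilter = "COMPLETED";
```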

3
Filter orders and join with customer data stream
Use ordersStream.filter() with a lambda that keeps only orders where the value contains statusFilter. Then create a KTable called customersTable from the topic customers. Finally, join the filtered orders stream with customersTable on the key using leftJoin.
Kafka
Need a hint?

Use filter on ordersStream with a lambda checking value.contains(statusFilter). Use builder.table("customers") to create customersTable. Use leftJoin to join filtered orders with customers.
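Continuing from the `builder` and `ordersStream` of step 1, a sketch of the filter and join. The `ValueJoiner` shown here simply concatenates the two values and is an assumption; a real application would merge the records into a proper enriched type:

```java
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

// Keep only the orders whose value contains the desired status.
KStream<String, String> filteredOrders =
    ordersStream.filter((key, value) -> value.contains(statusFilter));

// Customer data as a changelog-backed table, keyed the same way as orders.
KTable<String, String> customersTable = builder.table("customers");

// Left join: every filtered order passes through, enriched with the
// matching customer value, or null when no customer matches the key.
KStream<String, String> enrichedOrders =
    filteredOrders.leftJoin(customersTable,
        (order, customer) -> order + " | customer=" + customer);
```

A left join (rather than an inner join) is used so that completed orders are not silently dropped when customer data has not arrived yet.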

4
Output the enriched orders to the final topic
Send the enrichedOrders stream to the Kafka topic processed-orders using to(). Then build the topology and print its description using builder.build().describe().
Kafka
Need a hint?

Use enrichedOrders.to("processed-orders") to send data. Use System.out.println(builder.build().describe()) to print the topology description.
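Putting all four steps together, a self-contained sketch. The class name is illustrative; serde configuration and actually starting a `KafkaStreams` instance are omitted, since `builder.build().describe()` works without a running broker:

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

public class OrderProcessingTopology {

    // Builds the full topology and returns its printable description.
    static String buildTopologyDescription() {
        StreamsBuilder builder = new StreamsBuilder();

        // Step 1: read raw orders.
        KStream<String, String> ordersStream = builder.stream("orders");

        // Step 2: status to keep.
        String statusFilter = "COMPLETED";

        // Step 3: filter, then enrich with customer data via a left join.
        KStream<String, String> filteredOrders =
            ordersStream.filter((key, value) -> value.contains(statusFilter));
        KTable<String, String> customersTable = builder.table("customers");
        KStream<String, String> enrichedOrders =
            filteredOrders.leftJoin(customersTable,
                (order, customer) -> order + " | customer=" + customer);

        // Step 4: write enriched orders to the output topic.
        enrichedOrders.to("processed-orders");

        return builder.build().describe().toString();
    }

    public static void main(String[] args) {
        System.out.println(buildTopologyDescription());
    }
}
```

Printing the topology description is a useful sanity check before deployment: it shows every source topic, processor node, and sink in the order they will execute.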