Kafka · DevOps · ~30 mins

Schema compatibility rules in Kafka - Mini Project: Build & Apply

Understanding Schema Compatibility Rules in Kafka
📖 Scenario: You are working with Apache Kafka and using Avro schemas to manage your data formats. You want to ensure that when you update your data schema, it remains compatible with previous versions so that your Kafka consumers do not break.
🎯 Goal: Build a simple example to understand how schema compatibility rules work in Kafka by defining schemas and checking compatibility.
📋 What You'll Learn
Create an initial Avro schema as a JSON string
Define a compatibility rule variable
Write a function to check if a new schema is compatible with the old schema based on the rule
Print the compatibility check result
💡 Why This Matters
🌍 Real World
In real Kafka projects, schema compatibility ensures that producers and consumers can evolve independently without breaking data processing.
💼 Career
Understanding schema compatibility is essential for data engineers and backend developers working with Kafka and schema registries to maintain reliable data pipelines.
1
Create the initial Avro schema
Create a variable called old_schema and assign it this exact JSON string representing an Avro schema: {"type": "record", "name": "User", "fields": [{"name": "name", "type": "string"}, {"name": "age", "type": "int"}]}
Hint: Use a string variable to store the JSON schema exactly as shown.
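Step 1 might look like this in Python (a minimal sketch; the variable name comes from the task):

```python
# Step 1: store the initial Avro schema as a plain JSON string.
# No parsing is needed yet; later steps only do simple string checks.
old_schema = ('{"type": "record", "name": "User", "fields": '
              '[{"name": "name", "type": "string"}, '
              '{"name": "age", "type": "int"}]}')
```

Note that the schema is kept as a string on purpose: this exercise simulates compatibility with string checks rather than parsing the JSON.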

2
Define the compatibility rule
Create a variable called compatibility_rule and set it to the string BACKWARD to represent the schema compatibility mode.
Hint: Set the compatibility rule variable to the exact string 'BACKWARD'.
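A sketch of Step 2 (the variable name and value come from the task; BACKWARD is one of the standard compatibility modes a Kafka schema registry supports):

```python
# Step 2: the compatibility mode this exercise simulates.
# BACKWARD means consumers using the new schema can still read
# data produced with the previous schema.
compatibility_rule = "BACKWARD"
```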

3
Write a function to check schema compatibility
Write a function called is_compatible that takes two parameters: old_schema and new_schema. Inside the function, return True if compatibility_rule is BACKWARD and the new schema adds a new optional field called email. Otherwise, return False. Use simple string checks to simulate compatibility.
Hint: Check if the compatibility rule is 'BACKWARD' and if the new schema string contains the field 'email'.
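Step 3 could be sketched as below (assuming `compatibility_rule` from Step 2 is in scope; a real schema registry does a full structural comparison, while this toy version only checks the string):

```python
compatibility_rule = "BACKWARD"

def is_compatible(old_schema, new_schema):
    # Simulated check: under BACKWARD mode, adding an optional "email"
    # field is treated as compatible. old_schema is accepted to match
    # the required signature but is not inspected in this simulation.
    return compatibility_rule == "BACKWARD" and '"email"' in new_schema
```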

4
Check and print schema compatibility
Create a variable called new_schema with this exact JSON string: {"type": "record", "name": "User", "fields": [{"name": "name", "type": "string"}, {"name": "age", "type": "int"}, {"name": "email", "type": ["null", "string"], "default": null}]}. Then print the result of calling is_compatible(old_schema, new_schema).
Hint: Use the exact new schema string and print the result of the compatibility function.
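Putting all four steps together, the finished script might look like this (a sketch; the names `old_schema`, `compatibility_rule`, `is_compatible`, and `new_schema` all come from the tasks above):

```python
# Step 1: the original schema as a JSON string.
old_schema = ('{"type": "record", "name": "User", "fields": '
              '[{"name": "name", "type": "string"}, '
              '{"name": "age", "type": "int"}]}')

# Step 2: the compatibility mode being simulated.
compatibility_rule = "BACKWARD"

# Step 3: a toy compatibility check using simple string tests.
def is_compatible(old_schema, new_schema):
    return compatibility_rule == "BACKWARD" and '"email"' in new_schema

# Step 4: the evolved schema adds an optional "email" field
# (a union with null plus a default, which is what makes it optional in Avro).
new_schema = ('{"type": "record", "name": "User", "fields": '
              '[{"name": "name", "type": "string"}, '
              '{"name": "age", "type": "int"}, '
              '{"name": "email", "type": ["null", "string"], "default": null}]}')

print(is_compatible(old_schema, new_schema))  # prints: True
```

The new field is a `["null", "string"]` union with a `null` default, which is how Avro marks a field as optional; that is exactly the kind of change BACKWARD compatibility permits.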