Kafka · DevOps · ~20 mins

JSON Schema and Protobuf support in Kafka - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
Kafka Schema Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
Predict Output
intermediate
2:00 remaining
What is the output of this Kafka JSON Schema validation code?

Consider a Kafka producer configured to use JSON Schema for message validation. The schema requires a field age of type int and name of type string. What happens when the following message is sent?

Python
message = {"name": "Alice", "age": "twenty"}  # age is a string, not int
# Producer sends message to topic with JSON Schema validation enabled
A. The message is rejected with a schema validation error because 'age' is not an integer.
B. The message is accepted and sent successfully despite the type mismatch.
C. The producer crashes with a runtime error due to type mismatch.
D. The message is sent but the 'age' field is automatically converted to integer 0.
💡 Hint

JSON Schema validation enforces field types strictly before sending.
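As a sketch of what strict validation looks like in practice, the snippet below hand-rolls the kind of type check a JSON Schema serializer performs before producing. `validate_types` and `SCHEMA` are illustrative stand-ins, not part of any Kafka client API.

```python
# A minimal, hand-rolled sketch of the strict type check a JSON Schema
# serializer performs before producing. `validate_types` and SCHEMA are
# illustrative stand-ins, not part of any Kafka client API.
SCHEMA = {"name": str, "age": int}

def validate_types(message: dict, schema: dict) -> None:
    for field, expected in schema.items():
        value = message.get(field)
        ok = isinstance(value, expected)
        if expected is int and isinstance(value, bool):
            ok = False  # bool is a subclass of int in Python; reject it
        if not ok:
            raise ValueError(
                f"field {field!r}: expected {expected.__name__}, "
                f"got {type(value).__name__}"
            )

validate_types({"name": "Alice", "age": 30}, SCHEMA)  # passes silently
try:
    validate_types({"name": "Alice", "age": "twenty"}, SCHEMA)
except ValueError as err:
    print(err)  # field 'age': expected int, got str
```

The point the question tests: validation happens before the message leaves the producer, so a type mismatch is rejected up front rather than sent or silently coerced.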

Predict Output
intermediate
2:00 remaining
What is the output of this Protobuf message deserialization in a Kafka consumer?

A Kafka consumer receives a Protobuf-encoded message with a field id (int32) and active (bool). The message bytes are corrupted and missing the active field. What happens when the consumer tries to deserialize?

Python
message_bytes = b'\x08\x96\x01'  # Only 'id' field present
# Consumer deserializes using Protobuf schema expecting 'id' and 'active'
A. The consumer successfully deserializes with 'active' set to its default value false.
B. The consumer throws a deserialization error due to missing 'active' field.
C. The consumer sets 'active' to true by default.
D. The consumer ignores the message and skips processing.
💡 Hint

Protobuf fields can be optional and have default values.
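To see why the missing field is not an error, the sketch below hand-decodes the question's wire bytes using only the standard library. The field numbers and names (`id` = 1, `active` = 2) are the question's assumed schema, not a real generated class.

```python
# Hand-decode of the Protobuf wire bytes from the question. Field
# numbers/names (id = 1, active = 2) follow the question's assumed
# schema; this is not a real generated protobuf class.
def decode_varint(buf: bytes, pos: int):
    result, shift = 0, 0
    while True:
        b = buf[pos]
        pos += 1
        result |= (b & 0x7F) << shift
        if not b & 0x80:          # high bit clear: last byte of the varint
            return result, pos
        shift += 7

data = b"\x08\x96\x01"            # the message bytes from the question
fields = {}
pos = 0
while pos < len(data):
    key, pos = decode_varint(data, pos)
    field_no = key >> 3           # tag 0x08 -> field 1, wire type 0 (varint)
    value, pos = decode_varint(data, pos)
    fields[field_no] = value

print(fields)                     # {1: 150} -> id=150; field 2 is absent
active = bool(fields.get(2, 0))   # proto3 default for a missing bool
print(active)                     # False
```

Because the wire format simply omits unset fields, a parser treats a missing `active` as its default (`false`) rather than as corruption.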

🧠 Conceptual
advanced
2:00 remaining
Which Kafka schema registry feature supports both JSON Schema and Protobuf?

Kafka Schema Registry supports multiple schema types. Which feature allows seamless use of both JSON Schema and Protobuf schemas in Kafka topics?

A. Converting all schemas to Avro format before registration.
B. Using separate schema registries for JSON Schema and Protobuf.
C. Disabling schema validation to allow any format.
D. Multi-format schema registry with compatibility settings per schema type.
💡 Hint

Schema Registry supports multiple schema types with compatibility controls.
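As an illustration of multi-format support, the sketch below builds the request bodies a client would POST to Schema Registry's `/subjects/<subject>-value/versions` endpoint. One registry accepts both, distinguished by the `schemaType` field (which defaults to Avro when omitted); the subject names and schemas here are illustrative.

```python
import json

# Sketch of the request bodies a client would POST to Confluent Schema
# Registry's /subjects/<subject>-value/versions endpoint. One registry
# instance accepts both, distinguished by "schemaType" (defaults to
# "AVRO" when omitted). The schemas themselves are illustrative.
json_schema_payload = {
    "schemaType": "JSON",
    "schema": json.dumps({
        "type": "object",
        "properties": {
            "name": {"type": "string"},
            "age": {"type": "integer"},
        },
        "required": ["name", "age"],
    }),
}

protobuf_payload = {
    "schemaType": "PROTOBUF",
    "schema": 'syntax = "proto3"; message User { int32 id = 1; bool active = 2; }',
}
```

Compatibility checks are then applied per subject, so JSON Schema and Protobuf subjects can evolve side by side in the same registry.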

Predict Output
advanced
2:00 remaining
What error does this Kafka Protobuf producer code raise?

Given this Kafka producer code snippet using Protobuf serialization, what error occurs?

Python
from kafka import KafkaProducer
from protobuf_example_pb2 import User

producer = KafkaProducer(value_serializer=lambda v: v.SerializeToString())
user = User(id=123)  # 'name' field is required but missing
producer.send('users', user)
A. KafkaTimeoutError because producer cannot connect.
B. No error; message is sent with missing 'name' field as empty string.
C. RuntimeError due to missing required 'name' field in Protobuf message.
D. TypeError because 'user' is not bytes.
💡 Hint

Protobuf required fields must be set before serialization.
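For context, the behavior the question describes assumes a proto2 schema: only proto2 has `required` fields, and the Python protobuf runtime raises an `EncodeError` when `SerializeToString()` is called with one unset. A plausible definition behind the question's `User` message might look like this (a sketch, not the actual generated schema):

```proto
// Assumed proto2 definition behind the question's User message.
// Serializing with `name` unset raises EncodeError in the Python runtime.
syntax = "proto2";

message User {
  required int32 id = 1;
  required string name = 2;
}

// Note: proto3 removed `required` entirely; under proto3 an unset
// `name` would simply serialize as the empty string (option B).
```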

🧠 Conceptual
expert
3:00 remaining
How does Kafka handle schema evolution with JSON Schema and Protobuf?

In Kafka, when evolving schemas for topics using JSON Schema and Protobuf, which statement best describes the compatibility guarantees?

A. Only Protobuf supports schema evolution; JSON Schema requires full schema replacement.
B. Both JSON Schema and Protobuf support backward and forward compatibility controlled by Schema Registry settings.
C. JSON Schema supports compatibility but Protobuf does not support schema evolution.
D. Neither JSON Schema nor Protobuf support schema evolution in Kafka.
💡 Hint

Schema Registry manages compatibility for multiple schema types.
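The core of a backward-compatibility check can be sketched in a few lines: the evolved schema must still accept data written with the old one. `accepts` and the schema dicts below are simplified illustrations; Schema Registry performs the real check per subject according to its configured compatibility level.

```python
# Minimal stdlib sketch of a backward-compatibility check: the new
# schema must still accept data written under the old schema.
# `accepts` and these schema dicts are illustrative simplifications.
old_schema = {"required": ["name", "age"], "optional": []}
new_schema = {"required": ["name", "age"], "optional": ["email"]}  # added optional field

def accepts(schema: dict, message: dict) -> bool:
    known = set(schema["required"]) | set(schema["optional"])
    has_required = set(schema["required"]) <= set(message)
    only_known = set(message) <= known
    return has_required and only_known

old_message = {"name": "Alice", "age": 30}  # written before the evolution

# Adding an optional field is backward compatible:
print(accepts(new_schema, old_message))     # True

# Adding a new *required* field breaks old data:
breaking_schema = {"required": ["name", "age", "email"], "optional": []}
print(accepts(breaking_schema, old_message))  # False
```

Both JSON Schema and Protobuf subjects get this kind of guarantee from Schema Registry, with the compatibility level (e.g. backward, forward, full) configurable per subject.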