Consider a Kafka producer configured to use JSON Schema for message validation. The schema requires a field age of type int and name of type string. What happens when the following message is sent?
message = {"name": "Alice", "age": "twenty"} # age is a string, not int
# Producer sends message to topic with JSON Schema validation enabled
JSON Schema validation enforces field types strictly before sending.
The producer's JSON Schema serializer validates each message against the registered schema before it is sent. Since 'age' is a string instead of an integer, validation fails with an error and the message never reaches the broker.
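The check can be illustrated with the standalone 'jsonschema' package. This is a sketch of the validation step only, not the Confluent serializer itself; the schema dict here is an assumed equivalent of the registered schema.

```python
# Sketch of the type check a JSON Schema serializer performs before sending.
# Uses the open-source 'jsonschema' package purely to illustrate validation.
from jsonschema import validate, ValidationError

schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
}

message = {"name": "Alice", "age": "twenty"}  # age is a string, not int

try:
    validate(instance=message, schema=schema)
    print("valid")
except ValidationError as e:
    print("rejected:", e.message)  # 'twenty' is not of type 'integer'
```

A conforming message such as {"name": "Alice", "age": 30} passes the same check silently.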
A Kafka consumer receives a Protobuf-encoded message with a field id (int32) and active (bool). The message bytes are corrupted and missing the active field. What happens when the consumer tries to deserialize?
message_bytes = b'\x08\x96\x01'  # Only 'id' field present
# Consumer deserializes using Protobuf schema expecting 'id' and 'active'
Protobuf fields can be optional and have default values.
In Protobuf, missing fields are set to their default values during deserialization. Since 'active' is a boolean, it defaults to false.
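The bytes above can be decoded by hand to see why this happens. The following is an illustrative, hand-rolled wire-format parser, not the protobuf library; real code would use generated message classes.

```python
# Hand-rolled sketch of Protobuf wire-format decoding for b'\x08\x96\x01',
# showing that the absent 'active' field comes back as its default (False).

def read_varint(data, pos):
    """Decode a base-128 varint starting at pos; return (value, next_pos)."""
    result, shift = 0, 0
    while True:
        byte = data[pos]
        result |= (byte & 0x7F) << shift
        pos += 1
        if not (byte & 0x80):
            return result, pos
        shift += 7

def decode(data):
    # Proto3 defaults: int32 -> 0, bool -> False
    fields = {"id": 0, "active": False}
    pos = 0
    while pos < len(data):
        tag, pos = read_varint(data, pos)
        field_no = tag >> 3  # low 3 bits are the wire type (varint here)
        value, pos = read_varint(data, pos)
        if field_no == 1:
            fields["id"] = value
        elif field_no == 2:
            fields["active"] = bool(value)
    return fields

print(decode(b'\x08\x96\x01'))  # {'id': 150, 'active': False}
```

The tag byte 0x08 marks field 1 (id) as a varint, 0x96 0x01 decodes to 150, and since no bytes carry field 2, 'active' simply keeps its default.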
Kafka Schema Registry supports multiple schema types. Which feature allows seamless use of both JSON Schema and Protobuf schemas in Kafka topics?
Schema Registry supports multiple schema types with compatibility controls.
The Kafka Schema Registry supports multiple schema types, including JSON Schema and Protobuf. Each subject is registered with its own schema type and its own compatibility settings, so topics using different serialization formats can coexist in the same registry.
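Registration goes through the registry's REST API (POST /subjects/{subject}/versions), with the schema type declared in the request body. The sketch below only builds the payloads; the subject names are hypothetical, and no registry is contacted.

```python
# Sketch of Schema Registry REST payloads for registering two schema types.
# Subject names ('users-json-value', 'users-proto-value') are illustrative.
import json

json_schema_payload = {
    "schemaType": "JSON",  # Confluent's type name for JSON Schema
    "schema": json.dumps({
        "type": "object",
        "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
    }),
}

protobuf_payload = {
    "schemaType": "PROTOBUF",
    "schema": 'syntax = "proto3"; message User { int32 id = 1; bool active = 2; }',
}

# Each subject carries its own schema type; the registry tracks them independently.
for subject, payload in [("users-json-value", json_schema_payload),
                         ("users-proto-value", protobuf_payload)]:
    print(f"POST /subjects/{subject}/versions -> {payload['schemaType']}")
```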
Given this Kafka producer code snippet using Protobuf serialization, what error occurs?
from kafka import KafkaProducer
from protobuf_example_pb2 import User  # proto2 message with a required 'name' field

producer = KafkaProducer(value_serializer=lambda v: v.SerializeToString())
user = User(id=123)  # 'name' field is required but missing
producer.send('users', user)
Protobuf required fields must be set before serialization.
In proto2, serializing a message with an unset required field raises an EncodeError at SerializeToString() time. Here 'name' is required but missing, so serialization fails inside the value_serializer and the message is never sent. (Note that proto3 removed required fields entirely.)
In Kafka, when evolving schemas for topics using JSON Schema and Protobuf, which statement best describes the compatibility guarantees?
Schema Registry manages compatibility for multiple schema types.
Kafka Schema Registry supports schema evolution for both JSON Schema and Protobuf. Compatibility modes (backward, forward, full, and their transitive variants) can be configured globally or per subject to ensure new schema versions evolve safely against existing data.
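The registry exposes compatibility settings via PUT /config/{subject}. The sketch below shows an assumed request body and illustrates the backward-compatibility idea with plain dicts; the 'email' field is hypothetical.

```python
# Sketch: per-subject compatibility config (body for PUT /config/{subject}),
# plus a toy illustration of what "backward compatible" means.
import json

compatibility_config = {"compatibility": "BACKWARD"}

# Backward compatibility: readers on the NEW schema can still read data
# written with the OLD schema, e.g. adding an optional field with a default.
old_record = {"name": "Alice", "age": 30}
new_schema_defaults = {"email": None}  # hypothetical optional field added later

# A new-schema reader fills missing fields from defaults.
upgraded = {**new_schema_defaults, **old_record}
print(json.dumps(upgraded))  # old data remains readable under the new schema
```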