Challenge - 5 Problems
Kafka Transform Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
Intermediate · 2:00
Output of a Kafka Connect transform chain
Consider a Kafka Connect source connector with the following transform chain applied to each record:

{
  "transforms": "dropField,replaceValue",
  "transforms.dropField.type": "org.apache.kafka.connect.transforms.ReplaceField$Value",
  "transforms.dropField.blacklist": "password",
  "transforms.replaceValue.type": "org.apache.kafka.connect.transforms.ReplaceField$Value",
  "transforms.replaceValue.whitelist": "username"
}

Given a record with value:

{"username": "alice", "password": "secret", "email": "alice@example.com"}

What will be the value of the record after the transform chain?
💡 Hint
Think about the order of transforms and what each one keeps or removes.
✅ Explanation
The first transform drops the 'password' field, so the record becomes {"username": "alice", "email": "alice@example.com"}. The second transform keeps only the 'username' field, removing 'email'. So the final value is {"username": "alice"}.
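The chain above can be simulated with a minimal Python sketch. These are plain dict operations standing in for the actual `ReplaceField$Value` transform, not the Connect classes themselves:

```python
# Sketch (not the actual Connect classes) of the two ReplaceField steps:
# a blacklist drop followed by a whitelist keep.

def drop_fields(record, blacklist):
    """Mimics ReplaceField$Value with a blacklist: remove the listed fields."""
    return {k: v for k, v in record.items() if k not in blacklist}

def keep_fields(record, whitelist):
    """Mimics ReplaceField$Value with a whitelist: keep only the listed fields."""
    return {k: v for k, v in record.items() if k in whitelist}

record = {"username": "alice", "password": "secret", "email": "alice@example.com"}
after_drop = drop_fields(record, {"password"})  # 'password' removed
final = keep_fields(after_drop, {"username"})   # only 'username' survives
print(final)  # {'username': 'alice'}
```

The order matters: the whitelist step runs on the output of the blacklist step, not on the original record.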
❓ Predict Output
Intermediate · 2:00
Result of converter chain with JSON and Avro
A Kafka Connect sink connector uses a converter chain where the key converter is JSON and the value converter is Avro. The source sends a record with key:

{"id": 123}

and value:

{"name": "Bob", "age": 30}

What is the expected format of the key and value when the sink receives the record?
💡 Hint
Remember the converter chain applies separately to key and value.
✅ Explanation
The key converter is JSON, so the key travels through Kafka as JSON bytes; the value converter is Avro, so the value travels as Avro-serialized bytes. Because converters are configured and applied independently for key and value, the sink's converters deserialize each side separately and hand the sink task both as Connect data structures.
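The key/value split can be sketched in Python. The JSON side below is real serialization; the Avro side is a labeled stand-in, since real `AvroConverter` output requires an Avro schema and a Schema Registry:

```python
import json

# Sketch of the converter split: key and value converters apply independently.
# JSON serialization here is real; avro_converter_stub is a stand-in only.

def json_converter(data):
    """Serialize to JSON bytes, as JsonConverter does with schemas disabled."""
    return json.dumps(data).encode("utf-8")

def avro_converter_stub(data):
    """Stand-in: the real Confluent AvroConverter emits a magic byte,
    a schema ID, and Avro binary -- not this repr-based placeholder."""
    return b"\x00" + repr(data).encode("utf-8")

key_bytes = json_converter({"id": 123})
value_bytes = avro_converter_stub({"name": "Bob", "age": 30})
print(key_bytes)  # b'{"id": 123}'
```

The point of the sketch is only that the two converters never see each other's data: swapping one has no effect on the other side of the record.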
🔧 Debug
Advanced · 2:00
Identify the error in transform chain configuration
A user configures a Kafka Connect transform chain as follows:

{
  "transforms": "maskField",
  "transforms.maskField.type": "org.apache.kafka.connect.transforms.MaskField$Value",
  "transforms.maskField.fields": "ssn"
}

But the connector fails to start with an error about the transform class not being found. What is the most likely cause?
💡 Hint
Check the exact class name spelling and casing for the transform.
✅ Explanation
Transform class names are case-sensitive and must match the fully qualified name exactly, including the $Value inner-class suffix. 'org.apache.kafka.connect.transforms.MaskField$Value' is correct; any deviation in spelling or casing (e.g. 'Maskfield$value' or a missing '$Value') means Connect cannot load the class.
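The failure mode can be illustrated with a small Python sketch of an exact-match, case-sensitive class lookup. The registry dict and error message here are illustrative, not Connect's actual plugin loader:

```python
# Sketch: class lookup is by exact, case-sensitive fully qualified name.
# The registry below is illustrative, not Connect's real plugin loader.

REGISTERED = {
    "org.apache.kafka.connect.transforms.MaskField$Value": object,  # correct name
}

def load_transform(class_name):
    if class_name not in REGISTERED:
        raise ValueError(f"Transform class not found: {class_name}")
    return REGISTERED[class_name]

load_transform("org.apache.kafka.connect.transforms.MaskField$Value")      # loads
try:
    load_transform("org.apache.kafka.connect.transforms.Maskfield$value")  # wrong casing
except ValueError as e:
    print(e)
```

A single wrong character in the fully qualified name is enough to trigger the "class not found" failure at startup.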
🧠 Conceptual
Advanced · 2:00
Understanding transform chain order impact
In Kafka Connect, you have two transforms in a chain:

1. ReplaceField to whitelist only 'user' and 'email'
2. MaskField to mask the 'email' field

If you reverse the order of these transforms, what is the impact on the final record?
💡 Hint
Think about which fields exist when each transform runs.
✅ Explanation
With ReplaceField first, the record is reduced to 'user' and 'email', and MaskField then masks 'email'. With MaskField first, 'email' is masked on the full record, and ReplaceField then keeps 'user' and the already-masked 'email'; in this particular chain the final record is the same either way, because the whitelist keeps 'email'. Order matters in general, though: if the whitelist did not include 'email', reversing the order would mean MaskField runs on a record where the field has already been removed, which can cause it to fail or have no effect.
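Both orders can be compared with a quick Python sketch using plain dicts (not Connect's Struct API). Because the whitelist keeps 'email', the two orders converge here; the divergence appears only when a transform needs a field that an earlier transform already removed:

```python
# Sketch contrasting the two transform orders with plain dicts.

def whitelist(record, keep):
    """Mimics ReplaceField$Value with a whitelist."""
    return {k: v for k, v in record.items() if k in keep}

def mask(record, field):
    """Mimics MaskField: replace the field's value with an empty mask.
    (The real MaskField may instead fail if the field is missing.)"""
    if field not in record:
        return record
    return {**record, field: ""}

record = {"user": "alice", "email": "a@x.com", "age": 30}

r1 = mask(whitelist(record, {"user", "email"}), "email")  # ReplaceField, then MaskField
r2 = whitelist(mask(record, "email"), {"user", "email"})  # MaskField, then ReplaceField
print(r1 == r2)  # True here, because the whitelist keeps 'email'
```

Try removing 'email' from the whitelist: the first order then hands MaskField a record with no 'email' field at all, which is where reversing order starts to matter.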
❓ Predict Output
Expert · 3:00
Output of complex converter and transform chain
A Kafka Connect source connector uses the following configuration:

{
  "key.converter": "org.apache.kafka.connect.json.JsonConverter",
  "key.converter.schemas.enable": false,
  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "value.converter.schema.registry.url": "http://localhost:8081",
  "transforms": "unwrap,dropFields",
  "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
  "transforms.dropFields.type": "org.apache.kafka.connect.transforms.ReplaceField$Value",
  "transforms.dropFields.blacklist": "metadata,internal"
}

Given a source record value:

{"before": null, "after": {"id": 1, "name": "John", "metadata": "info", "internal": "secret"}}

What will be the value passed to the sink after the transform chain?
💡 Hint
The unwrap transform extracts the 'after' field, then dropFields removes blacklisted fields.
✅ Explanation
The unwrap transform extracts the 'after' object, so the record becomes {"id": 1, "name": "John", "metadata": "info", "internal": "secret"}. Then the dropFields transform removes 'metadata' and 'internal', leaving {"id": 1, "name": "John"}.
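The two-step chain can be simulated with a short Python sketch. These dict helpers stand in for Debezium's `ExtractNewRecordState` and Connect's `ReplaceField$Value`; they are not the real SMT classes:

```python
# Sketch of the unwrap + blacklist chain using plain dicts.

def extract_new_record_state(record):
    """Mimics io.debezium.transforms.ExtractNewRecordState:
    replace the change-event envelope with its 'after' payload."""
    return record["after"]

def drop_fields(record, blacklist):
    """Mimics ReplaceField$Value with a blacklist: remove the listed fields."""
    return {k: v for k, v in record.items() if k not in blacklist}

source = {
    "before": None,
    "after": {"id": 1, "name": "John", "metadata": "info", "internal": "secret"},
}

unwrapped = extract_new_record_state(source)          # envelope removed
final = drop_fields(unwrapped, {"metadata", "internal"})
print(final)  # {'id': 1, 'name': 'John'}
```

Note that dropFields operates on the already-unwrapped record: if the transforms were listed in the opposite order, the blacklist would run against the envelope ('before'/'after') and match nothing.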