Kafka · devops · ~20 mins

Why Connectors Integrate External Systems with Kafka - Challenge Your Understanding

Challenge - 5 Problems
🧠 Conceptual · Intermediate
Purpose of Kafka Connectors
Why do Kafka connectors integrate external systems with Kafka?
A. To monitor Kafka cluster health continuously
B. To enable seamless data flow between Kafka and other systems without custom coding
C. To encrypt Kafka messages automatically
D. To replace Kafka brokers with external databases
💡 Hint: Think about how connectors help move data easily.
🧠 Conceptual · Intermediate
Role of Source Connectors
What is the main role of a Kafka source connector?
A. To ingest data from external systems into Kafka topics
B. To manage Kafka cluster configurations
C. To send data from Kafka to external systems
D. To compress Kafka messages
💡 Hint: 'Source' means where data comes from.
Predict Output · Advanced
Output of Kafka Connect Configuration
Given this Kafka Connect configuration snippet, what is the expected behavior?
name=jdbc-source-connector
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
topic.prefix=jdbc_
connection.url=jdbc:postgresql://localhost:5432/mydb
mode=incrementing
incrementing.column.name=id
A. Kafka Connect will fail due to a missing sink configuration
B. Kafka will continuously send data to the PostgreSQL database
C. Data from the PostgreSQL table will be streamed incrementally into Kafka topics prefixed with 'jdbc_'
D. Data will be deleted from PostgreSQL after streaming
💡 Hint: Look at the connector class and the mode setting.
Predict Output · Advanced
Result of Sink Connector Behavior
What happens when a Kafka sink connector is configured to write to an external system?
name=s3-sink-connector
connector.class=io.confluent.connect.s3.S3SinkConnector
topics=my_topic
s3.bucket.name=my-bucket
flush.size=1000
A. Messages are deleted from Kafka after 1000 records
B. Data is read from S3 and sent to the Kafka topic 'my_topic'
C. Kafka Connect will produce an error because flush.size is invalid
D. Messages from 'my_topic' are batched and written to the S3 bucket 'my-bucket' every 1000 records
💡 Hint: Sink connectors write data out of Kafka.
🧠 Conceptual · Expert
Why Use Kafka Connect Instead of Custom Code?
What is the main advantage of using Kafka Connect to integrate external systems instead of writing custom integration code?
A. Kafka Connect provides a scalable, fault-tolerant, and reusable framework that reduces development effort
B. Custom code always runs faster than Kafka Connect
C. Kafka Connect requires no configuration and works automatically
D. Custom code is easier to maintain than Kafka Connect connectors
💡 Hint: Think about maintenance and scalability benefits.
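Background for the last question: with Kafka Connect, an integration is declared entirely as configuration and registered through the Connect worker's REST API (port 8083 by default), rather than written as custom producer/consumer code. A minimal sketch in Python, reusing the JDBC source settings from the quiz; the connector name, database URL, and worker address are illustrative placeholders:

```python
import json

# Declarative connector definition: integration is configuration, not code.
# The name and connection.url below are illustrative placeholders.
connector = {
    "name": "jdbc-source-connector",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:postgresql://localhost:5432/mydb",
        "topic.prefix": "jdbc_",
        "mode": "incrementing",
        "incrementing.column.name": "id",
    },
}

payload = json.dumps(connector, indent=2)
print(payload)

# To deploy, this JSON would be POSTed to a running Connect worker, e.g.:
#   curl -X POST -H "Content-Type: application/json" \
#        --data "$payload" http://localhost:8083/connectors
```

Because the integration is only this JSON document, Connect can restart, rebalance, and scale the connector's tasks across workers without any application code changes.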