Process Flow - Common connectors (JDBC, S3, Elasticsearch)
Kafka Source Topic → Connector Configuration → Connector Plugin (e.g. JDBC) → Data Sink
Data flows from Kafka through a connector configured to send data to JDBC, S3, or Elasticsearch sinks.
```properties
name=jdbc-sink-connector
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=my_topic
connection.url=jdbc:postgresql://localhost:5432/mydb
insert.mode=insert
```
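The S3 path follows the same pattern with a different `connector.class`. A minimal sketch, assuming the Confluent S3 sink connector is installed; the bucket name and region are placeholders:

```properties
# Hypothetical S3 sink configuration (bucket/region values are examples only)
name=s3-sink-connector
connector.class=io.confluent.connect.s3.S3SinkConnector
topics=my_topic
s3.bucket.name=my-example-bucket
s3.region=us-east-1
storage.class=io.confluent.connect.s3.storage.S3Storage
format.class=io.confluent.connect.s3.format.json.JsonFormat
# Number of records to batch into each S3 object before writing
flush.size=100
```

`flush.size` controls how many Kafka records are grouped into a single S3 object, which is why the S3 steps below produce a "File object" rather than a per-message row.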
| Step | Action | Connector | Input | Output | Result |
|---|---|---|---|---|---|
| 1 | Read message from Kafka topic | All | Kafka message | Message data | Message ready for processing |
| 2 | Apply connector config | JDBC | Message data | SQL Insert statement | Prepared SQL for DB |
| 3 | Send data to sink | JDBC | SQL Insert statement | Database row | Data stored in DB |
| 4 | Read message from Kafka topic | S3 | Kafka message | Message data | Message ready for processing |
| 5 | Apply connector config | S3 | Message data | File object | Data formatted for S3 |
| 6 | Send data to sink | S3 | File object | S3 bucket file | Data stored in S3 |
| 7 | Read message from Kafka topic | Elasticsearch | Kafka message | Message data | Message ready for processing |
| 8 | Apply connector config | Elasticsearch | Message data | JSON document | Data formatted for ES |
| 9 | Send data to sink | Elasticsearch | JSON document | ES index document | Data indexed in ES |
| 10 | No more messages | All | - | - | Connector idle, waiting for new messages |
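The Elasticsearch path in steps 7-9 could be configured along these lines, assuming the Confluent Elasticsearch sink connector and a local cluster; the connection URL is a placeholder:

```properties
# Hypothetical Elasticsearch sink configuration (connection.url is an example)
name=elasticsearch-sink-connector
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
topics=my_topic
connection.url=http://localhost:9200
# Derive document IDs from Kafka coordinates instead of message keys
key.ignore=true
# Let Elasticsearch infer the mapping rather than requiring a schema
schema.ignore=true
```

With `key.ignore=true`, the connector builds each document ID from the topic, partition, and offset, so every Kafka message becomes a distinct indexed document (step 9's "ES index document").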
| Variable | Start | After Step 1 | After Step 2 | After Step 3 | After Step 4 | After Step 5 | After Step 6 | After Step 7 | After Step 8 | After Step 9 | Final |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Message | None | Message data | SQL Insert statement | None | Message data | File object | None | Message data | JSON document | None | None |
| Data Sink | Empty | Empty | Empty | DB row | DB row | DB row | DB row, S3 file | DB row, S3 file | DB row, S3 file | DB row, S3 file, ES document | All three sinks populated |
Common Kafka sink connectors move data from Kafka topics into external systems. The `connector.class` property selects the sink: JDBC for relational databases, S3 for object storage, Elasticsearch for search. Each connector runs independently but follows the same flow: read a message from the topic, transform it into the sink's format, and write it to the sink. Connectors are configured with properties such as `topics`, connection URLs, and mode settings like `insert.mode`.
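Beyond the per-connector properties above, all three sinks share the converter settings that tell Kafka Connect how to deserialize message keys and values. A minimal sketch using the built-in JSON converter (these properties can be set per connector or in the worker config):

```properties
# Converter settings shared by JDBC, S3, and Elasticsearch sinks
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Set to false when messages are plain JSON without an embedded schema envelope
key.converter.schemas.enable=false
value.converter.schemas.enable=false
```

Note that the JDBC sink generally needs schema information (from an embedded JSON schema or a schema registry) to map fields to table columns, whereas the S3 and Elasticsearch sinks can accept schemaless JSON.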