Kafka / DevOps / ~5 mins

Serialization (String, JSON, Avro) in Kafka - Commands & Configuration

Introduction
Serialization converts data into bytes that Kafka can transmit and store; deserialization reverses the process on the consumer side. It solves the problem of sharing data between different systems by making data easy to transport and understand.
Typical situations where it matters:
When you want to send simple text messages between services using Kafka.
When you need to send structured data, such as user info, in JSON format through Kafka topics.
When you want data to be compact and schema-validated using Avro serialization.
When different applications need to read the same Kafka messages using a shared schema.
When you want to reduce errors by enforcing data format rules before sending messages.
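To make the trade-offs concrete, here is a minimal Python sketch (no Kafka required) that serializes the same record three ways: as a plain string, as JSON, and as a compact fixed binary layout standing in for Avro's binary encoding. The field layout in the binary case is invented for illustration, not Avro's actual wire format.

```python
import json
import struct

user = {"name": "Alice", "age": 30}

# String serialization: human-readable, but the structure is implicit.
as_string = f"{user['name']},{user['age']}".encode("utf-8")

# JSON serialization: structured and self-describing, but verbose.
as_json = json.dumps(user).encode("utf-8")

# Compact binary layout (an illustrative stand-in for Avro's encoding):
# a 1-byte length prefix, the name bytes, then a 4-byte little-endian int.
name_bytes = user["name"].encode("utf-8")
as_binary = struct.pack(f"<B{len(name_bytes)}sI",
                        len(name_bytes), name_bytes, user["age"])

print(len(as_string), len(as_json), len(as_binary))  # → 8 28 10
```

The binary form is noticeably smaller than the JSON form for the same record, which is one reason Avro is favored for high-volume topics.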
Config File - producer.properties
bootstrap.servers=localhost:9092
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
schema.registry.url=http://localhost:8081

This configuration file sets up a Kafka producer to connect to the Kafka server at localhost:9092.

The key.serializer is set to StringSerializer to send keys as plain text.

The value.serializer is set to KafkaAvroSerializer to send values in Avro format.

The schema.registry.url points to the schema registry service that manages Avro schemas.
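A .properties file is just key=value lines. As an illustrative sketch (the parser below is a helper written for this example, not a Kafka API), here is how those four settings could be read into a Python dict:

```python
def load_properties(text: str) -> dict:
    """Parse simple key=value lines, skipping blanks and # comments."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    return config

props = """
bootstrap.servers=localhost:9092
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
schema.registry.url=http://localhost:8081
"""

config = load_properties(props)
print(config["value.serializer"])
```

A real producer client would receive this dict (or the file itself) as its configuration.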

Commands
Starts a Kafka console producer to send messages to 'example-topic'. It allows typing messages with keys separated by ':'.
Terminal
kafka-console-producer --broker-list localhost:9092 --topic example-topic --property parse.key=true --property key.separator=:
Expected Output
An interactive '>' prompt appears; each line typed as key:value is sent as a message.
--broker-list - Specifies the Kafka server address (deprecated in newer Kafka versions; use --bootstrap-server instead)
--topic - Specifies the topic to send messages to
--property parse.key=true - Enables key parsing from input
--property key.separator=: - Defines ':' as the separator between key and value
Sends a JSON string message to 'json-topic'. The console producer reads each input line as plain UTF-8 text, which is equivalent to using StringSerializer for the value, so no extra serializer property is needed.
Terminal
echo '{"name":"Alice","age":30}' | kafka-console-producer --broker-list localhost:9092 --topic json-topic
Expected Output
No output (command runs silently)
--broker-list - Specifies the Kafka server address
--topic - Specifies the topic to send messages to
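What travels over the wire here is just the UTF-8 bytes of the JSON string. A small Python sketch of the round trip a JSON-aware consumer would perform:

```python
import json

# Producer side: the JSON text is serialized to plain UTF-8 bytes.
payload = '{"name":"Alice","age":30}'.encode("utf-8")

# Consumer side: decode the bytes and parse the JSON to recover structure.
user = json.loads(payload.decode("utf-8"))
print(user["name"], user["age"])  # → Alice 30
```

Note that nothing enforces the structure: a consumer only discovers a malformed payload when parsing fails, which is the gap that Avro and a schema registry close.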
Starts an Avro console producer to send messages with a defined Avro schema to 'avro-topic'.
Terminal
kafka-avro-console-producer --broker-list localhost:9092 --topic avro-topic --property schema.registry.url=http://localhost:8081 --property value.schema='{"type":"record","name":"User","fields":[{"name":"name","type":"string"},{"name":"age","type":"int"}]}'
Expected Output
No direct output; the producer waits for JSON-encoded records (one per line) on stdin.
--broker-list - Specifies Kafka server address
--topic - Specifies the topic to send messages to
--property schema.registry.url - Points to the schema registry service
--property value.schema - Defines the Avro schema for the message value
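Real Avro encoding requires a library (such as the avro or fastavro packages, not shown here), but the schema-validation idea can be sketched in plain Python: check each record against the field names and types declared in the schema before sending. The type map below covers only the two primitives used in this schema and is an illustrative simplification.

```python
import json

schema = json.loads(
    '{"type":"record","name":"User","fields":'
    '[{"name":"name","type":"string"},{"name":"age","type":"int"}]}'
)

# Illustrative subset of Avro primitive types mapped to Python types.
AVRO_TYPES = {"string": str, "int": int}

def validate(record: dict, schema: dict) -> bool:
    """Return True if the record matches the schema's declared fields."""
    for field in schema["fields"]:
        value = record.get(field["name"])
        if not isinstance(value, AVRO_TYPES[field["type"]]):
            return False
    return True

print(validate({"name": "Alice", "age": 30}, schema))        # → True
print(validate({"name": "Alice", "age": "thirty"}, schema))  # → False
```

This pre-send check is what KafkaAvroSerializer effectively gives you for free: records that do not match the registered schema are rejected before they reach the topic.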
Starts a Kafka console consumer to read messages from 'example-topic' from the beginning, showing keys and values separated by ':'.
Terminal
kafka-console-consumer --bootstrap-server localhost:9092 --topic example-topic --from-beginning --property print.key=true --property key.separator=:
Expected Output
myKey:myValue
user1:hello
--bootstrap-server - Specifies Kafka server address
--topic - Specifies the topic to read messages from
--from-beginning - Reads all messages from the start
--property print.key=true - Shows message keys in output
--property key.separator=: - Separates key and value in output
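The consumer prints each record as key, separator, value. A small Python sketch of parsing that output back into pairs (the helper function is written for this example; the sample lines match the expected output above):

```python
def parse_records(lines, separator=":"):
    """Split console-consumer output lines into (key, value) pairs."""
    pairs = []
    for line in lines:
        key, _, value = line.partition(separator)
        pairs.append((key, value))
    return pairs

output = ["myKey:myValue", "user1:hello"]
print(parse_records(output))  # → [('myKey', 'myValue'), ('user1', 'hello')]
```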
Key Concept

If you remember nothing else from this pattern, remember: serialization turns data into a format Kafka can send and receive, and choosing the right serializer ensures your data is understood correctly.

Common Mistakes
Mistake: Using StringSerializer for complex structured data without converting it to JSON or Avro.
Why it fails: Kafka treats the data as plain text, losing structure and causing errors in consumers that expect structured data.
Fix: Use JSON or Avro serializers for structured data, and make sure consumers use matching deserializers.
Mistake: Not setting schema.registry.url when using Avro serialization.
Why it fails: The producer cannot register or retrieve schemas, so message sends fail.
Fix: Always configure schema.registry.url to point to your running Schema Registry service.
Mistake: Sending messages without keys when keys are expected for partitioning or message ordering.
Why it fails: Kafka may distribute messages unevenly or lose ordering guarantees.
Fix: Include keys in messages and configure the producer to parse and send them properly.
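The first mistake is easy to reproduce without Kafka: if a producer sends unstructured plain text where the consumer expects JSON, parsing fails on the consumer side. A minimal Python sketch:

```python
import json

# Producer mistakenly sends an unstructured string instead of JSON.
payload = "Alice,30".encode("utf-8")

# Consumer expects JSON and fails to parse the plain-text bytes.
try:
    user = json.loads(payload.decode("utf-8"))
except json.JSONDecodeError as err:
    print(f"Deserialization failed: {err}")
```

With Avro and a schema registry, this mismatch would be caught on the producer side instead of surfacing as a runtime error in every consumer.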
Summary
Configure Kafka producers with appropriate serializers for your data format: String, JSON, or Avro.
Use console producer commands to send messages with keys and values to Kafka topics.
Use console consumer commands to read messages, showing keys and values for verification.
Avro serialization requires a schema registry to manage and validate data schemas.