This tutorial demonstrates how to build Kafka producer and consumer applications in Go that use Schema Registry for message schema management. You'll learn how to configure your Go applications to serialize and deserialize records, ensuring type safety and schema evolution compatibility. By the end of this tutorial, you'll have working applications that produce and consume device temperature reading records.
The applications in this tutorial use Avro-formatted messages. To use Protobuf or JSON Schema formatting instead, you would swap in the corresponding serializer and deserializer; otherwise the applications would be structured the same way.
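For reference, with the confluent-kafka-go client the format choice largely comes down to which serde package you instantiate. A minimal sketch, assuming the v2 client and its bundled Schema Registry serdes (error handling elided):

package main

import (
	"log"

	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry/serde"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry/serde/avro"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry/serde/jsonschema"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry/serde/protobuf"
)

func main() {
	client, err := schemaregistry.NewClient(schemaregistry.NewConfig("http://localhost:8081"))
	if err != nil {
		log.Fatal(err)
	}

	// Avro, as used in this tutorial: the generic serde infers the Avro
	// schema from the Go struct it is given.
	avroSer, _ := avro.NewGenericSerializer(client, serde.ValueSerde, avro.NewSerializerConfig())

	// Protobuf: serializes protoc-generated message types instead.
	protoSer, _ := protobuf.NewSerializer(client, serde.ValueSerde, protobuf.NewSerializerConfig())

	// JSON Schema: serializes plain Go values against a JSON Schema.
	jsonSer, _ := jsonschema.NewSerializer(client, serde.ValueSerde, jsonschema.NewSerializerConfig())

	_, _, _ = avroSer, protoSer, jsonSer
}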
The following steps use Confluent Cloud. To run the tutorial locally with Docker, skip to the Docker instructions section at the bottom.
Clone the tutorials repository:

git clone git@github.com:confluentinc/tutorials.git
cd tutorials

Log in to your Confluent Cloud account:
confluent login --prompt --save

Install a CLI plugin that will streamline the creation of resources in Confluent Cloud:
confluent plugin install confluent-quickstart

Run the plugin from the top-level directory of the tutorials repository to create the Confluent Cloud resources needed for this tutorial. Note that you may specify a different cloud provider (gcp or azure) or region. You can find supported regions in a given cloud provider by running confluent kafka region list --cloud <CLOUD>.
confluent quickstart \
--environment-name kafka-sr-env \
--kafka-cluster-name kafka-sr-cluster \
--create-kafka-key \
--kafka-librdkafka-properties-file ./schema-registry-go/config/cloud-kafka.properties \
--create-sr-key \
  --schema-registry-properties-file ./schema-registry-go/config/cloud-sr.properties

The plugin should complete in under a minute.
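The two generated files hold the client connection details. Their exact contents may vary, but the Kafka file is a standard librdkafka properties file along these lines (values illustrative):

bootstrap.servers=<BOOTSTRAP ENDPOINT>:9092
security.protocol=SASL_SSL
sasl.mechanisms=PLAIN
sasl.username=<KAFKA API KEY>
sasl.password=<KAFKA API SECRET>

and the Schema Registry file typically carries the endpoint plus basic auth credentials:

schema.registry.url=<SCHEMA REGISTRY ENDPOINT>
basic.auth.credentials.source=USER_INFO
basic.auth.user.info=<SR API KEY>:<SR API SECRET>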
Create the topic for the application:
confluent kafka topic create readings

Navigate into the application's source code directory:
cd schema-registry-go/src

This folder contains the following source files (the key pieces are sketched after the list):
avro_producer.go: Implements a Kafka producer that generates random Avro-formatted temperature readings and produces them to the readings topic.
avro_consumer.go: Implements a Kafka consumer that subscribes to the readings topic and continuously polls for new messages, deserializing them from Avro format and logging them to the console.
temp_reading.go: Defines the TempReading domain object struct with deviceId (string) and temperature (float32) fields, tagged for Avro serialization.
utils.go: Contains shared utility functions used by both producer and consumer: ParseConfigFlags() for parsing command-line flags, LoadKafkaConfig() for loading Kafka configuration from properties files, and CreateSchemaRegistryClient() for creating a Schema Registry client with optional basic authentication.
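For orientation before compiling, here is a sketch of the first and last pieces described above. The struct tags, helper name, and use of the config struct's basic-auth fields are assumptions based on these descriptions, not the repo's exact code:

package main

import (
	"fmt"
	"log"

	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry"
)

// TempReading mirrors the Avro record produced and consumed in this tutorial.
// Assumption: the apps use the generic Avro serde, which derives the schema
// from the struct, mapping Go fields to Avro field names via these tags.
type TempReading struct {
	DeviceID    string  `json:"deviceId"`
	Temperature float32 `json:"temperature"`
}

// newSchemaRegistryClient sketches what CreateSchemaRegistryClient in utils.go
// plausibly does: configure basic auth only when credentials are provided
// (needed for Confluent Cloud; the local Docker setup runs unauthenticated).
func newSchemaRegistryClient(url, apiKey, apiSecret string) (schemaregistry.Client, error) {
	conf := schemaregistry.NewConfig(url)
	if apiKey != "" {
		conf.BasicAuthCredentialsSource = "USER_INFO"
		conf.BasicAuthUserInfo = apiKey + ":" + apiSecret
	}
	return schemaregistry.NewClient(conf)
}

func main() {
	client, err := newSchemaRegistryClient("http://localhost:8081", "", "")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("Schema Registry client ready:", client != nil)
}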
Compile the producer application from the schema-registry-go folder:
go build -o out/avro_producer \
src/temp_reading.go \
src/utils.go \
  src/avro_producer.go

Run the producer application, passing the Kafka and Schema Registry client configuration files generated when you created Confluent Cloud resources:
./out/avro_producer \
--kafka-properties-file ./config/cloud-kafka.properties \
  --sr-properties-file ./config/cloud-sr.properties
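While that runs, it helps to see the shape of the producer. Here is a condensed sketch of what avro_producer.go does, hardcoding the connection details that the real application reads from the properties files (a sketch under the assumptions above, not the repo's exact code):

package main

import (
	"fmt"
	"log"

	"github.com/confluentinc/confluent-kafka-go/v2/kafka"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry/serde"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry/serde/avro"
)

// TempReading as sketched earlier.
type TempReading struct {
	DeviceID    string  `json:"deviceId"`
	Temperature float32 `json:"temperature"`
}

func main() {
	// The real app builds these configs from the properties files passed
	// on the command line; hardcoded here to keep the sketch short.
	p, err := kafka.NewProducer(&kafka.ConfigMap{"bootstrap.servers": "localhost:9092"})
	if err != nil {
		log.Fatal(err)
	}
	defer p.Close()

	client, err := schemaregistry.NewClient(schemaregistry.NewConfig("http://localhost:8081"))
	if err != nil {
		log.Fatal(err)
	}
	ser, err := avro.NewGenericSerializer(client, serde.ValueSerde, avro.NewSerializerConfig())
	if err != nil {
		log.Fatal(err)
	}

	topic := "readings"
	reading := TempReading{DeviceID: "4", Temperature: 71.06}

	// Serialize writes the Schema Registry wire format: a magic byte, the
	// schema ID, then the Avro-encoded payload.
	payload, err := ser.Serialize(topic, &reading)
	if err != nil {
		log.Fatal(err)
	}

	deliveryChan := make(chan kafka.Event, 1)
	err = p.Produce(&kafka.Message{
		TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
		Value:          payload,
	}, deliveryChan)
	if err != nil {
		log.Fatal(err)
	}

	// Wait for the broker's acknowledgment and log the delivery report.
	m := (<-deliveryChan).(*kafka.Message)
	if m.TopicPartition.Error != nil {
		log.Fatal(m.TopicPartition.Error)
	}
	fmt.Printf("Delivered message to topic %s [%d] at offset %v: DeviceId=%s, Temperature=%.2f\n",
		*m.TopicPartition.Topic, m.TopicPartition.Partition, m.TopicPartition.Offset,
		reading.DeviceID, reading.Temperature)
}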
You will see that ten readings produced to Kafka are logged to the console like this:

Delivered message to topic readings [0] at offset 0: DeviceId=4, Temperature=71.06
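Because the serializer's default configuration auto-registers schemas, that first run also registered the reading's Avro schema under the subject readings-value (topic name plus "-value", the default subject naming strategy). If you'd like to verify, query the Schema Registry REST API with the endpoint and credentials from cloud-sr.properties:

curl -u <SR API KEY>:<SR API SECRET> <SCHEMA REGISTRY ENDPOINT>/subjects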
Compile the consumer application:

go build -o out/avro_consumer \
src/temp_reading.go \
src/utils.go \
  src/avro_consumer.go

Run the consumer application, passing the Kafka and Schema Registry client configuration files generated when you created Confluent Cloud resources:
./out/avro_consumer \
--kafka-properties-file ./config/cloud-kafka.properties \
  --sr-properties-file ./config/cloud-sr.properties
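The consumer mirrors the producer: poll, then hand the raw bytes to the Avro deserializer, which fetches the writer's schema from Schema Registry using the schema ID embedded in each message. A condensed sketch under the same assumptions as the producer sketch:

package main

import (
	"log"
	"time"

	"github.com/confluentinc/confluent-kafka-go/v2/kafka"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry/serde"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry/serde/avro"
)

// TempReading as sketched earlier.
type TempReading struct {
	DeviceID    string  `json:"deviceId"`
	Temperature float32 `json:"temperature"`
}

func main() {
	c, err := kafka.NewConsumer(&kafka.ConfigMap{
		"bootstrap.servers": "localhost:9092",
		"group.id":          "readings-consumer", // assumed group ID
		"auto.offset.reset": "earliest",
	})
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	client, err := schemaregistry.NewClient(schemaregistry.NewConfig("http://localhost:8081"))
	if err != nil {
		log.Fatal(err)
	}
	deser, err := avro.NewGenericDeserializer(client, serde.ValueSerde, avro.NewDeserializerConfig())
	if err != nil {
		log.Fatal(err)
	}

	if err := c.SubscribeTopics([]string{"readings"}, nil); err != nil {
		log.Fatal(err)
	}

	for {
		msg, err := c.ReadMessage(time.Second)
		if err != nil {
			// ReadMessage returns an error on timeout while waiting for
			// messages; this sketch just keeps polling.
			continue
		}
		var reading TempReading
		if err := deser.DeserializeInto(*msg.TopicPartition.Topic, msg.Value, &reading); err != nil {
			log.Printf("deserialization failed: %v", err)
			continue
		}
		log.Printf("%+v", reading) // prints e.g. {DeviceID:1 Temperature:62.20346}
	}
}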
You will see output like the following prepended with timestamps:

{DeviceID:1 Temperature:62.20346}
{DeviceID:2 Temperature:88.26609}
{DeviceID:4 Temperature:51.299347}
{DeviceID:3 Temperature:99.0374}
{DeviceID:3 Temperature:90.82821}
{DeviceID:2 Temperature:54.175503}
{DeviceID:2 Temperature:67.71871}
{DeviceID:2 Temperature:73.04611}
{DeviceID:1 Temperature:98.31503}
{DeviceID:1 Temperature:65.337845}

When you're finished, delete the kafka-sr-env environment. First, get its environment ID (of the form env-123456):
confluent environment list

Delete the environment, including all resources created for this tutorial:
confluent environment delete <ENVIRONMENT ID>

Docker instructions

Clone the tutorials repository:

git clone git@github.com:confluentinc/tutorials.git
cd tutorials

Start Kafka and Schema Registry with the following command from the top level of the tutorials repository:
docker compose -f ./docker/docker-compose-kafka-sr.yml up -d

Open a shell in the broker container:
docker exec -it broker /bin/bash

Create the topic for the application:
kafka-topics --bootstrap-server localhost:9092 --create --topic readings
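Unlike the Confluent Cloud path, no credentials are needed locally. The local configuration files passed to the applications below are presumably little more than connection endpoints, along these lines (illustrative, assuming the default ports):

# local-kafka.properties
bootstrap.servers=localhost:9092

# local-sr.properties
schema.registry.url=http://localhost:8081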
Navigate into the application's source code directory:

cd schema-registry-go/src

This folder contains the following source files:
avro_producer.go: Implements a Kafka producer that generates random Avro-formatted temperature readings and produces them to the readings topic.
avro_consumer.go: Implements a Kafka consumer that subscribes to the readings topic and continuously polls for new messages, deserializing them from Avro format and logging them to the console.
temp_reading.go: Defines the TempReading domain object struct with deviceId (string) and temperature (float32) fields, tagged for Avro serialization.
utils.go: Contains shared utility functions used by both producer and consumer: ParseConfigFlags() for parsing command-line flags, LoadKafkaConfig() for loading Kafka configuration from properties files, and CreateSchemaRegistryClient() for creating a Schema Registry client with optional basic authentication.
Compile the producer application from the schema-registry-go folder:
go build -o out/avro_producer \
src/temp_reading.go \
src/utils.go \
  src/avro_producer.go

Run the producer application, passing the local Kafka and Schema Registry client configuration files:
./out/avro_producer \
--kafka-properties-file ./config/local-kafka.properties \
  --sr-properties-file ./config/local-sr.properties

You will see that ten readings produced to Kafka are logged to the console like this:
Delivered message to topic readings [0] at offset 0: DeviceId=4, Temperature=71.06
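The local Schema Registry runs unauthenticated, so you can inspect the schema that the serializer auto-registered under the readings-value subject (assuming Schema Registry's default port 8081):

curl http://localhost:8081/subjects/readings-value/versions/latest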
Compile the consumer application:

go build -o out/avro_consumer \
src/temp_reading.go \
src/utils.go \
  src/avro_consumer.go

Run the consumer application, again passing the local Kafka and Schema Registry client configuration files:
./out/avro_consumer \
--kafka-properties-file ./config/local-kafka.properties \
  --sr-properties-file ./config/local-sr.properties

You will see output like the following prepended with timestamps:
{DeviceID:1 Temperature:62.20346}
{DeviceID:2 Temperature:88.26609}
{DeviceID:4 Temperature:51.299347}
{DeviceID:3 Temperature:99.0374}
{DeviceID:3 Temperature:90.82821}
{DeviceID:2 Temperature:54.175503}
{DeviceID:2 Temperature:67.71871}
{DeviceID:2 Temperature:73.04611}
{DeviceID:1 Temperature:98.31503}
{DeviceID:1 Temperature:65.337845}

From your local machine, stop the broker and Schema Registry containers:
docker compose -f ./docker/docker-compose-kafka-sr.yml down