This tutorial demonstrates how to build Kafka producer and consumer applications in C# that use Schema Registry for message schema management. You'll learn how to configure your .NET applications to serialize and deserialize records, ensuring type safety and schema evolution compatibility. By the end of this tutorial, you'll have working applications that produce and consume device temperature reading records.
The applications in this tutorial use Avro-formatted messages. To use Protobuf or JSON Schema formatting instead, you would swap in the corresponding serializer and deserializer, but otherwise the applications would be structured similarly.
The following steps use Confluent Cloud. To run the tutorial locally with Docker, skip to the Docker instructions section at the bottom.
Clone the confluentinc/tutorials repository and navigate into it:

git clone git@github.com:confluentinc/tutorials.git
cd tutorials

Log in to your Confluent Cloud account:
confluent login --prompt --save

Install a CLI plugin that will streamline the creation of resources in Confluent Cloud:
confluent plugin install confluent-quickstart

Run the plugin from the top-level directory of the tutorials repository to create the Confluent Cloud resources needed for this tutorial. Note that you may specify a different cloud provider (gcp or azure) or region. You can find supported regions in a given cloud provider by running confluent kafka region list --cloud <CLOUD>.
confluent quickstart \
--environment-name kafka-sr-env \
--kafka-cluster-name kafka-sr-cluster \
--create-kafka-key \
--kafka-librdkafka-properties-file ./schema-registry-dotnet/config/cloud-kafka.properties \
--create-sr-key \
--schema-registry-properties-file ./schema-registry-dotnet/config/cloud-sr.properties

The plugin should complete in under a minute.
Create the topic for the application:
confluent kafka topic create readings

Navigate into the application's source code directory:
cd schema-registry-dotnet/src

There are three projects included in the example:
Dependencies (AvroProducer.csproj): the Confluent.Kafka client library and the Confluent.SchemaRegistry.Serdes.Avro package, which provides the Avro serializer and the Schema Registry client.
AvroProducer.cs demonstrates how to produce Avro-serialized messages to Kafka with Schema Registry integration:
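In outline, it looks like the following minimal sketch (not the repository's exact listing; the hard-coded endpoints, the null message key, and the TempReading property names and types are assumptions for illustration, since the real application builds its configuration from the properties files passed on the command line):

using System;
using System.Threading.Tasks;
using Confluent.Kafka;
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

class AvroProducerSketch
{
    static async Task Main()
    {
        // Endpoints are hard-coded here for brevity; the real application
        // loads them from the Kafka and Schema Registry properties files.
        var producerConfig = new ProducerConfig { BootstrapServers = "localhost:9092" };
        var srConfig = new SchemaRegistryConfig { Url = "http://localhost:8081" };

        using var schemaRegistry = new CachedSchemaRegistryClient(srConfig);
        using var producer = new ProducerBuilder<Null, TempReading>(producerConfig)
            // AvroSerializer registers the schema with Schema Registry and
            // encodes each record in the Registry's wire format.
            .SetValueSerializer(new AvroSerializer<TempReading>(schemaRegistry))
            .Build();

        var random = new Random();
        for (var i = 0; i < 10; i++)
        {
            // TempReading is the class generated from the Avro schema; the
            // property names and float temperature type are assumptions here.
            var reading = new TempReading
            {
                deviceId = random.Next(1, 5),
                temperature = (float)(50 + random.NextDouble() * 50)
            };
            await producer.ProduceAsync("readings",
                new Message<Null, TempReading> { Value = reading });
        }

        Console.WriteLine("10 messages produced to topic readings");
    }
}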
Dependencies (AvroConsumer.csproj): the same Confluent.Kafka and Confluent.SchemaRegistry.Serdes.Avro packages as the producer.
AvroConsumer.cs demonstrates how to consume Avro-serialized messages from Kafka with Schema Registry integration:
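Again in outline (a minimal sketch rather than the repository's exact listing, with hard-coded endpoints and a hypothetical consumer group id standing in for the properties-file configuration):

using System;
using System.Threading;
using Confluent.Kafka;
using Confluent.Kafka.SyncOverAsync;
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

class AvroConsumerSketch
{
    static void Main()
    {
        var consumerConfig = new ConsumerConfig
        {
            BootstrapServers = "localhost:9092",       // assumption: loaded from a properties file in the real app
            GroupId = "avro-consumer-sketch",          // hypothetical group id
            AutoOffsetReset = AutoOffsetReset.Earliest // read the topic from the beginning
        };
        var srConfig = new SchemaRegistryConfig { Url = "http://localhost:8081" };

        using var schemaRegistry = new CachedSchemaRegistryClient(srConfig);
        using var consumer = new ConsumerBuilder<Null, TempReading>(consumerConfig)
            // AvroDeserializer is async-only, so wrap it for the synchronous Consume loop.
            .SetValueDeserializer(new AvroDeserializer<TempReading>(schemaRegistry).AsSyncOverAsync())
            .Build();

        consumer.Subscribe("readings");
        try
        {
            while (true)
            {
                var result = consumer.Consume(CancellationToken.None);
                var reading = result.Message.Value;
                Console.WriteLine($"Consumed reading deviceId: {reading.deviceId}, temperature: {reading.temperature}");
            }
        }
        finally
        {
            consumer.Close();
        }
    }
}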
The Common project is a shared library containing utilities and models used by both the producer and consumer applications. It includes configuration loaders (Properties.cs), command-line parsing (CommandLineOptions.cs), and the auto-generated Avro class TempReading representing temperature readings with deviceId and temperature fields.
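To give a sense of what Properties.cs does, a loader for librdkafka-style key=value properties files can be as small as the following sketch (the helper names are hypothetical, not the repository's exact code):

using System.Collections.Generic;
using System.IO;
using System.Linq;
using Confluent.Kafka;

static class PropertiesSketch
{
    // Parse key=value lines, skipping blank lines and # comments.
    static Dictionary<string, string> Load(string path) =>
        File.ReadAllLines(path)
            .Select(line => line.Trim())
            .Where(line => line.Length > 0 && !line.StartsWith("#"))
            .Select(line => line.Split('=', 2))
            .ToDictionary(parts => parts[0].Trim(), parts => parts[1].Trim());

    // The Confluent.Kafka config classes accept the resulting dictionary directly.
    static ProducerConfig ProducerConfigFrom(string path) => new ProducerConfig(Load(path));
}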
Build the producer and consumer applications by running the following command from the schema-registry-dotnet directory:
dotnet build schema-registry-dotnet.sln

Run the producer application, passing the Kafka and Schema Registry client configuration files generated when you created Confluent Cloud resources:
dotnet run \
--project src/AvroProducer/AvroProducer.csproj -- \
--kafka-properties-file config/cloud-kafka.properties \
--sr-properties-file config/cloud-sr.properties

You will see that the ten readings produced to Kafka are confirmed on the console like this:
10 messages produced to topic readings

Run the consumer application, passing the Kafka and Schema Registry client configuration files generated when you created Confluent Cloud resources:
dotnet run \
--project src/AvroConsumer/AvroConsumer.csproj -- \
--kafka-properties-file config/cloud-kafka.properties \
--sr-properties-file config/cloud-sr.properties

You will see output like the following:
Consumed reading deviceId: 1, temperature: 66.65975
Consumed reading deviceId: 3, temperature: 57.096996
Consumed reading deviceId: 3, temperature: 51.907024
Consumed reading deviceId: 2, temperature: 87.31283
Consumed reading deviceId: 3, temperature: 67.84723
Consumed reading deviceId: 2, temperature: 71.68407
Consumed reading deviceId: 3, temperature: 77.41191
Consumed reading deviceId: 2, temperature: 78.22028
Consumed reading deviceId: 3, temperature: 78.91205
Consumed reading deviceId: 3, temperature: 58.753788

When you're finished, delete the kafka-sr-env environment. First, get its environment ID (of the form env-123456):
confluent environment list

Delete the environment, including all resources created for this tutorial:
confluent environment delete <ENVIRONMENT ID>

Docker instructions

To run the tutorial locally, clone the confluentinc/tutorials repository (if you haven't already) and navigate into it:

git clone git@github.com:confluentinc/tutorials.git
cd tutorials

Start Kafka and Schema Registry with the following command, run from the top-level directory of the tutorials repository:
docker compose -f ./docker/docker-compose-kafka-sr.yml up -d

Open a shell in the broker container:
docker exec -it broker /bin/bash

Create the topic for the application:
kafka-topics --bootstrap-server localhost:9092 --create --topic readings

Back on your local machine, navigate into the application's source code directory:
cd schema-registry-dotnet/src

The example contains the same three projects described in the walkthrough above: the AvroProducer and AvroConsumer applications and the shared Common library.
Build the producer and consumer applications by running the following command from the schema-registry-dotnet directory:
dotnet build schema-registry-dotnet.sln

Run the producer application, passing the client configuration files pointing to Kafka and Schema Registry running in Docker:
dotnet run \
--project src/AvroProducer/AvroProducer.csproj -- \
--kafka-properties-file config/local-kafka.properties \
--sr-properties-file config/local-sr.properties

You will see that the ten readings produced to Kafka are confirmed on the console like this:
10 messages produced to topic readings

Now run the consumer application:
dotnet run \
--project src/AvroConsumer/AvroConsumer.csproj -- \
--kafka-properties-file config/local-kafka.properties \
--sr-properties-file config/local-sr.properties

You will see output like the following:
Consumed reading deviceId: 3, temperature: 81.7913
Consumed reading deviceId: 4, temperature: 79.99652
Consumed reading deviceId: 4, temperature: 64.110596
Consumed reading deviceId: 4, temperature: 89.80548
Consumed reading deviceId: 1, temperature: 67.72774
Consumed reading deviceId: 2, temperature: 88.04513
Consumed reading deviceId: 1, temperature: 98.9442
Consumed reading deviceId: 3, temperature: 57.281647
Consumed reading deviceId: 1, temperature: 56.34163
Consumed reading deviceId: 3, temperature: 54.051064

From your local machine, stop the broker and Schema Registry containers:
docker compose -f ./docker/docker-compose-kafka-sr.yml down