This tutorial demonstrates how to build Kafka producer and consumer applications in JavaScript that use Schema Registry for message schema management. You'll learn how to configure your JavaScript applications to serialize and deserialize records, ensuring type safety and schema evolution compatibility. By the end of this tutorial, you'll have working applications that produce and consume device temperature reading records.
The applications in this tutorial use Avro-formatted messages. To use Protobuf or JSON Schema instead, you would swap in a different serializer and deserializer, but the applications would otherwise be structured the same way.
The following steps use Confluent Cloud. To run the tutorial locally with Docker, skip to the Docker instructions section at the bottom.
git clone git@github.com:confluentinc/tutorials.git
cd tutorials

Log in to your Confluent Cloud account:
confluent login --prompt --save

Install a CLI plugin that streamlines the creation of resources in Confluent Cloud:
confluent plugin install confluent-quickstart

Run the plugin from the top-level directory of the tutorials repository to create the Confluent Cloud resources needed for this tutorial. Note that you may specify a different cloud provider (gcp or azure) or region. You can find the supported regions in a given cloud provider by running confluent kafka region list --cloud <CLOUD>.
confluent quickstart \
--environment-name kafka-sr-env \
--kafka-cluster-name kafka-sr-cluster \
--create-kafka-key \
--kafka-librdkafka-properties-file ./schema-registry-js/config/cloud-kafka.properties \
--create-sr-key \
--schema-registry-properties-file ./schema-registry-js/config/cloud-sr.properties

The plugin should complete in under a minute.
Create the topic for the application:
confluent kafka topic create readings

Navigate into the application's source code directory:
cd schema-registry-js/src

The application consists of three JavaScript files:
avro_producer.js - Produces Avro-formatted messages to Kafka using Schema Registry.
avro_consumer.js - Consumes Avro-formatted messages from Kafka using Schema Registry.
utils.js - Contains shared utility functions.
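The producer registers an Avro schema with Schema Registry for the message value. A schema for these device temperature readings might look like the following sketch; the field names match the consumer output later in this tutorial, but the namespace is an assumption, not taken from the repository:

```json
{
  "type": "record",
  "namespace": "io.confluent.tutorials",
  "name": "TempReading",
  "fields": [
    { "name": "deviceId", "type": "string" },
    { "name": "temperature", "type": "double" }
  ]
}
```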
Install the @confluentinc/kafka-javascript and @confluentinc/schemaregistry dependencies by running the following command in the schema-registry-js/src directory:
npm install

Run the producer application, passing the Kafka and Schema Registry client configuration files generated when you created Confluent Cloud resources:
node avro_producer.js \
--kafka-properties-file ../config/cloud-kafka.properties \
--sr-properties-file ../config/cloud-sr.properties

You will see the following message, along with logging as the producer connects and disconnects:
Produced 10 readings

Run the consumer application, passing in the same configuration used when running the producer:
node avro_consumer.js \
--kafka-properties-file ../config/cloud-kafka.properties \
--sr-properties-file ../config/cloud-sr.properties

You will see message values like the following, along with logging when the client connects and disconnects:
TempReading { deviceId: '2', temperature: 78.34902954101562 }
TempReading { deviceId: '3', temperature: 91.36271667480469 }
TempReading { deviceId: '4', temperature: 73.98355865478516 }
TempReading { deviceId: '3', temperature: 54.87724685668945 }
TempReading { deviceId: '2', temperature: 83.80644989013672 }
TempReading { deviceId: '2', temperature: 52.60075378417969 }
TempReading { deviceId: '2', temperature: 95.52684783935547 }
TempReading { deviceId: '2', temperature: 75.393798828125 }
TempReading { deviceId: '1', temperature: 79.79203796386719 }
TempReading { deviceId: '4', temperature: 96.6504135131836 }

When you're finished, delete the kafka-sr-env environment. First, get its environment ID (of the form env-123456):
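The readings above are randomly generated by the producer. A loop that yields values in this shape might look like the following sketch; the helper name and the value ranges are assumptions based on the sample output, not the repository's actual code:

```javascript
// Sketch: generate random device temperature readings shaped like the
// consumer output above. The device ID range and temperature range are
// assumptions inferred from the sample values.
function randomReading() {
  const deviceId = String(1 + Math.floor(Math.random() * 4)); // '1'..'4'
  const temperature = 50 + Math.random() * 50;                // [50.0, 100.0)
  return { deviceId, temperature };
}

const readings = Array.from({ length: 10 }, randomReading);
for (const reading of readings) {
  console.log("TempReading", reading);
}
```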
confluent environment list

Delete the environment, including all resources created for this tutorial:
confluent environment delete <ENVIRONMENT ID>

To run the tutorial locally with Docker, first clone the tutorials repository if you haven't already:

git clone git@github.com:confluentinc/tutorials.git
cd tutorials

Start Kafka and Schema Registry with the following command from the top-level tutorials repository directory:
docker compose -f ./docker/docker-compose-kafka-sr.yml up -d

Open a shell in the broker container:
docker exec -it broker /bin/bash

Create the topic for the application:
kafka-topics --bootstrap-server localhost:9092 --create --topic readings

Navigate into the application's source code directory:
cd schema-registry-js/src

The application consists of three JavaScript files:
avro_producer.js - Produces Avro-formatted messages to Kafka using Schema Registry.
avro_consumer.js - Consumes Avro-formatted messages from Kafka using Schema Registry.
utils.js - Contains shared utility functions.
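Both clients are configured from .properties files, which are plain key=value lines. A minimal parser of the kind utils.js might contain can be sketched as follows; the function name is an assumption, and the real code may handle additional cases:

```javascript
// Sketch: parse a librdkafka-style .properties string into a plain object.
// Lines are key=value pairs; blank lines and '#' comments are skipped.
// This is an illustrative helper, not the repository's actual utils.js.
function parseProperties(text) {
  const config = {};
  for (const rawLine of text.split("\n")) {
    const line = rawLine.trim();
    if (line === "" || line.startsWith("#")) continue;
    const eq = line.indexOf("=");
    if (eq === -1) continue; // ignore malformed lines
    config[line.slice(0, eq).trim()] = line.slice(eq + 1).trim();
  }
  return config;
}

const sample = [
  "# Kafka connection settings",
  "bootstrap.servers=localhost:9092",
  "security.protocol=PLAINTEXT",
].join("\n");
console.log(parseProperties(sample));
```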
Install the @confluentinc/kafka-javascript and @confluentinc/schemaregistry dependencies by running the following command in the schema-registry-js/src directory:
npm install

Run the producer application, passing the Kafka and Schema Registry client configuration files for connecting to Kafka and Schema Registry running in Docker:
node avro_producer.js \
--kafka-properties-file ../config/local-kafka.properties \
--sr-properties-file ../config/local-sr.properties

You will see the following message, along with logging as the producer connects and disconnects:
Produced 10 readings

Now run the consumer application, passing in the same configuration used when running the producer:
node avro_consumer.js \
--kafka-properties-file ../config/local-kafka.properties \
--sr-properties-file ../config/local-sr.properties

You will see message values like the following, along with logging when the client connects and disconnects:
TempReading { deviceId: '2', temperature: 78.34902954101562 }
TempReading { deviceId: '3', temperature: 91.36271667480469 }
TempReading { deviceId: '4', temperature: 73.98355865478516 }
TempReading { deviceId: '3', temperature: 54.87724685668945 }
TempReading { deviceId: '2', temperature: 83.80644989013672 }
TempReading { deviceId: '2', temperature: 52.60075378417969 }
TempReading { deviceId: '2', temperature: 95.52684783935547 }
TempReading { deviceId: '2', temperature: 75.393798828125 }
TempReading { deviceId: '1', temperature: 79.79203796386719 }
TempReading { deviceId: '4', temperature: 96.6504135131836 }

From your local machine, stop the broker and Schema Registry containers:
docker compose -f ./docker/docker-compose-kafka-sr.yml down