Course: Apache Kafka® 101

Hands On: Confluent Schema Registry

5 min
Danica Fine
Senior Developer Advocate (Presenter)


In the previous exercise, we set up a source connector to generate sample data for us according to a predefined schema and wrote that data to an Apache Kafka topic as JSON. In this hands-on exercise, we'll work through a similar workflow, this time serializing the data as Avro and leveraging Schema Registry to manage our schemas.

  1. From the Confluent Cloud Console, select Schema Registry from the lower left-hand corner.

  2. Continue by selecting Set up on my own. Then follow the prompts.


  3. Once Schema Registry has been set up, from the Schema Registry landing page, scroll down to the “API credentials” section. To use Schema Registry from the command line later, you need to configure an API key and secret. Select the edit icon.


    Select Create key and follow the prompt. Store this API key and secret for use in a later step; the brief sketch below shows where these credentials would fit in a client application's configuration.

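    If you later want to use these credentials from your own application rather than the CLI, they map onto the standard Schema Registry client settings. Here's a minimal sketch in Java; the endpoint, key, and secret placeholders are assumptions you'd replace with your own values:

    // Assumes: import java.util.Properties;
    Properties srProps = new Properties();
    // Schema Registry endpoint shown on the Schema Registry page (placeholder)
    srProps.put("schema.registry.url", "https://<SR_ENDPOINT>");
    // The API key and secret created in this step
    srProps.put("basic.auth.credentials.source", "USER_INFO");
    srProps.put("basic.auth.user.info", "<SR_API_KEY>:<SR_API_SECRET>");

    These same three settings appear again in the full consumer sketch at the end of this exercise.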

  4. Navigate to the cluster overview page. Then, under “Data integration,” select Connectors.

  5. If you have no connectors, this page will take you directly to the connector search page, so you may skip to step 6. Otherwise, this page will display your existing connectors, so you’ll need to select Add connector to get to the connector search page.


  6. Once you’re on the connector search page, from the connectors search bar, enter “Datagen” and select the Datagen Source connector.


  7. Provide a topic for the connector to produce data into; you can do this directly as part of the connector creation process. Select Add a new topic, name it orders, and select Create with defaults. Then select the new orders topic and Continue.


  8. Create a new API key for the connector to use for communicating with the Kafka cluster. Select Global Access and Generate API key & Download. Then Continue.

  9. The Datagen source connector can auto-generate a number of predefined datasets. Select orders and, this time, serialize the messages as Avro.


  10. Select Continue twice to first review the cost of the connector and then to give the connector a name and confirm the configuration choices.

  11. Click Launch to start the connector. It may take a few minutes to provision the connector.

  12. With the connector running, navigate to the Schema Registry page to take a look at the Avro schema that has been registered for the orders topic. Select View & manage schemas.


  13. From there, select the orders-value schema and view its fields; an illustrative sketch of what this schema can look like follows below.

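    For reference, the orders-value schema registered by the connector is an ordinary Avro record schema. The exact field names and types depend on the version of the Datagen orders quickstart, so treat the following as an illustrative sketch rather than the exact schema you'll see:

    {
      "type": "record",
      "name": "orders",
      "fields": [
        { "name": "ordertime",  "type": "long"   },
        { "name": "orderid",    "type": "int"    },
        { "name": "itemid",     "type": "string" },
        { "name": "orderunits", "type": "double" },
        { "name": "address", "type": {
            "type": "record",
            "name": "address",
            "fields": [
              { "name": "city",    "type": "string" },
              { "name": "state",   "type": "string" },
              { "name": "zipcode", "type": "long"   }
            ]
        }}
      ]
    }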

  14. From a terminal window, consume messages from the orders topic. If you’re curious, start by using the same consume command as in the previous exercise.

confluent kafka topic consume --from-beginning orders

You might have noticed that the data is more or less gibberish, with a few recognizable strings. Avro is a compact binary format, and each message produced with Schema Registry carries a reference to the schema that was used to write it, so a consumer needs to fetch that schema and deserialize the payload before the data is readable. What we're seeing here is binary Avro data rendered as if it were a plain string.

  15. To do this the right way, tell the consumer to fetch the Avro schema for this topic from Schema Registry and deserialize the data first.
confluent kafka topic consume --value-format avro --schema-registry-api-key {API Key} --schema-registry-api-secret {API Secret} orders

You’ll see the deserialized data being output.

Schemas are a great addition to your system if you’re looking to create a robust data pipeline and ensure data correctness across applications. Through this exercise, we saw a bit of this in action. We serialized data from a source connector as Avro, leveraged Schema Registry to store and manage that schema for us, and created a consumer that was able to consume and deserialize that data after connecting to Schema Registry.

Now you have all the tools that you need to start using Schema Registry in your own applications.
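
To make that concrete, here is a minimal sketch of a Java consumer that does the same job as the CLI command above: it uses Confluent's KafkaAvroDeserializer together with the Schema Registry credentials from earlier to read the orders records as Avro. All placeholder values, the consumer group ID, and the assumption of string keys are illustrative; replace them with your own settings.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class OrdersAvroConsumer {
  public static void main(String[] args) {
    Properties props = new Properties();

    // Kafka cluster connection: values from your Confluent Cloud cluster settings (placeholders)
    props.put("bootstrap.servers", "<BOOTSTRAP_SERVER>");
    props.put("security.protocol", "SASL_SSL");
    props.put("sasl.mechanism", "PLAIN");
    props.put("sasl.jaas.config",
        "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"<CLUSTER_API_KEY>\" password=\"<CLUSTER_API_SECRET>\";");

    // Consumer group and deserializers (string keys assumed; values are Avro)
    props.put("group.id", "orders-avro-demo");
    props.put("auto.offset.reset", "earliest");
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");

    // Schema Registry connection, using the API key and secret created earlier
    props.put("schema.registry.url", "https://<SR_ENDPOINT>");
    props.put("basic.auth.credentials.source", "USER_INFO");
    props.put("basic.auth.user.info", "<SR_API_KEY>:<SR_API_SECRET>");

    try (KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<>(props)) {
      consumer.subscribe(Collections.singletonList("orders"));
      while (true) {
        ConsumerRecords<String, GenericRecord> records = consumer.poll(Duration.ofMillis(500));
        for (ConsumerRecord<String, GenericRecord> record : records) {
          // The deserializer looks up the writer's schema in Schema Registry by the
          // schema ID embedded in each message, then returns a GenericRecord.
          System.out.println(record.value());
        }
      }
    }
  }
}

This sketch assumes the kafka-clients library and Confluent's kafka-avro-serializer package (which provides KafkaAvroDeserializer) are on the classpath.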

