Course: Schema Registry 101

Hands On: Evolve Existing Schemas

1 min
Danica Fine

Senior Developer Advocate (Presenter)

In this exercise you will evolve (update) the schemas you created earlier. You will also test them for compatibility and update your clients to work with the new schema.

  1. To get started, navigate to the Topics view in the Confluent Cloud Console.

Update the Protobuf schema first.

  1. In the proto-purchase row, click the green check mark in the Schema column.


  1. Click on the Raw schema button to expand the text view of the schema.


Now add a new field that records the ID of the employee involved in the purchase.

  1. Click on the Evolve Schema button on the right side of the Console.

This will open an Edit schema view where you can make changes.

  1. Start a new line after the customer_id field.

  2. Create a new field employee_id by entering the following:

    string employee_id = 4;

  3. Before saving the schema, test it by clicking on the Validate button.

You should see a Validation passed message.

  1. Click on the Save button on the bottom right.

After the save completes, the Console returns to the Schema view. Note that the schema version is now 2.

  1. Now click on the Raw schema button again to confirm the new field is present.
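The raw schema should now look something like the following (the fields other than employee_id, along with the package name, are assumed from the earlier exercises; yours may differ):

```protobuf
syntax = "proto3";

package io.confluent.developer.proto;

message Purchase {
  string item = 1;
  double total_cost = 2;
  string customer_id = 3;
  string employee_id = 4;  // new field added in this evolution step
}
```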

Now you need to get the updated schema into your project. While you could make the same changes locally, it’s better to download the exact schema file from Schema Registry; that way there is far less chance of introducing errors into the new schema.

  1. Open the build.gradle file and locate the DOWNLOAD block.


Notice the second line in the block contains information used to download the file. The first entry is the subject name of the schema to download. The second entry is where to put the downloaded file on your local file system. The third entry is the name of the file. Let’s download the new schema version now.
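For reference, a download block of that shape might look like the following sketch. The exact DSL depends on the Schema Registry Gradle plugin your project uses, so the syntax here is an assumption; the three entries mirror the ones described above:

```groovy
schemaRegistry {
    download {
        // subject name, local output directory, and output file name
        subject('proto-purchase-value', 'src/main/proto', 'purchase')
    }
}
```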

  1. In a terminal window at the root of the schema-registry project, run the command ./gradlew downloadSchemasTask.

You should see a BUILD SUCCESSFUL message in the console.

  1. Open the purchase.proto file and verify it was updated and includes the new field.

Now let’s make the same update to the Avro schema, but this time we’ll edit it locally and then register the updated schema with Schema Registry.

  1. Open up the Avro schema file purchase.avsc.

  2. To add the new employee_id field, first add a , after the end of the customer_id field definition, then add the following line after it:

        {"name": "employee_id", "type": "string"}


Now verify the new schema version is compatible with the previous version.

  1. In a terminal window at the root of the schema-registry project, run the command ./gradlew testSchemasTask.

You should see a BUILD FAILED message. The subject uses backward compatibility, which only permits adding new fields that have default values, and you didn’t provide one for the employee_id field. Let’s do so now.
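The rule behind this failure can be sketched in a few lines of Python. This is a simplified illustration of one backward-compatibility rule, not the actual check Schema Registry performs; the field names are taken from the exercise:

```python
def backward_compatible(old_schema: dict, new_schema: dict) -> bool:
    """Simplified Avro backward-compatibility check: a reader using the
    new schema must be able to decode data written with the old schema,
    so every field added in the new schema needs a default value."""
    old_fields = {f["name"] for f in old_schema["fields"]}
    return all(
        f["name"] in old_fields or "default" in f
        for f in new_schema["fields"]
    )

old = {"type": "record", "name": "Purchase", "fields": [
    {"name": "item", "type": "string"},
    {"name": "total_cost", "type": "double"},
    {"name": "customer_id", "type": "string"},
]}

# employee_id without a default fails the check...
no_default = dict(old, fields=old["fields"] + [
    {"name": "employee_id", "type": "string"}])
# ...while employee_id with a default passes.
with_default = dict(old, fields=old["fields"] + [
    {"name": "employee_id", "type": "string", "default": "unknown"}])

print(backward_compatible(old, no_default))    # False
print(backward_compatible(old, with_default))  # True
```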

  1. In purchase.avsc, update the employee_id field as follows:

        {"name": "employee_id", "type": "string", "default": "unknown"}

  2. Return to the terminal window and run ./gradlew testSchemasTask again.

This time you should see a green BUILD SUCCESSFUL message.
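Putting it all together, the evolved purchase.avsc might now look like this (the record name, namespace, and the other fields are assumed from the earlier exercises; yours may differ):

```json
{
  "type": "record",
  "namespace": "io.confluent.developer.avro",
  "name": "Purchase",
  "fields": [
    {"name": "item", "type": "string"},
    {"name": "total_cost", "type": "double"},
    {"name": "customer_id", "type": "string"},
    {"name": "employee_id", "type": "string", "default": "unknown"}
  ]
}
```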

Let’s go ahead and register the updated schema from the command line using Gradle.

  1. Run command ./gradlew registerSchemasTask.

When the command completes, you should see a BUILD SUCCESSFUL message.

Now let’s confirm the Avro schema changes in the Confluent Cloud Schema Registry.

  1. Return to the Confluent Cloud Console, navigate to Schema Registry, and click the View & manage schemas button.
  2. In the Schemas view, click on the avro-purchase-value link in the Subject name column of the table.
  3. Confirm the version number is now 2.
  4. Click on the Raw schema button and confirm the new field and its default value are present.
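If you prefer to verify from a script instead of the Console, Schema Registry’s REST API exposes the latest registered version of a subject at GET /subjects/{subject}/versions/latest. A minimal sketch, assuming a hypothetical endpoint and API key (substitute your own values):

```python
import base64
import urllib.request

def latest_version_request(base_url: str, subject: str,
                           api_key: str, api_secret: str) -> urllib.request.Request:
    """Build an authenticated request for the latest registered version
    of a subject via Schema Registry's REST API."""
    url = f"{base_url}/subjects/{subject}/versions/latest"
    token = base64.b64encode(f"{api_key}:{api_secret}".encode()).decode()
    return urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})

# Hypothetical endpoint and credentials -- replace with your own.
req = latest_version_request("https://psrc-example.us-east-2.aws.confluent.cloud",
                             "avro-purchase-value", "MY_API_KEY", "MY_API_SECRET")
# body = urllib.request.urlopen(req).read()  # JSON whose "version" should now be 2
```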

Exercise Environment Teardown

After completing the course exercises, you should tear down the exercise environment to avoid unnecessarily accruing costs and exhausting your promotional credits. Let’s do so now using the Confluent CLI.

First, you will need to log in using your Confluent Cloud username and password.

  1. Run command confluent login --save and provide your credentials when prompted.

Now let’s set the context for the CLI commands that will be used for the teardown. Start by listing your environments to find the exercise environment’s ID.

  1. Run command:

    confluent environment list

Set the CLI context to the environment you used for the exercises.

  1. Run command:

    confluent environment use <exercise env ID>

Now let’s get the cluster ID for the schema-registry-101 cluster.

  1. Run command:

    confluent kafka cluster list

Now you can delete the cluster using this cluster ID.

  1. Run command:

    confluent kafka cluster delete <schema-registry-101 cluster ID>

And finally, you can delete the Schema Registry cluster associated with the Confluent Cloud environment that was used for the exercise.


Do not delete the Schema Registry cluster unless you are sure it is not being used for anything other than this course.

  1. Run command:

    confluent schema-registry cluster delete

This concludes the course.
