Senior Developer Advocate (Presenter)
In this exercise you will evolve (update) the schemas you created earlier. You will also test them for compatibility and update your clients to work with the new schema.
Update the Protobuf schema first.
Now add a new field that records the ID of the employee involved in the purchase. Open the schema for editing in the Confluent Cloud Console; this opens an Edit schema view where you can make changes.
Start a new line after the customer_id field.
Create a new field employee_id by entering the following:
string employee_id = 4;
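For orientation, the updated message might look roughly like this after the change. Only customer_id and the new employee_id appear in this exercise, so the other field names, types, and the package name shown here are assumptions:

```protobuf
syntax = "proto3";

package io.confluent.developer.proto; // placeholder package name

message Purchase {
  string item = 1;        // assumed existing field
  double total_cost = 2;  // assumed existing field
  string customer_id = 3; // existing field referenced in this exercise
  string employee_id = 4; // new field added in this step
}
```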
Before saving the schema, test it by clicking on the Validate button.
You should see a Validation passed message. Now save the schema. After the save completes, the Console returns to the Schema view. Note that the schema is now at version 2.
Now you need to get the updated schema into your project. While you could make the same changes locally, it’s better to download the exact schema file that was registered. That way there is much less chance of errors with the new schema.
Notice the second line in the block contains information used to download the file. The first entry is the subject name of the schema to download. The second entry is where to put the downloaded file on your local file system. The third entry is the name of the file. Let’s download the new schema version now.
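As a sketch, the download configuration in the Gradle build file typically looks something like this when using the imflog Kafka Schema Registry Gradle plugin (consistent with the testSchemasTask used later in this exercise). The subject name, output path, and file name shown are placeholders, and the exact syntax may vary by plugin version:

```groovy
schemaRegistry {
    download {
        // subject to download, local output directory, and output file name
        // (all three values here are placeholders)
        subject('purchase-proto-value', 'src/main/proto', 'purchase')
    }
}
```

With a block like this in place, running the plugin's downloadSchemasTask pulls the latest registered schema version into the project.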
You should see a build successful message in the console.
Now let’s make the same update to the Avro schema but this time we will update it locally and then register the updated schema in Schema Registry.
Open up the Avro schema file purchase.avsc.
To add the new employee_id field, first add a comma after the closing brace of the customer_id field’s definition, then add the following line after it:
{"name": "employee_id", "type": "string"}
Now verify that the new schema version is compatible with the previous version by running ./gradlew testSchemasTask.
You should see a BUILD FAILED message. The subject uses backward compatibility, which only permits adding new fields that include default values, and you didn’t provide one for the employee_id field. Let’s do so now.
In purchase.avsc, update the employee_id field as follows:
{"name": "employee_id", "type": "string", "default": "unknown"}
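Putting it together, purchase.avsc might now look roughly like this. The record name, namespace, and the other fields are assumptions based on context; only the customer_id and employee_id entries come from this exercise:

```json
{
  "type": "record",
  "namespace": "io.confluent.developer.avro",
  "name": "Purchase",
  "fields": [
    {"name": "item", "type": "string"},
    {"name": "total_cost", "type": "double"},
    {"name": "customer_id", "type": "string"},
    {"name": "employee_id", "type": "string", "default": "unknown"}
  ]
}
```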
Return to the terminal window and run ./gradlew testSchemasTask again.
This time you should see a green BUILD SUCCESSFUL message.
Let’s go ahead and register the updated schema from the command line using Gradle.
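If the project uses the imflog Kafka Schema Registry Gradle plugin (consistent with the testSchemasTask command above), registration is driven by a register block along these lines; the subject name and schema file path are placeholders:

```groovy
schemaRegistry {
    register {
        // subject name, local schema file, and schema type
        // (the values here are placeholders)
        subject('purchase-avro-value', 'src/main/avro/purchase.avsc', 'AVRO')
    }
}
```

Running the plugin's registerSchemasTask then registers the updated schema with Schema Registry.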
When the command completes, you should see a BUILD SUCCESSFUL message.
Now let’s confirm the Avro schema changes in the Confluent Cloud Schema Registry.
Exercise Environment Teardown
After completing the course exercises, you need to tear down the exercise environment to avoid unnecessarily accruing costs that could exhaust your promotional credits. Let’s do so now using the Confluent CLI.
First you will need to log in using your Confluent Cloud username and password.
Now let’s set the context for the CLI commands that will be used for the tear down.
Run command:
confluent environment list
Set the CLI context to the environment you used for the exercises.
Run command:
confluent environment use <exercise env ID>
Now let’s get the cluster ID for the schema-registry-101 cluster.
Run command:
confluent kafka cluster list
Now you can delete the cluster using this cluster ID.
Run command:
confluent kafka cluster delete <schema-registry-101 cluster ID>
And finally, you can delete the Schema Registry cluster associated with the Confluent Cloud environment that was used for the exercise.
Do not delete the Schema Registry cluster unless you are sure it is not being used for anything other than this course.
Run command:
confluent schema-registry cluster delete
This concludes the course.
Hi, I'm Danica Fine. In this final Schema Registry exercise, you'll see how to evolve the Protobuf and Avro schemas that you created in previous exercises. First, you'll evolve the Protobuf schema using the Confluent Cloud UI by adding a new field, validating the updated schema, and saving the new schema version. Next, you'll see how to evolve the Avro schema using Gradle. Initially, you'll add a new field without a default value, and see how the compatibility validation check fails. From there, you'll correctly add a default value for the new field so that the validation check succeeds. And finally, you'll register the new schema version using Gradle and confirm in the Confluent Cloud UI that the registration succeeded. And when you're done, take a moment to feel proud of all that you learned throughout this course and the hard work you put in. I hope you join me again next time for another course.