course: Spring Framework and Apache Kafka®

Confluent Cloud Schema Registry and Spring Boot

5 min
Viktor Gamov

Developer Advocate (Presenter)


Confluent Cloud Schema Registry is fully managed and easy to use: when you create a Confluent Cloud cluster, you can enable Schema Registry alongside it, and the only thing you need to specify is where it should be created. It supports the Avro, Protobuf, and JSON Schema formats for producers, consumers, and Kafka Streams, and a single application can use multiple schemas. Confluent provides serializer and deserializer implementations for all of these formats, but you do need to put the corresponding libraries on your classpath.
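As an illustration of what gets registered, here is a minimal Avro schema for a hypothetical `Sensor` record (the record name and fields are examples, not part of the course):

```json
{
  "type": "record",
  "name": "Sensor",
  "namespace": "io.example",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "reading", "type": "double"}
  ]
}
```

When a producer sends the first record of this type, the serializer registers the schema with Schema Registry and consumers fetch it by ID to deserialize.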

To configure a Spring Boot application for use with Confluent Cloud Schema Registry, you need to adjust your application.properties, supplying the Schema Registry endpoint and credentials (shown here with template placeholders):

# Confluent Cloud Schema Registry
spring.kafka.properties.schema.registry.url={{ SR_URL }}
spring.kafka.properties.basic.auth.credentials.source=USER_INFO
spring.kafka.properties.basic.auth.user.info={{ SR_API_KEY }}:{{ SR_API_SECRET }}
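Since you also have to specify the serializers your application uses, a minimal sketch of those settings for Avro might look like this (the class names are the standard Kafka and Confluent ones; whether you need `specific.avro.reader` depends on whether you consume generated classes):

```properties
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
spring.kafka.properties.specific.avro.reader=true
```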



You’ll need a Schema Registry API key and secret, plus your Confluent Cloud Schema Registry URL, all from Confluent Cloud, and you’ll need to specify the various serializers that you are using. As you can see, some of the serializers/deserializers come from the io.confluent.kafka package, so you will need to add the corresponding JARs to your application to use them. You can do so in build.gradle, starting with the Confluent Maven repository:

repositories {
    maven {
        url "https://packages.confluent.io/maven/"
    }
}

Then you need to add implementation dependencies for the serializers/deserializers that you are using, for example:

implementation 'org.apache.avro:avro:1.10.2'
implementation 'io.confluent:kafka-avro-serializer:6.1.0'
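To see what the Avro dependency above provides on its own, independent of Kafka and Spring, here is a sketch of a binary serialize/deserialize round trip using only the org.apache.avro artifact; the `Sensor` schema and field values are hypothetical examples:

```java
import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class AvroRoundTrip {

    // Hypothetical schema for illustration; not part of the course material.
    private static final String SCHEMA_JSON =
        "{\"type\":\"record\",\"name\":\"Sensor\",\"fields\":["
        + "{\"name\":\"id\",\"type\":\"string\"},"
        + "{\"name\":\"reading\",\"type\":\"double\"}]}";

    public static GenericRecord roundTrip() throws Exception {
        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);

        // Build a record conforming to the schema.
        GenericRecord record = new GenericData.Record(schema);
        record.put("id", "sensor-1");
        record.put("reading", 42.0);

        // Serialize to Avro binary -- the same payload encoding that
        // KafkaAvroSerializer writes after its schema-ID header.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(record, encoder);
        encoder.flush();

        // Deserialize back with the same schema.
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
        return new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip());
    }
}
```

In a real application the Confluent serializers handle this encoding for you, along with schema registration and lookup against Schema Registry.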

