course: Spring Framework and Apache Kafka®

Confluent Cloud Schema Registry and Spring Boot

5 min
Viktor Gamov

Developer Advocate (Presenter)

Confluent Cloud Schema Registry is fully managed and easy to use: when you create a Confluent Cloud cluster, you can enable Schema Registry, and the only thing you need to specify is where to create it. It supports the Avro, Protobuf, and JSON Schema formats for producers, consumers, and Kafka Streams, and a single application can use multiple schemas. Confluent provides serializer and deserializer implementations for all three formats, but you do need to add the libraries to your classpath.
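For instance, an Avro schema is just a JSON document describing a record. A minimal sketch (the record name, namespace, and field here are only illustrative):

{
  "type": "record",
  "name": "Hobbit",
  "namespace": "io.confluent.developer.avro",
  "fields": [
    {"name": "quote", "type": "string"}
  ]
}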

To configure a Spring Boot application for use with Confluent Cloud Schema Registry, you need to adjust application.properties:

# Confluent Cloud Schema Registry
spring.kafka.properties.basic.auth.credentials.source=USER_INFO
spring.kafka.properties.basic.auth.user.info={{ SR_API_KEY }}:{{ SR_API_SECRET }}
spring.kafka.properties.schema.registry.url={{ SR_URL }}

spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.IntegerSerializer
spring.kafka.producer.value-serializer=io.confluent.kafka.serializers.KafkaAvroSerializer

spring.kafka.consumer.value-deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.IntegerDeserializer
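With that configuration in place, the usual Spring for Apache Kafka programming model stays the same; only the serialization changes. Here is a minimal sketch, assuming a Hobbit class generated from an Avro schema (the class, topic, and group names are illustrative, not part of this course's code):

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class HobbitQuotes {

    private final KafkaTemplate<Integer, Hobbit> template;

    public HobbitQuotes(KafkaTemplate<Integer, Hobbit> template) {
        this.template = template;
    }

    public void send(int key, Hobbit hobbit) {
        // KafkaAvroSerializer registers the schema with Schema Registry
        // on first use and embeds only the schema ID in each record
        template.send("hobbits", key, hobbit);
    }

    @KafkaListener(topics = "hobbits", groupId = "hobbit-listener")
    public void listen(Hobbit hobbit) {
        // KafkaAvroDeserializer fetches the writer's schema by its ID
        System.out.println("received = " + hobbit);
    }
}

Note that to receive the generated class rather than an Avro GenericRecord, the consumer also needs spring.kafka.consumer.properties.specific.avro.reader=true.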

You’ll need a Schema Registry API key and secret, plus your Schema Registry URL, all from Confluent Cloud, and you’ll need to specify the serializers and deserializers that you are using. Also, as you can see, some of them come from the io.confluent.kafka package, so you will need to add Confluent’s Maven repository and the corresponding dependencies to your application. You can do so in build.gradle:

repositories {
    mavenCentral()
    maven {
        url "https://packages.confluent.io/maven"
    }
}

Then add dependencies for the serializers/deserializers that you are using, for example:

implementation 'org.apache.avro:avro:1.10.2'
implementation 'io.confluent:kafka-avro-serializer:6.1.0'
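If you use Protobuf or JSON Schema instead, swap in the matching Confluent artifacts (shown here with the same version as the Avro example; adjust the version to match your setup):

implementation 'io.confluent:kafka-protobuf-serializer:6.1.0'
implementation 'io.confluent:kafka-json-schema-serializer:6.1.0'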

Use the promo code SPRING101 & CONFLUENTDEV1 to get $25 of free Confluent Cloud usage and skip credit card entry.


Confluent Cloud Schema Registry and Spring Boot

Hi, I'm Viktor Gamov of Confluent, and in this module I'm going to talk about how to use Confluent Cloud Schema Registry in your Spring Boot application. Let's get to it.

In this section we're going to talk about fully managed Schema Registry. If you haven't heard about Schema Registry before, you can find videos on developer.confluent.io where you can learn more about Schema Registry, about schemas, and how it all works. This module focuses solely on the things that are relevant to fully managed Schema Registry.

As you would expect from a fully managed service, it just works. When you create a cluster, you have the ability to say, "I want to create a Schema Registry," and the only thing you need to do is select where you want to create it. We will walk through how to do this in the exercise section, so afterwards you'll be able to do it yourself.

Fully managed Schema Registry supports Avro, Protobuf, and JSON Schema, meaning that you can define your schemas in those formats, and that information will be stored inside Schema Registry. And in one application you can actually use multiple schemas; if you're interested in doing this, there's a Kafka tutorial that shows how you can change serialization formats in your application. Confluent provides implementations of serializers and deserializers for all of these formats, for Avro, for Protobuf, and for JSON Schema, so you need to put these libraries on the classpath.

Next, let's take a look at what it takes to configure our Spring Boot application. In our Spring Boot application we need to provide some generic properties, and when the Schema Registry serdes are instantiated, those properties will be injected. These properties are important because with fully managed Schema Registry you need an API key and a secret that allow you to connect. So three things matter here: the Schema Registry URL, the username and password, and some additional parameters that tell the REST client how to perform authentication.

To use these serializers in your application code, you need to tell Spring how your values will be serialized if it's a producer, or how the values will be deserialized if it's a consumer. As you can see here, they come from the io.confluent.kafka package, so they come from an extra JAR. How do you get this JAR, you may ask? Essentially, you need to define an extra repository in your build file, which is where Confluent publishes all of these libraries. For Avro, for example, you need to add the Kafka Avro serializer; if you're using Protobuf, you need the Kafka Protobuf serializer; and if you're using JSON Schema, you need the serializer for JSON Schema. After that, you should be able to use all of this configuration in your application.

Next, I will walk you through the process of enabling Avro-based applications. We're going to modify our existing producer and consumer to produce messages serialized in Avro's binary format, and to consume that data. So don't hesitate, and jump right into the exercise section.
Good luck.