Spring for Apache Kafka's KafkaTemplate is a thin wrapper around a Kafka producer that plays nicely with other Spring features, like dependency injection and automatic configuration. It provides a number of convenience methods for producing to Kafka topics. You can define a default topic in your configuration and always send messages there, or you can send them to a particular topic or partition, with a particular key, and so on. KafkaTemplate is a rather easy-to-use API, particularly if you already know how to use the Kafka producer API (see this Kafka Tutorial if you need a refresher on Kafka producers).
To create a new instance of KafkaTemplate, you first need to create a ProducerFactory, which is responsible for instantiating the underlying Kafka producer. Because the Kafka producer is thread-safe, the factory can hand out a single, shared instance. A ProducerFactory requires a configuration, which is a simple map if you are using code-based configuration (you can also rely on Spring Boot's automatic configuration based on property files).
@Bean
public ProducerFactory<String, String> producerFactory() {
    // Configuration for the underlying KafkaProducer; the keys are constants from ProducerConfig
    return new DefaultKafkaProducerFactory<>(
        Map.of(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092",
               ProducerConfig.RETRIES_CONFIG, 0,
               ProducerConfig.BUFFER_MEMORY_CONFIG, 33554432,
               ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class,
               ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class));
}
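The automatic configuration mentioned above can supply the same settings from a property file instead of code. As a minimal sketch (the property keys come from Spring Boot's spring.kafka.* namespace; the values simply mirror the map above), an application.properties might look like:

spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.producer.retries=0
spring.kafka.producer.buffer-memory=33554432
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer

With these properties in place, Spring Boot auto-configures a ProducerFactory and a KafkaTemplate for you, so the explicit beans shown in this section are only needed when you prefer code-based control.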
Once you have your ProducerFactory bean, you can create a KafkaTemplate.
@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}
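To illustrate the convenience methods described earlier, here is a brief sketch of a service that injects the template and sends a message a few different ways. The OrderService class and the "orders" topic are hypothetical names for this example; sendDefault() assumes a default topic has been configured, for instance via kafkaTemplate.setDefaultTopic("orders").

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class OrderService {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public OrderService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void publish(String orderId, String payload) {
        // Send to the template's default topic
        kafkaTemplate.sendDefault(orderId, payload);

        // Send to an explicit topic, using orderId as the record key
        kafkaTemplate.send("orders", orderId, payload);

        // Send to a specific partition of a topic
        kafkaTemplate.send("orders", 0, orderId, payload);
    }
}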
Kafka is an asynchronous system by default, so the producer's send method always returns a Future. Spring provides its own facility for handling asynchronous communication, so the send methods on KafkaTemplate return a ListenableFuture. You can register a callback to inspect the result when it arrives, or call ListenableFuture.get() to block and inspect the result immediately.
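Here is a brief sketch of both options, assuming Spring for Apache Kafka 2.x, where send() returns a ListenableFuture (in 3.x the template returns a CompletableFuture instead); the "orders" topic and the sendAndConfirm method name are only for illustration.

import java.util.concurrent.TimeUnit;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;
import org.springframework.util.concurrent.ListenableFuture;

public void sendAndConfirm(KafkaTemplate<String, String> kafkaTemplate) throws Exception {
    ListenableFuture<SendResult<String, String>> future =
            kafkaTemplate.send("orders", "order-1", "payload");

    // Non-blocking: register callbacks that run when the send succeeds or fails
    future.addCallback(
            result -> System.out.println("Sent to " + result.getRecordMetadata()),
            ex -> System.err.println("Send failed: " + ex.getMessage()));

    // Blocking alternative: wait for the broker acknowledgment
    SendResult<String, String> sendResult = future.get(10, TimeUnit.SECONDS);
    System.out.println("Written at offset " + sendResult.getRecordMetadata().offset());
}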
Hi, this is Victor Gamov from Confluent, and welcome back to this Spring Kafka for Confluent Cloud course. In this module, you will learn how the KafkaTemplate API works and what you need to know in order to instantiate it and start sending messages to a Kafka topic. Let's get to it.

KafkaTemplate is a class that the Spring Kafka project provides to do one specific thing: send a message to a Kafka topic. Essentially, it's a very thin wrapper around the Kafka producer. This wrapper exists in order to play nicely with some of the other technologies that Spring has, including dependency injection and automatic configuration. So think of it as a kind of wrapper or adapter, so this component can be used in different applications. It provides a variety of convenience methods that allow you to send a message to a Kafka topic. You can define a default topic in your configuration and always use the method that simply sends the message, or you can use a method that sends a message to a particular topic, to a particular partition, with a particular key, and so forth and so on. So it is really not a very difficult API to use, especially if you already know how to use the Kafka producer. And if you need a refresher on how the producer works and which Java APIs you need to use, you can always find those videos at developer.confluent.io or on the Confluent YouTube channel.

In order to create a new instance of our KafkaTemplate, first of all, we need to create a ProducerFactory. Essentially, the ProducerFactory is responsible for instantiating the underlying Kafka producer, and since the Kafka producer is thread safe, this factory can generate a singleton. One of the things the ProducerFactory requires is for you to provide configuration. And as you can see here, it's just a simple map that you put in place. In this particular case, we are using code-based configuration. However, Spring Boot also provides the ability to use automatic configuration based on property files. You will see how you can do that in the next session, where I will walk you through the code and how it works.

So once you have this bean of type ProducerFactory, it can be used to create a KafkaTemplate. That's pretty much all you need to do in your application: provide a configuration for your application and create a DefaultKafkaProducerFactory, or whatever ProducerFactory you like. ProducerFactory is an interface, and DefaultKafkaProducerFactory is an implementation of this interface. After that, we're going to use it to create the KafkaTemplate.

I mentioned earlier that the KafkaTemplate provides multiple different methods that allow you to focus on a particular use case. Your application configuration can define a default topic; in that case you just use the send method that takes only a value. Or you can define a topic for your message, and so forth and so on. Kafka is an asynchronous system by default, and the send method always returns a future that allows you to introspect the result at some point in the future. Spring provides its own facility for handling this asynchronous communication, so the methods of KafkaTemplate return a ListenableFuture. You can also define a callback so you can introspect the results, or you can always call ListenableFuture.get() so you get the result immediately.
All right, so that's it for the KafkaTemplate. In the next module, you will learn how to use this in practice and follow me through a walkthrough where I'll show you how to send messages to a Kafka topic, with your Spring Boot application running on your local computer and your Kafka cluster running in Confluent Cloud. For now, I'm Victor Gamov, and as always, have a nice day.