Note: This exercise is part of a larger course. You are expected to have completed the previous exercises.
This exercise uses the Java Faker library to produce messages from your Java application to an Apache Kafka topic on Confluent Cloud. Note that this builds on the Confluent Cloud/Spring Boot installation you did in the Introduction to Spring Boot for Confluent Cloud exercise. You can see the code for modules 1–10 in a combined GitHub repo, and you can also refer there for a list of imports as well as a sample build.gradle file.
Go to build.gradle in your Java application and find the dependencies block. Add Java Faker (you can find its installation string on GitHub):
implementation 'com.github.javafaker:javafaker:1.0.2'
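With that line added, the dependencies block might look roughly like the following; apart from the javafaker entry, the items shown are assumptions based on the earlier Spring Boot for Confluent Cloud exercise, so keep whatever you already have:

dependencies {
    // Existing entries from the earlier exercise (illustrative; yours may differ)
    implementation 'org.springframework.boot:spring-boot-starter-webflux'
    implementation 'org.springframework.kafka:spring-kafka'
    compileOnly 'org.projectlombok:lombok'
    annotationProcessor 'org.projectlombok:lombok'

    // New for this exercise
    implementation 'com.github.javafaker:javafaker:1.0.2'
}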
Next, begin to set up Spring Boot to send messages to Kafka. Open SpringCcloudApplication.java and create a Producer class with a Lombok annotation for dependency injection:
@RequiredArgsConstructor
class Producer {
    private final KafkaTemplate<Integer, String> template;
}
(With respect to dependency injection, it’s good to use constructor dependency injection or property dependency injection, but not field dependency injection.)
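For reference, Lombok's @RequiredArgsConstructor generates a constructor covering every final (and @NonNull) field, so the class above is equivalent to this hand-written, constructor-injected version (shown only for illustration):

class Producer {

    private final KafkaTemplate<Integer, String> template;

    // This is what @RequiredArgsConstructor generates for you; Spring uses
    // this constructor to inject the KafkaTemplate bean.
    Producer(KafkaTemplate<Integer, String> template) {
        this.template = template;
    }
}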
Next, add Java Faker code to produce some messages (quotes from “The Hobbit,” incidentally), and also add Flux from Project Reactor, a reactive library, which lets you push one message per second:
@RequiredArgsConstructor
@Component
class Producer {

    private final KafkaTemplate<Integer, String> template;

    Faker faker;

    @EventListener(ApplicationStartedEvent.class)
    public void generate() {
        faker = Faker.instance();
        final Flux<Long> interval = Flux.interval(Duration.ofMillis(1_000));
        final Flux<String> quotes = Flux.fromStream(Stream.generate(() -> faker.hobbit().quote()));
        Flux.zip(interval, quotes)
            .map(it -> template.send("hobbit", faker.random().nextInt(42), it.getT2()))
            .blockLast();
    }
}
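For reference, the imports this class needs are listed below (the combined GitHub repo mentioned above also includes them); this sketch is derived directly from the code shown:

import java.time.Duration;
import java.util.stream.Stream;

import org.springframework.boot.context.event.ApplicationStartedEvent;
import org.springframework.context.event.EventListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

import com.github.javafaker.Faker;
import lombok.RequiredArgsConstructor;
import reactor.core.publisher.Flux;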
Note that Flux.zip lets you combine the two Flux parameters and enables you to send your messages to the topic hobbit on Confluent Cloud. The call faker.random().nextInt(42) generates your keys, and the @EventListener annotation from Spring runs the generate() method once the application has started.
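If Flux.zip is unfamiliar, here is a minimal standalone sketch (not part of the exercise; the class name and values are made up for illustration) showing how zip pairs elements from two publishers and why it.getT2() above is the quote:

import java.time.Duration;
import reactor.core.publisher.Flux;

public class ZipDemo {
    public static void main(String[] args) {
        // interval() emits 0, 1, 2, ... once per second
        Flux<Long> ticks = Flux.interval(Duration.ofMillis(1_000));
        Flux<String> words = Flux.just("In", "a", "hole", "in", "the", "ground");

        // zip waits for one element from each source, so the words are throttled
        // to one per second; each emission is a Tuple2 where getT1() is the tick
        // and getT2() is the word.
        Flux.zip(ticks, words)
            .map(tuple -> tuple.getT1() + " -> " + tuple.getT2())
            .doOnNext(System.out::println)
            .blockLast(); // block so the JVM doesn't exit before the stream completes
    }
}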
Now you need to tell the producer which serialization method to use for keys and values by specifying it in application.properties:
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.IntegerSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
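These two lines sit alongside the Confluent Cloud connection settings you added in the earlier exercise; a sketch of the producer-related portion of the file might look like the following, with placeholder values you must replace with your own cluster address and API key/secret (the serializers match the KafkaTemplate<Integer, String> key and value types used above):

# Connection settings from the earlier exercise (placeholders shown)
spring.kafka.bootstrap-servers=<your-bootstrap-server>:9092
spring.kafka.properties.security.protocol=SASL_SSL
spring.kafka.properties.sasl.mechanism=PLAIN
spring.kafka.properties.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='<api-key>' password='<api-secret>';

# Serializers for this exercise
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.IntegerSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer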
Finally, return to Confluent Cloud and create a new topic named hobbit, using Create with defaults.