Consider a situation where you want to direct the output of different records to different topics, like a "topic exchange." In this tutorial, you'll learn how to instruct Kafka Streams to choose the output topic at runtime, based on information in each record's header, key, or value.
Here's our example topology. To dynamically route records to different topics, you'll use an instance of TopicNameExtractor, which you provide to the overloaded KStream.to method:

builder.stream(INPUT_TOPIC, Consumed.with(stringSerde, orderSerde))
    .mapValues(orderProcessingSimulator)
    .to(orderTopicNameExtractor, Produced.with(stringSerde, completedOrderSerde));
Here's the TopicNameExtractor used in this example. It uses information from the value to determine which topic Kafka Streams should use for this record.
final TopicNameExtractor<String, CompletedOrder> orderTopicNameExtractor = (key, completedOrder, recordContext) -> {
    final String compositeId = completedOrder.id();
    // the SKU prefix (e.g. "QUA") follows the '-' in the composite order id
    final String skuPart = compositeId.substring(compositeId.indexOf('-') + 1, 5);
    final String outTopic;
    if (skuPart.equals("QUA")) {
        outTopic = SPECIAL_ORDER_OUTPUT_TOPIC;
    } else {
        outTopic = OUTPUT_TOPIC;
    }
    return outTopic;
};

The TopicNameExtractor interface has a single method, extract, which makes it well suited to a lambda, as shown here. But remember that a concrete class has the advantage of being directly testable.
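For example, a concrete implementation equivalent to the lambda might look like the following. This is a minimal sketch: the class name, the constructor that takes the two topic names, and the package-free layout are illustrative assumptions, not code from the tutorial repository.

import org.apache.kafka.streams.processor.RecordContext;
import org.apache.kafka.streams.processor.TopicNameExtractor;

// Hypothetical concrete equivalent of the lambda above, sketched for illustration.
public class OrderTopicNameExtractor implements TopicNameExtractor<String, CompletedOrder> {

    private final String specialOrderTopic;
    private final String defaultTopic;

    public OrderTopicNameExtractor(final String specialOrderTopic, final String defaultTopic) {
        this.specialOrderTopic = specialOrderTopic;
        this.defaultTopic = defaultTopic;
    }

    @Override
    public String extract(final String key, final CompletedOrder completedOrder, final RecordContext recordContext) {
        // Same routing logic as the lambda: inspect the SKU prefix in the composite order id.
        final String compositeId = completedOrder.id();
        final String skuPart = compositeId.substring(compositeId.indexOf('-') + 1, 5);
        return skuPart.equals("QUA") ? specialOrderTopic : defaultTopic;
    }
}

Because the topic names are plain constructor arguments, you can unit test a class like this by calling extract directly with a sample CompletedOrder and asserting on the returned topic name, with no running topology required.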
The following steps use Confluent Cloud. To run the tutorial locally with Docker, skip to the Docker instructions section at the bottom.
Clone the confluentinc/tutorials repository and navigate into its top-level directory:

git clone git@github.com:confluentinc/tutorials.git
cd tutorials

Log in to your Confluent Cloud account:
confluent login --prompt --save

Install a CLI plugin that will streamline the creation of resources in Confluent Cloud:
confluent plugin install confluent-quickstart

Run the plugin from the top-level directory of the tutorials repository to create the Confluent Cloud resources needed for this tutorial. Note that you may specify a different cloud provider (gcp or azure) or region. You can find supported regions in a given cloud provider by running confluent kafka region list --cloud <CLOUD>.
confluent quickstart \
--environment-name kafka-streams-dynamic-output-topic-env \
--kafka-cluster-name kafka-streams-dynamic-output-topic-cluster \
--create-kafka-key \
--kafka-java-properties-file ./dynamic-output-topic/kstreams/src/main/resources/cloud.properties

The plugin should complete in under a minute.
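The generated cloud.properties file holds the client configuration that the application will use to connect to your cluster. Its exact contents come from the plugin, but as a rough sketch it contains standard Confluent Cloud client properties along these lines (the endpoint and credentials below are placeholders, not real values):

# Placeholder values; the real file is generated by the confluent-quickstart plugin.
bootstrap.servers=<BOOTSTRAP_SERVER_ENDPOINT>
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='<API_KEY>' password='<API_SECRET>';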
Create the input and output topics for the application:
confluent kafka topic create dynamic-topic-input
confluent kafka topic create dynamic-topic-output
confluent kafka topic create special-order-output

Start a console producer:
confluent kafka topic produce dynamic-topic-input

Enter a few JSON-formatted orders:
{"id":6, "sku":"COF0003456", "name":"coffee", "quantity":1}
{"id":7, "sku":"QUA000022334", "name":"hand sanitizer", "quantity":2}Enter Ctrl+C to exit the console producer.
Compile the application from the top-level tutorials repository directory:
./gradlew dynamic-output-topic:kstreams:shadowJar

Navigate into the application's home directory:
cd dynamic-output-topic/kstreams

Run the application, passing the Kafka client configuration file generated when you created Confluent Cloud resources:
java -cp ./build/libs/dynamic-output-topic-standalone.jar \
io.confluent.developer.KafkaStreamsDynamicOutputTopic \
./src/main/resources/cloud.properties

Validate that you see the first order in the dynamic-topic-output topic and the second in the special-order-output topic. The second order's SKU begins with "QUA", so the TopicNameExtractor routes it to the special topic.
confluent kafka topic consume dynamic-topic-output -b

confluent kafka topic consume special-order-output -b

When you are finished, delete the kafka-streams-dynamic-output-topic-env environment. First, get its environment ID, which is of the form env-123456:
confluent environment list

Delete the environment, including all resources created for this tutorial:
confluent environment delete <ENVIRONMENT ID>

Docker instructions

To run the tutorial locally with Docker, first clone the repository:

git clone git@github.com:confluentinc/tutorials.git
cd tutorials

Start Kafka with the following command, run from the top-level tutorials repository directory:
docker compose -f ./docker/docker-compose-kafka.yml up -d

Open a shell in the broker container:
docker exec -it broker /bin/bash

Create the input and output topics for the application:
kafka-topics --bootstrap-server localhost:9092 --create --topic dynamic-topic-input
kafka-topics --bootstrap-server localhost:9092 --create --topic dynamic-topic-output
kafka-topics --bootstrap-server localhost:9092 --create --topic special-order-output

Start a console producer:
kafka-console-producer --bootstrap-server localhost:9092 --topic dynamic-topic-input

Enter a few JSON-formatted orders:
{"id":6, "sku":"COF0003456", "name":"coffee", "quantity":1}
{"id":7, "sku":"QUA000022334", "name":"hand sanitizer", "quantity":2}Enter Ctrl+C to exit the console producer.
On your local machine, compile the app:
./gradlew dynamic-output-topic:kstreams:shadowJar

Navigate into the application's home directory:
cd dynamic-output-topic/kstreams

Run the application, passing the local.properties Kafka client configuration file that points to the broker's bootstrap servers endpoint at localhost:9092:
java -cp ./build/libs/dynamic-output-topic-standalone.jar \
io.confluent.developer.KafkaStreamsDynamicOutputTopic \
./src/main/resources/local.properties

Validate that you see the first order in the dynamic-topic-output topic and the second in the special-order-output topic.
kafka-console-consumer --bootstrap-server localhost:9092 --topic dynamic-topic-output --from-beginning

kafka-console-consumer --bootstrap-server localhost:9092 --topic special-order-output --from-beginning

From your local machine, stop the broker container:
docker compose -f ./docker/docker-compose-kafka.yml down