
How to filter messages in a Kafka topic with Kafka Streams

Consider a topic of events from which you want to filter out any records that don't match a given attribute.

 builder.stream(INPUT_TOPIC, Consumed.with(Serdes.String(), publicationSerde))
        .filter((name, publication) -> "George R. R. Martin".equals(publication.name()))
        .to(OUTPUT_TOPIC, Produced.with(Serdes.String(), publicationSerde));

To keep only the records in an event stream that match a given predicate (applied to the key and value), use KStream.filter.
To retain the records that do not match the predicate, use KStream.filterNot.
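
The two operators take the same kind of (key, value) predicate; only the polarity differs. The semantics can be illustrated with plain Java collections (a standalone sketch using a hypothetical Publication record, not the Kafka Streams API itself):

```java
import java.util.List;
import java.util.Map;
import java.util.function.BiPredicate;

public class FilterSemantics {
    // Hypothetical value type standing in for the tutorial's publication events.
    record Publication(String name, String title) {}

    public static void main(String[] args) {
        List<Map.Entry<String, Publication>> records = List.of(
            Map.entry("k1", new Publication("George R. R. Martin", "A Song of Ice and Fire")),
            Map.entry("k2", new Publication("C.S. Lewis", "The Silver Chair")),
            Map.entry("k3", new Publication("George R. R. Martin", "Fire & Blood")));

        // The same (key, value) predicate can drive either operator.
        BiPredicate<String, Publication> byMartin =
            (key, publication) -> "George R. R. Martin".equals(publication.name());

        // filter keeps the records for which the predicate is true...
        List<Publication> kept = records.stream()
            .filter(e -> byMartin.test(e.getKey(), e.getValue()))
            .map(Map.Entry::getValue)
            .toList();

        // ...while filterNot keeps the records for which it is false.
        List<Publication> rest = records.stream()
            .filter(e -> !byMartin.test(e.getKey(), e.getValue()))
            .map(Map.Entry::getValue)
            .toList();

        System.out.println(kept.size() + " kept, " + rest.size() + " filtered out");
        // prints: 2 kept, 1 filtered out
    }
}
```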

The following steps use Confluent Cloud. To run the tutorial locally with Docker, skip to the Docker instructions section at the bottom.

Prerequisites

  • A Confluent Cloud account
  • The Confluent CLI installed on your machine
  • Apache Kafka or Confluent Platform (both include the Kafka Streams application reset tool)
  • Clone the confluentinc/tutorials repository and navigate into its top-level directory:
    git clone git@github.com:confluentinc/tutorials.git
    cd tutorials

Create Confluent Cloud resources

Log in to your Confluent Cloud account:

confluent login --prompt --save

Install a CLI plugin that will streamline the creation of resources in Confluent Cloud:

confluent plugin install confluent-quickstart

Run the plugin from the top-level directory of the tutorials repository to create the Confluent Cloud resources needed for this tutorial. Note that you may specify a different cloud provider (gcp or azure) or region. You can find supported regions in a given cloud provider by running confluent kafka region list --cloud <CLOUD>.

confluent quickstart \
  --environment-name kafka-streams-filtering-env \
  --kafka-cluster-name kafka-streams-filtering-cluster \
  --create-kafka-key \
  --kafka-java-properties-file ./filtering/kstreams/src/main/resources/cloud.properties
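
The --kafka-java-properties-file flag writes a Java client configuration file for the new cluster. Its exact contents depend on your account, but it generally looks something like the following (placeholder values, for illustration only):

```properties
# Generated by the confluent-quickstart plugin (all values are placeholders)
bootstrap.servers=pkc-xxxxx.us-east-1.aws.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username='<API_KEY>' password='<API_SECRET>';
```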

The plugin should complete in under a minute.

Create topics

Create the input and output topics for the application:

confluent kafka topic create filtering-input
confluent kafka topic create filtering-output

Start a console producer:

confluent kafka topic produce filtering-input

Enter a few JSON-formatted books:

{"name":"George R. R. Martin", "title":"A Song of Ice and Fire"}
{"name":"C.S. Lewis", "title":"The Silver Chair"}
{"name":"C.S. Lewis", "title":"Perelandra"}
{"name":"George R. R. Martin", "title":"Fire & Blood"}
{"name":"J. R. R. Tolkien", "title":"The Hobbit"}

Enter Ctrl+C to exit the console producer.

Compile and run the application

Compile the application from the top-level tutorials repository directory:

./gradlew filtering:kstreams:shadowJar

Navigate into the application's home directory:

cd filtering/kstreams

Run the application, passing the Kafka client configuration file generated when you created Confluent Cloud resources:

java -cp ./build/libs/kstreams-filter-standalone.jar \
    io.confluent.developer.FilterEvents \
    ./src/main/resources/cloud.properties
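
The last argument is the path to the client configuration file; a Kafka Streams application typically reads such a file into a java.util.Properties object before building the topology, roughly like this (a stdlib-only sketch; the tutorial's actual FilterEvents code may differ):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class LoadConfigSketch {
    // Parse Kafka client configuration from properties-format text.
    // In the real application, the text would come from the file path in args[0].
    static Properties parse(String text) throws IOException {
        Properties props = new Properties();
        props.load(new StringReader(text));
        return props;
    }

    public static void main(String[] args) throws IOException {
        Properties props = parse("bootstrap.servers=localhost:9092\n");
        System.out.println(props.getProperty("bootstrap.servers"));
        // prints: localhost:9092
    }
}
```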

Validate that you see only the books by George R. R. Martin in the filtering-output topic:

confluent kafka topic consume filtering-output -b

You should see:

{"name":"George R. R. Martin","title":"A Song of Ice and Fire"}
{"name":"George R. R. Martin","title":"Fire & Blood"}

Clean up

When you are finished, delete the kafka-streams-filtering-env environment. First, find its environment ID (of the form env-123456):

confluent environment list

Delete the environment, including all resources created for this tutorial:

confluent environment delete <ENVIRONMENT ID>

Docker instructions

Prerequisites

  • Docker running via Docker Desktop or Docker Engine
  • Docker Compose. Ensure that the command docker compose version succeeds.
  • Clone the confluentinc/tutorials repository and navigate into its top-level directory:
    git clone git@github.com:confluentinc/tutorials.git
    cd tutorials

Start Kafka in Docker

Start Kafka with the following command run from the top-level tutorials repository directory:

docker compose -f ./docker/docker-compose-kafka.yml up -d

Create topics

Open a shell in the broker container:

docker exec -it broker /bin/bash

Create the input and output topics for the application:

kafka-topics --bootstrap-server localhost:9092 --create --topic filtering-input
kafka-topics --bootstrap-server localhost:9092 --create --topic filtering-output

Start a console producer:

kafka-console-producer --bootstrap-server localhost:9092 --topic filtering-input

Enter a few JSON-formatted books:

{"name":"George R. R. Martin", "title":"A Song of Ice and Fire"}
{"name":"C.S. Lewis", "title":"The Silver Chair"}
{"name":"C.S. Lewis", "title":"Perelandra"}
{"name":"George R. R. Martin", "title":"Fire & Blood"}
{"name":"J. R. R. Tolkien", "title":"The Hobbit"}

Enter Ctrl+C to exit the console producer.

Compile and run the application

On your local machine, compile the app:

./gradlew filtering:kstreams:shadowJar

Navigate into the application's home directory:

cd filtering/kstreams

Run the application, passing the local.properties Kafka client configuration file that points to the broker's bootstrap servers endpoint at localhost:9092:

java -cp ./build/libs/kstreams-filter-standalone.jar \
    io.confluent.developer.FilterEvents \
    ./src/main/resources/local.properties

Validate that you see only the books by George R. R. Martin in the filtering-output topic. In the broker container shell:

kafka-console-consumer --bootstrap-server localhost:9092 --topic filtering-output --from-beginning

You should see:

{"name":"George R. R. Martin","title":"A Song of Ice and Fire"}
{"name":"George R. R. Martin","title":"Fire & Blood"}

Clean up

From your local machine, stop the broker container:

docker compose -f ./docker/docker-compose-kafka.yml down

Do you have questions or comments? Join us in the #confluent-developer community Slack channel to engage in discussions with the creators of this content.