
Kafka Topics & Producers FAQs

Frequently asked questions and answers about Kafka topics and partitions, and how records, logs, and data are stored in Kafka.

What is a Kafka topic?

A Kafka topic describes how messages are organized and stored. Topics are defined by developers and often model entities and event types. You can store more than one event type in a topic if appropriate for the implementation.

Kafka topics can broadly be thought of in the same way as tables in a relational database, which are used to model and store data. Some examples of Kafka topics would be:

  • orders
  • website_clicks
  • network_events
  • customers

Topics can be partitioned, and partitions are spread across the available Kafka brokers.
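As a sketch, you can see how a topic's partitions are spread across brokers with the `--describe` option of the same `kafka-topics` tool used elsewhere on this page (this assumes a broker on localhost:9092 and an existing topic named `orders`):

```shell
# Show each partition of the topic, its leader broker, replicas, and in-sync replicas
# (assumes a local broker on localhost:9092 and an existing topic "orders")
./bin/kafka-topics --bootstrap-server localhost:9092 --describe --topic orders
```

Each line of output corresponds to one partition, with the broker IDs that host it.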

How do I read data from a Kafka topic?

To read data from a Kafka topic in your application, use the Consumer API provided by one of the client libraries (for example Java, C/C++, C#, Python, Go, Node.js, or Spring Boot).

You can also read data from a Kafka topic using a command-line interface (CLI) tool such as kcat (formerly known as kafkacat) or kafka-console-consumer.
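For example, a minimal sketch using the `kafka-console-consumer` tool that ships with Kafka (the broker address and topic name `my-topic` are placeholders):

```shell
# Read all messages in "my-topic" from the beginning and print them to stdout
# (assumes a local broker on localhost:9092)
./bin/kafka-console-consumer --bootstrap-server localhost:9092 \
                             --topic my-topic --from-beginning
```

Without `--from-beginning`, the consumer only prints messages produced after it starts.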

Confluent also provides a web interface for browsing messages in a Kafka topic, available on-premises and on Confluent Cloud.

How do I list Kafka topics?

To list Kafka topics, use the kafka-topics command-line tool:

./bin/kafka-topics --bootstrap-server localhost:9092 --list

With Confluent, you can also view a list of topics in your web browser.

How many partitions should I put in a Kafka topic?

Many Kafka users have settled on between 12 and 24 partitions per topic, but there really is no single answer that works for every situation.

There are a few key principles that will help you in making this decision, but ultimately, performance testing with various numbers of partitions is the safest route:

  • The topic partition is the unit of parallelism in Kafka
    • Writes to different partitions can be fully parallel
    • A partition can only be read by one consumer in a group, so a consumer group can only effectively grow to the number of partitions
  • Increasing the number of partitions can be a costly exercise, for two reasons:
    • New partitions will remain empty until data is redistributed to them
    • Per-key ordering guarantees are broken across the change, because records with the same key may be assigned to a different partition
  • Higher partition counts will require more resources on brokers and clients
    • Partitions are backed by index and data files, so more partitions means more open file handles
    • Producers will buffer records for each partition before sending them, so more partitions will require more memory for those buffers
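One way to run the performance tests mentioned above is the `kafka-producer-perf-test` tool that ships with Kafka. As a sketch (the topic name and record counts are placeholders to adjust for your workload):

```shell
# Produce 1M 100-byte records as fast as possible (-1 = no throttle) against a
# candidate topic, then repeat against topics created with different partition
# counts and compare the reported throughput and latency
./bin/kafka-producer-perf-test --topic test-12-partitions \
    --num-records 1000000 --record-size 100 --throughput -1 \
    --producer-props bootstrap.servers=localhost:9092
```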

Note that KRaft removes the metadata bottleneck for clusters with a large number of partitions; however, the performance of those partitions still depends on the nodes available in the cluster.

You can read more in this blog post by Jun Rao (one of the original creators of Apache Kafka®).

How to create a topic in Kafka

You can create a Kafka topic with the kafka-topics.sh command-line tool:


./bin/kafka-topics.sh --create --partitions 1 --replication-factor 1 \
                      --topic my-topic --bootstrap-server localhost:9092

You can also use the Confluent CLI to create a topic:

confluent kafka topic create <topic> [flags]
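For example, a hedged sketch of creating a six-partition topic (verify the exact flags against `confluent kafka topic create --help` for your CLI version):

```shell
# Create a topic named "my-topic" with 6 partitions in the current cluster context
confluent kafka topic create my-topic --partitions 6
```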

Another option is the Confluent Cloud Console, where you can simply click the Create topic button on the Topics page.

How many topics can be created in Kafka?

While there is no set limit to the number of topics that can exist in a Kafka cluster, Kafka can currently handle hundreds of thousands of topics, depending on the number of partitions in each.

With Kafka's KRaft mode, that number extends into the millions.

How to delete a Kafka topic

You can delete a Kafka topic with the kafka-topics.sh tool:

./bin/kafka-topics.sh --delete --topic my-topic \
                      --bootstrap-server localhost:9092

You can also use the Confluent CLI to delete a topic:

confluent kafka topic delete my-topic [flags]

Another option is the web-based Confluent Cloud Console, where you can click on the topic on the Topics page, then go to the Configuration tab and click Delete topic.

How do I count the number of messages in a topic?

To count the number of messages in a Kafka topic, you should consume the messages from the beginning of the topic and increment a counter.
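As a sketch of that approach using kcat (assuming a local broker, a topic named `my-topic`, and one record per line of output):

```shell
# Consume the whole topic quietly (-q), exit at the end of the last partition (-e),
# and count the lines; valid when each record prints as a single line
kcat -b localhost:9092 -t my-topic -C -e -q | wc -l
```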

This Kafka Tutorial shows some specific examples using either a command-line tool or ksqlDB.

For further discussion see this blog post.

How do I clear a Kafka topic?

To delete the contents of a Kafka topic, do the following:

  1. Change the retention time on the topic:

    ./bin/kafka-configs --bootstrap-server localhost:9092 --alter \
                        --entity-type topics --entity-name my_topic \
                        --add-config retention.ms=0
  2. Wait for the broker log manager process to run.

    If you inspect the broker logs, you'll see something like this:

    INFO [Log partition=my_topic-0, dir=/tmp/kafka-logs] 
      Found deletable segments with base offsets [0] due to 
      Retention time 0ms breach (kafka.log.Log)
  3. Restore the retention time on the topic to what it was previously, or remove it as shown here:

    ./bin/kafka-configs --bootstrap-server localhost:9092 --alter \
                        --entity-type topics --entity-name my_topic \
                        --delete-config retention.ms

A few things to be aware of when clearing a topic:

  • Are you trying to recreate a "message queue?" Then you might have the wrong model of a log—check out our free Kafka 101 course for a refresher.
  • Are you trying to clear out the topic as part of a test process? Be aware that offsets are not reset when messages are removed from a topic (new records continue from the next offset), and that record deletion is nondeterministic. Consider using ephemeral topics for testing or short-term use cases; if you need the same topic name each time, that could be a code or configuration smell.

Learn more with these free training courses

Apache Kafka® 101

Learn how Kafka works, how to use it, and how to get started.

Spring Framework and Apache Kafka®

This hands-on course will show you how to build event-driven applications with Spring Boot and Kafka Streams.

Building Data Pipelines with Apache Kafka® and Confluent

Build a scalable, streaming data pipeline in under 20 minutes using Kafka and Confluent.

Confluent Cloud is a fully managed Apache Kafka service available on all three major clouds. Try it for free today.
