
Apache Kafka® FAQs

Here are answers to some common questions about Apache Kafka and its surrounding ecosystem.

If you’ve got a question that isn’t answered here, please do ask the community.

How to install Kafka

Confluent Platform includes Apache Kafka. After downloading the community edition, unpack it by running this command:

tar -xf confluent-community-7.0.0.tar.gz

Confluent provides multiple ways to install and run Kafka, including Docker, ZIP, TAR, Kubernetes, and Ansible.

A great alternative to having to install and run Kafka yourself is to use the fully managed cloud service provided by Confluent Cloud.

How to run Kafka on Windows

The recommended approach for running Kafka on Windows is to run it under WSL 2 (Windows Subsystem for Linux). This blog post provides step-by-step instructions showing you how to run Kafka on Windows.

Note that while this is fine for trying out Kafka, Windows isn’t a recommended platform for running Kafka with production workloads. If you are using Windows and want to run Kafka in production, then a great option is to use Confluent Cloud along with the provided Confluent CLI (which is supported on Microsoft Windows) to interact with it.

Where is Kafka installed?

If you install Kafka using the .tar.gz package from the Kafka website, it will be in whichever folder you unpack the contents into. The same applies if you install Confluent Platform manually.

If you install Confluent Platform using the RPM or Debian packages, then you will by default find the data files in /var/lib and configuration files in /etc (e.g. /etc/kafka).

For more details on specific paths and installation methods see the documentation.
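As a rough sketch, a packaged (RPM/Debian) install of Confluent Platform typically lays files out along these lines (exact paths vary by version and distribution; treat this as an orientation, not a reference):

```
/etc/kafka/server.properties      # broker configuration
/etc/kafka/zookeeper.properties   # ZooKeeper configuration
/var/lib/kafka/                   # broker data (log segments)
/var/lib/zookeeper/               # ZooKeeper data
/usr/bin/kafka-topics             # CLI tools on the PATH
```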

How do I run Kafka on Docker?

  1. Paste the Docker Compose file below into a file called docker-compose.yml.

    version: '3'
    services:
      zookeeper:
        image: confluentinc/cp-zookeeper:7.0.0
        container_name: zookeeper
        environment:
          ZOOKEEPER_CLIENT_PORT: 2181
          ZOOKEEPER_TICK_TIME: 2000
      broker:
        image: confluentinc/cp-kafka:7.0.0
        container_name: broker
        # To learn about configuring Kafka for access across networks, see the Confluent docs.
        ports:
          - "9092:9092"
        depends_on:
          - zookeeper
        environment:
          KAFKA_BROKER_ID: 1
          KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
          # Minimal listener setup so host clients can reach localhost:9092
          KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
          KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
  2. Run docker-compose up -d.

For more details see the Docker quick start.

There are many other Docker images you can run alongside the broker including one for a Kafka Connect worker, a ksqlDB instance, a REST proxy, Schema Registry, etc. For a Confluent Platform demo that includes those services along with a greater set of features, including security, Role-Based Access Control, Kafka clients, and Confluent Replicator, see cp-demo.
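For example, a Schema Registry container can sit alongside the broker in the same Compose file. The sketch below uses the standard cp-schema-registry environment variables; the service name and port mapping are illustrative, and the bootstrap address assumes the broker advertises a listener reachable at broker:9092 inside the Compose network:

```yaml
# Add under the same top-level services: key as the broker
  schema-registry:
    image: confluentinc/cp-schema-registry:7.0.0
    container_name: schema-registry
    depends_on:
      - broker
    ports:
      - "8081:8081"
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      # Assumes a broker listener reachable from other containers
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: 'broker:9092'
      SCHEMA_REGISTRY_LISTENERS: http://0.0.0.0:8081
```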

How do I run Kafka on Kubernetes?

Confluent for Kubernetes (CFK) provides a cloud-native control plane for running Confluent Platform on Kubernetes. Through a declarative, Kubernetes-native API and Infrastructure as Code (IaC), it configures, deploys, and manages Apache Kafka®, Connect workers, ksqlDB, Schema Registry, Confluent Control Center, and resources such as topics and role bindings.

To install CFK, run the following:

helm repo add confluentinc https://packages.confluent.io/helm
helm repo update
helm upgrade --install confluent-operator confluentinc/confluent-for-kubernetes
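Once the operator is running, clusters are described declaratively as custom resources. As an illustrative sketch only (the field names follow the platform.confluent.io CRDs shipped with CFK; check the API reference for your installed version before applying anything):

```yaml
# Hypothetical minimal Kafka cluster for CFK -- verify fields
# against your CFK version's CRD reference
apiVersion: platform.confluent.io/v1beta1
kind: Kafka
metadata:
  name: kafka
  namespace: confluent
spec:
  replicas: 3
  image:
    application: confluentinc/cp-server:7.0.0
    init: confluentinc/confluent-init-container:2.2.0
  dataVolumeCapacity: 10Gi
```

You would apply such a resource with kubectl apply -f, and CFK reconciles the cluster to match it.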

For an overview of Kafka on Kubernetes, see this talk from Kafka Summit.

How do I start ZooKeeper in Confluent?

Once you have downloaded Confluent Platform, start ZooKeeper using the Confluent CLI:

confluent local services zookeeper start

Or you can start the whole platform all at once:

confluent local services start

Keep in mind that you can also launch the Apache Kafka broker in KRaft mode (which is experimental as of Confluent Platform 7.0.0), which means that it runs without ZooKeeper. See this page for more details.
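As a sketch, a single-node KRaft setup combines the broker and controller roles in one server.properties. The key names below are the standard KRaft configuration properties; the values are illustrative:

```properties
# Illustrative single-node KRaft configuration
process.roles=broker,controller
node.id=1
controller.quorum.voters=1@localhost:9093
listeners=PLAINTEXT://localhost:9092,CONTROLLER://localhost:9093
controller.listener.names=CONTROLLER
log.dirs=/tmp/kraft-combined-logs
```

Note that a KRaft storage directory must be formatted before first start, using the kafka-storage tool (generate a cluster ID with kafka-storage random-uuid, then run kafka-storage format with that ID and the config file).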

Learn more with these free training courses

Apache Kafka 101

Learn how Kafka works, how to use it, and how to get started.

Spring Framework and Apache Kafka®

This hands-on course will show you how to build event-driven applications with Spring Boot and Kafka Streams.

Building Data Pipelines with Apache Kafka® and Confluent

Build a scalable, streaming data pipeline in under 20 minutes using Kafka and Confluent.

Confluent Cloud is a fully managed Apache Kafka service available on all three major clouds. Try it for free today.
