
Apache Kafka® Quick Start

The guide below demonstrates how to quickly get started with Apache Kafka. You'll connect to a broker, create a topic, produce some messages, and consume them. Be sure to also check out the client code examples to learn more.

Docker

1. Set up a Kafka broker

The Docker Compose file below will run everything for you via Docker. Copy and paste it into a file named docker-compose.yml on your local filesystem. Note that this quickstart runs Kafka with ZooKeeper while Kafka Raft (KRaft) is in preview for Confluent Platform.

---
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.3.2
    container_name: zookeeper
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  broker:
    image: confluentinc/cp-kafka:7.3.2
    container_name: broker
    ports:
    # To learn about configuring Kafka for access across networks see
    # https://www.confluent.io/blog/kafka-client-cannot-connect-to-broker-on-aws-on-docker-etc/
      - "9092:9092"
    depends_on:
      - zookeeper
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_INTERNAL:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092,PLAINTEXT_INTERNAL://broker:29092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1
      KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1

2. Start the Kafka broker

From a directory containing the docker-compose.yml file created in the previous step, run this command to start all services in the correct order.

docker compose up -d
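
Optionally, you can confirm that both containers are up and peek at the broker's startup output before moving on (an extra verification step, not required):

docker compose ps
docker compose logs --tail 10 broker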

3. Create a topic

Kafka stores messages in topics. It’s good practice to explicitly create them before using them, even if Kafka is configured to create them automatically when they are first referenced.

Run this command to create a new topic into which we’ll write and read some test messages.

docker exec broker \
kafka-topics --bootstrap-server broker:9092 \
             --create \
             --topic quickstart
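
To double-check that the topic was created, you can describe it (optional):

docker exec broker \
kafka-topics --bootstrap-server broker:9092 \
             --describe \
             --topic quickstart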

4. Write messages to the topic

You can use the kafka-console-producer command line tool to write messages to a topic. This is useful for experimentation (and troubleshooting), but in practice you’ll use the Producer API in your application code, or Kafka Connect to pull data into Kafka from other systems.

Run this command. You’ll notice that nothing seems to happen—fear not! It is waiting for your input.

docker exec --interactive --tty broker \
kafka-console-producer --bootstrap-server broker:9092 \
                       --topic quickstart

Type in some lines of text. Each line is a new message.

this is my first kafka message
hello world!
this is my third kafka message. I’m on a roll :-D

When you’ve finished, enter Ctrl-C to return to your command prompt.
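
Kafka messages can also carry a key. If you’d like to experiment with keyed messages, the console producer can parse them from your input; the ":" separator below is just an example choice:

docker exec --interactive --tty broker \
kafka-console-producer --bootstrap-server broker:9092 \
                       --topic quickstart \
                       --property parse.key=true \
                       --property key.separator=:

A line like user1:hello is then sent with key "user1" and value "hello".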

5. Read messages from the topic

Now that we’ve written messages to the topic, we’ll read those messages back. Run this command to launch the kafka-console-consumer. The --from-beginning argument means that messages will be read from the start of the topic.

docker exec --interactive --tty broker \
kafka-console-consumer --bootstrap-server broker:9092 \
                       --topic quickstart \
                       --from-beginning

As before, this is useful for trying things on the command line, but in practice you’ll use the Consumer API in your application code, or Kafka Connect to read data from Kafka and push it to other systems.

You’ll see the messages that you entered in the previous step.

this is my first kafka message
hello world!
this is my third kafka message. I’m on a roll :-D
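
If you produced any keyed messages, you can also ask the console consumer to print the keys alongside the values (optional):

docker exec --interactive --tty broker \
kafka-console-consumer --bootstrap-server broker:9092 \
                       --topic quickstart \
                       --from-beginning \
                       --property print.key=true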

6. Write some more messages

Leave the kafka-console-consumer command from the previous step running. If you’ve already closed it, just rerun it.

Now open a new terminal window and run the kafka-console-producer again.

docker exec --interactive --tty broker \
kafka-console-producer --bootstrap-server broker:9092 \
                       --topic quickstart

Enter some more messages and note how they are displayed almost instantaneously in the consumer terminal.

Enter Ctrl-C in the producer and consumer terminals to exit each client program.

7. Stop the Kafka broker

Once you’ve finished, you can shut down the Kafka broker. Note that doing this will destroy all messages that you’ve written.

From a directory containing the docker-compose.yml file created earlier, run this command to stop all services in the correct order.

docker compose down
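
If you’d rather keep the containers (and the messages stored in them) around for later, you can stop and restart them instead; only docker compose down removes the containers and, with them, your data:

docker compose stop
docker compose start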

What's Next

  • Build Apps
  • Build Pipelines
  • Operate

Tutorials with Full Code Examples

Learn the basics

Step through the basics of the CLI, Kafka topics, and building applications.

Explore top use cases

Run pre-built ksqlDB recipes that tackle the highest-impact use cases for stream processing.

Master advanced concepts

Learn how to route events, manipulate streams, aggregate data, and more.

Get Started with Kafka Clients

Write your first application using these full code examples in Java, Python, Go, .NET, Node.js, C/C++, REST, Spring Boot, and other languages and CLIs.

