course: Apache Kafka® 101

Hands On: Your First Kafka Application in 10 Minutes or Less

10 min
Danica Fine

Senior Developer Advocate (Presenter)

Getting Started with Apache Kafka

Start Apache Kafka in the easiest, fastest way possible using Confluent Cloud in this Hello World style, beginner's quick start tutorial.

  1. Begin by setting up Confluent Cloud.
  2. Create your first Kafka topic, put some messages into it, and read your messages out using both the Confluent Cloud Console and the Confluent CLI.
  3. Finally, produce additional messages to your topic directly from the CLI, viewing real-time data streams in both another CLI window and the Confluent Cloud Console.

Set up Confluent Cloud

  1. Head over to the Confluent Cloud signup page and enter your name, email address, and password.

  2. Click the Start Free button. (Make sure to keep track of your password, as you'll need it to log in to Confluent Cloud later on.)

    sign-up

  3. Watch your inbox for a confirmation email. Once you get the email, follow the link to proceed to the next step, where you should choose a Basic cluster.

    create-cluster

Basic clusters used in the context of this exercise won't incur much cost, and the amount of free usage that you receive, along with the promo code KAFKA101 for $101 of free Confluent Cloud usage, will be more than enough to cover it. At the end of the course, we'll walk you through how to delete the cluster to avoid any future billing. Select Begin configuration to start.

  4. On the next page, choose your cloud provider, region, and availability (zone). Costs will vary with these choices, but they are clearly shown in the dropdown, so you'll know what you're getting.

    cluster-cloud-provider-and-region

  5. On the next screen, click Review to get one last look at the choices you've made. If everything checks out, give your cluster a name, and select Launch cluster.

    launch-cluster

  6. While your cluster is being provisioned, set up the KAFKA101 promo code by navigating to Billing & payment from the settings menu in the upper right. On that screen, go to the Payment details & contacts tab to enter the promo code.

    billing-and-payment

Create Your First Kafka Topic on Confluent Cloud

  1. From the Confluent Cloud landing page, select the Topics tab on the left-hand side of the screen, then choose Create topic.

    create-topic

  2. Name your topic "poems." The default number of partitions for a topic is six, which works well for today’s use case, so go ahead and select Create with defaults.

    create-with-defaults

  3. In the next screen, which displays your topic, select the Messages tab to view the contents of the topic (which is empty at this point). Select Produce a new message to this topic: This will open a UI that lets you enter a key and value for a new message (remember that a message, or an event, is a key/value pair).

  4. Delete the existing data in the key and value fields, and enter "1" for the key. For the value, enter a line from a poem that may sound familiar, such as, "All that is gold does not glitter." Click Produce to add the message to your topic. Then, in the Jump to offset field above, enter "0" and select "0/Partition:3" to see where the message was written.

    add-message

  5. In a similar fashion, add the next few lines of the poem:
    2, "Not all who wander are lost"
    3, "The old that is strong does not wither"
    4, "Deep roots are not harmed by the frost"

  6. The four messages will be distributed among the six partitions of the topic. Using the Jump to offset field, explore the remaining partitions of the topic and see the partitions to which these messages were written.
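How does a message's key determine its partition? As a rough sketch, the Kafka Java client's default partitioner takes a murmur2 hash of the key bytes, modulo the number of partitions. The Python below is an illustration of that scheme, not Confluent's actual code, and the Cloud Console may route keys differently, so don't expect it to reproduce the exact partition assignments you see in the UI.

```python
# Sketch of key-based partitioning, assuming the murmur2 hash used by the
# Kafka Java client's default partitioner. Illustration only.

def murmur2(data: bytes) -> int:
    """32-bit murmur2 hash, ported from the variant in the Kafka Java client."""
    m, r, mask = 0x5BD1E995, 24, 0xFFFFFFFF
    h = (0x9747B28C ^ len(data)) & mask
    # Mix the input four bytes at a time (little-endian words).
    for i in range(0, len(data) - len(data) % 4, 4):
        k = int.from_bytes(data[i:i + 4], "little")
        k = (k * m) & mask
        k ^= k >> r
        k = (k * m) & mask
        h = (h * m) & mask
        h ^= k
    # Fold in the remaining 1-3 tail bytes.
    rem = len(data) % 4
    tail = len(data) - rem
    if rem >= 3:
        h ^= data[tail + 2] << 16
    if rem >= 2:
        h ^= data[tail + 1] << 8
    if rem >= 1:
        h ^= data[tail]
        h = (h * m) & mask
    # Final avalanche.
    h ^= h >> 13
    h = (h * m) & mask
    h ^= h >> 15
    return h

def partition_for(key: bytes, num_partitions: int) -> int:
    """Map a message key to a partition, default-partitioner style."""
    return (murmur2(key) & 0x7FFFFFFF) % num_partitions

for key in (b"1", b"2", b"3", b"4"):
    print(key.decode(), "-> partition", partition_for(key, 6))
```

The key takeaway: the same key always hashes to the same partition (for a fixed partition count), which is what guarantees per-key ordering.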

Set Up the Confluent CLI

  1. From Confluent Cloud, select CLI and tools from the lower left-hand corner of the screen. From here, you’ll find instructions on how to download and update the command line tools that we’ll be using.

    cli-and-tools-cloud

  2. Paste the curl command into a terminal to install the CLI.

curl -L --http1.1 https://cnfl.io/cli | sh -s -- -b /usr/local/bin
  3. You'll receive the latest version, but note that it's a good idea to update the CLI from time to time with the following:
confluent update
  4. From a terminal, log in to Confluent Cloud with the credentials that you used to create your Confluent Cloud account. (The --save flag saves your login details locally so that you don’t have to reenter your credentials so frequently.)
confluent login --save
  5. Next, determine your Confluent environment by running:
confluent environment list

If your account is new, you should expect to only see one environment. Observe the output from this command, particularly the ID field.

  6. Using the ID value from the previous step, run:
confluent environment use {ID}
  7. Similarly, list out all of the Kafka clusters available to you using the following:
confluent kafka cluster list

Again, observe the ID that’s output. Then set the Kafka cluster by running:

confluent kafka cluster use {ID}
  8. In order to communicate with your Kafka cluster, you need to provide an API key and secret for the CLI to use. Using the cluster ID from the previous step, run:
confluent api-key create --resource {ID}

This command will output an API key and secret; save these securely somewhere. To tell the CLI to use the API key, gather the cluster ID along with the API key and execute:

confluent api-key use {API Key} --resource {ID}

Now your CLI is set up and ready to use!
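The same API key and secret also work outside the CLI. As a hypothetical example, a Kafka client application built on librdkafka (or a similar client) would connect to your Confluent Cloud cluster with configuration along these lines, where {BOOTSTRAP_SERVER} comes from your cluster's settings page and {API key}/{API secret} are the values you just created (librdkafka-style property names shown; the Java client spells some of these differently):

```properties
bootstrap.servers={BOOTSTRAP_SERVER}
security.protocol=SASL_SSL
sasl.mechanisms=PLAIN
sasl.username={API key}
sasl.password={API secret}
```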

Produce and Consume Using the Confluent CLI

If you have followed this exercise chronologically, you now have a topic with events on Confluent Cloud, as well as an authenticated CLI. Now you can consume from Confluent Cloud in your CLI and also produce to Confluent Cloud from your CLI.

  1. From a terminal window, list out all of the topics available to you:
confluent kafka topic list

You should see the poems topic that we created earlier.

  2. Consume messages from the poems topic:

confluent kafka topic consume --from-beginning poems

The --from-beginning flag tells the consumer to start from the earliest known offset on the topic, i.e., the earliest message. Leave this consumer running in the terminal window.

  3. From another terminal window, begin to produce more messages to the topic. Execute the produce command with the --parse-key flag to automatically read both keys and values, separated by the “:” symbol:

confluent kafka topic produce poems --parse-key

When prompted, enter the following strings as written:

	5:"From the ashes a fire shall awaken"
	6:"A light from the shadows shall spring"
	7:"Renewed shall be blade that was broken"
	8:"The crownless again shall be king"
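The --parse-key behavior amounts to splitting each input line on the first “:” into a key and a value. A minimal Python sketch of that parsing, as an illustration of the behavior rather than the CLI's actual implementation:

```python
def parse_line(line: str, delimiter: str = ":") -> tuple[str, str]:
    """Split a produce input line into (key, value) on the first delimiter."""
    key, _, value = line.partition(delimiter)
    return key, value

key, value = parse_line('5:"From the ashes a fire shall awaken"')
print(key)    # prints: 5
print(value)  # prints: "From the ashes a fire shall awaken"
```

Because only the first delimiter is significant, values are free to contain “:” themselves.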
  4. Observe the messages as they’re being output in the consumer terminal window.
  5. Navigate to Confluent Cloud. From the poems topic overview page, select the Messages tab. In the Jump to offset field, enter "0" and select various partitions to observe where the new messages have been written.

And that's it! If you've followed this exercise all the way through, you accomplished a number of things: You signed up for Confluent Cloud and created a cluster, created a topic, added messages to the topic using the web console, installed the CLI and created an API key, and finally used the CLI producer and consumer. Phew!

With that, you are well on your way to building great things with Confluent Cloud. Continue learning by following along in the remaining modules and exercises.
