Course: Apache Kafka® 101

Hands On: Get Started with Kafka in Minutes with Confluent Cloud

7 min
Tim Berglund, Sr. Director, Developer Advocacy (Course Presenter)

Start Apache Kafka® the easiest, fastest way possible using Confluent Cloud in this hello-world style, absolute beginner's quick start tutorial.

  1. Begin by setting up Confluent Cloud.
  2. Create your first Apache Kafka topic, put some messages into it, and read your messages out using both the Confluent Cloud Console and the Confluent Cloud CLI.
  3. Finally, produce additional messages to your topic directly from the CLI, viewing real-time data streams in both another CLI window and the Confluent Cloud Console.

Set Up Confluent Cloud

  1. Begin by heading over to the Confluent Cloud signup page and use the promo code KAFKA101 for $101 of free usage.
  2. Enter your name, email address and password.
  3. Click the Start Free button. (Make sure to keep track of your password, as you'll need it to log into Confluent Cloud later on.)

confluent-cloud-signup

  4. Watch your inbox for a confirmation email. Once you get the email, follow the link to proceed to the next step, where you can choose a Basic, Standard or Dedicated cluster.

create-cluster

Review the associated costs, but keep in mind that the amount of free cloud usage credit you receive with KAFKA101 will be more than enough to cover this exercise. Once you have made your selection, select Begin configuration.

  5. On the next page, choose your Cloud Provider, Region, and Availability (Zone). Costs will vary with these choices, but they are clearly shown at the bottom of the screen, so you'll know what you're getting.

create-cluster-step-two

  6. Continue to set up billing information. By entering KAFKA101 here as a promo code, you will receive an additional $101 of free usage. On the next screen, click Review to get one last look at the choices you've made.
  7. If everything checks out, give your cluster the name "Kafka 101" and select Launch cluster.

create-cluster-step-three

  8. It might take a minute or two for your cluster to provision, since quite a bit of work is being performed for you behind the scenes. (It would take a long time to stand up a Kafka cluster yourself, especially the first time.)

Set Up the Confluent Cloud CLI

  1. After you launch your cluster, you will be shown the following screen:

cluster

Click Set up the CLI. Confluent Cloud will detect your operating system and will provide a curl command for downloading the CLI:

set-up-cli

  2. Paste the curl command into a terminal to install the CLI.

    curl -L --http1.1 https://cnfl.io/ccloud-cli | sh -s -- -b /usr/local/bin
  3. You'll receive the latest version, but note that it's a good idea to update the CLI from time to time with:

    ccloud update

Create Your First Topic on Confluent Cloud

  1. Return to the Confluent Cloud web UI and select Topics from the left-hand menu, then click Create topic:

first-topics

  2. Name your topic "poems" (lowercase, matching the CLI commands later in this exercise). Normally you would leave the default partition count of 6, but enter 1 in this case (you'll learn more about partitioning later, in the Partitioning module). Select Create with defaults.

new-topic

  3. On the next screen, which displays your topic, select the "Messages" tab to view the contents of the topic (empty at this point). Select Produce a new message to this topic:

poems

This will open a UI that lets you enter a key and value for a new message (remember that a message, or an event, is a key/value pair).

  4. Delete the existing data in the Key and Value fields, and enter "1" for Key. For Value, enter a line from a poem that may sound familiar: "All that is gold does not glitter." In the Jump to offset field above, enter "0", and then select "0/Partition:0". Click Produce to add the message to your topic.

add-message-your-topic

  5. Add the next three lines of the poem in the same fashion (a client-code sketch of these same produce steps follows this list):

    "2", "Not all who wander are lost"
    "3", "The old that is strong does not wither"
    "4", "Deep roots are not harmed by the frost"

Add Your Confluent Cloud Credentials to the CLI

  1. On the left-hand menu in Confluent Cloud, select API Access. Then click Create key.

api-keys

  2. Select the Global access scope and click Next:

create-key-access-control

  3. Copy your key and secret to a scratch pad, give your key the description "General", and click Save. Then click Download and continue. Your credentials will be downloaded and you will be returned to a list of keys.

create-key-get-api-keys

  4. Return to the terminal and log into the CLI by entering the email address and password that you provided for Confluent Cloud earlier in the exercise (note that this is not the API key you just generated):

    ccloud login --save
  5. Next, get a list of the Kafka clusters in your account, which should be just one:

    ccloud kafka cluster list

    Note the Id of the cluster.

id-cluster

  6. Set your cluster as the default so you don't need to keep naming it with every command:

    ccloud kafka cluster use {Id}
  7. Add the API key that you got from Confluent Cloud:

    ccloud api-key store {API Key}

    After pressing Enter, you will be prompted to add the secret.

  8. Set the API key to use as the default for your only cluster:

    ccloud api-key use {API Key} --resource {cluster Id}
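
The API key and secret you just stored in the CLI are the same credentials a Kafka client uses to authenticate to your Confluent Cloud cluster: the key acts as the SASL/PLAIN username and the secret as the password. As a rough illustration, assuming the confluent-kafka Python client and placeholder values for the endpoint, key, and secret, that mapping looks like this:

    # Sketch: the API key/secret pair maps onto SASL/PLAIN client credentials.
    # Placeholders: substitute your own bootstrap endpoint, API key, and API secret.
    from confluent_kafka import Producer

    client_config = {
        "bootstrap.servers": "<BOOTSTRAP_ENDPOINT>",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<API_KEY>",     # the key you copied from Confluent Cloud
        "sasl.password": "<API_SECRET>",  # the matching secret
    }

    producer = Producer(client_config)  # producers, consumers, and admin clients all take the same settings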

Use the CLI to Consume From and Produce to Your Topic on Confluent Cloud

If you have followed this exercise chronologically, you now have a topic with events on Confluent Cloud, as well as an authenticated CLI. Now you can consume from Confluent Cloud in your CLI and also produce to Confluent Cloud from your CLI.

  1. Open a new terminal window (also keeping the one you used to authenticate to Confluent Cloud open), and enter a command to consume from your poems topic:

    ccloud kafka topic consume --from-beginning poems
  2. Back in your first terminal window, start a command-line producer to your poems topic on Confluent Cloud:

    ccloud kafka topic produce --parse-key --delimiter : --value-format string poems
  3. Now produce the next four lines of the poem to your poems topic using the open producer:

    5:"From the ashes a fire shall awaken"
    6:"A light from the shadows shall spring"
    7:"Renewed shall be blad that was broken"
    8:"The crownless again shall be king"

    As you produce, the consumer in your other window reads your messages:

ccloud-kafka

(Looks like Aragorn will be returning to the throne of Gondor after all.)
  4. Finally, return to the web UI on Confluent Cloud to see your newly produced messages there as well:

poems-messages
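
If you later want to read the poems topic from code rather than the CLI consumer above, a minimal sketch with the confluent-kafka Python client would look roughly like the following. The endpoint, API key, and secret are placeholders, "poems-reader" is just an example consumer group name, and auto.offset.reset=earliest mirrors the --from-beginning flag when the group has no committed position.

    # Minimal sketch: consume the poems topic from the beginning, like the CLI consumer above.
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "<BOOTSTRAP_ENDPOINT>",  # placeholder cluster endpoint
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<API_KEY>",
        "sasl.password": "<API_SECRET>",
        "group.id": "poems-reader",        # example consumer group name
        "auto.offset.reset": "earliest",   # start from the earliest offset if the group has none committed
    })
    consumer.subscribe(["poems"])

    try:
        while True:
            msg = consumer.poll(1.0)  # wait up to one second for the next message
            if msg is None:
                continue
            if msg.error():
                print(f"Consumer error: {msg.error()}")
                continue
            print(f"{msg.key().decode()}: {msg.value().decode()}")
    except KeyboardInterrupt:
        pass
    finally:
        consumer.close()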

And that's it. If you've followed the exercise all the way through, you've gotten some great things done: You successfully signed up for Confluent Cloud and created a cluster, you installed the CLI, you created a topic, you added messages to the topic using the web UI, you created an API key and added it to the CLI, and you used the console producer and consumer. You are well on your way to building things with Confluent Cloud!
