Senior Developer Advocate (Presenter)
Start Apache Kafka in the easiest, fastest way possible using Confluent Cloud in this Hello World style, beginner's quick start tutorial.
Head over to the Confluent Cloud signup page and enter your name, email address, and password.
Click the Start Free button. (Make sure to keep track of your password, as you'll need it to log in to Confluent Cloud later on.)
Watch your inbox for a confirmation email. Once you get the email, follow the link to proceed to the next step, where you should choose a Basic cluster.
A Basic cluster used in the context of this exercise won't incur much cost, and the amount of free usage you receive, along with the promo code KAFKA101 for $25 of free Confluent Cloud usage (details), will be more than enough to cover it. At the end of the course, we'll walk you through how to delete the cluster to avoid any future billing. Select Begin configuration to start.
On the next screen, click Review to get one last look at the choices you've made. If everything checks out, give your cluster a name, and select Launch cluster.
While your cluster is being provisioned, set up the KAFKA101 promo code by navigating to Billing & payment from the settings menu in the upper right. On that screen, go to the Payment details & contacts tab to enter the promo code.
From the Confluent Cloud landing page, select the Topics tab on the left-hand side of the screen, then choose Create topic.
Name your topic "poems." The default number of partitions for a topic is six, which works well for today’s use case, so go ahead and select Create with defaults.
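As an aside, once the Confluent CLI is installed and authenticated (which this tutorial covers later), the same topic can be created from the command line instead of the web console. This is just a sketch of the equivalent step:

```shell
# CLI equivalent of the "Create topic" step in the web console
# (requires the authenticated CLI setup described later in this tutorial).
confluent kafka topic create poems --partitions 6
```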
In the next screen, which displays your topic, select the Messages tab to view the contents of the topic (which is empty at this point). Select Produce a new message to this topic: This will open a UI that lets you enter a key and value for a new message (remember that a message, or an event, is a key/value pair).
Delete the existing data in the key and value fields, and enter "1" for the key. For the value, enter a line from a poem that may sound familiar, such as, "All that is gold does not glitter." Click Produce to add the message to your topic. Then, in the Jump to offset field above, enter "0" and select "0/Partition: 3" to view the message you just wrote.
In a similar fashion, add the next few lines of the poem:
2, "Not all who wander are lost"
3, "The old that is strong does not wither"
4, "Deep roots are not harmed by the frost"
The four messages will be distributed amongst the six partitions of the topic. Using the Jump to offset field, explore the remaining partitions of the topic to see where each of these messages was written.
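To build intuition for how those messages were spread out: Kafka's default partitioner hashes each message key (using murmur2) and takes the result modulo the number of partitions. The sketch below mimics that idea using cksum in place of murmur2, so the partition numbers it prints are illustrative only and will not match the partitions Confluent Cloud actually chose:

```shell
# Illustration of hash-based partitioning: hash each key, then take the
# hash modulo the partition count. (Kafka really uses murmur2, not cksum,
# so these numbers won't match your cluster's actual assignments.)
PARTITIONS=6
for key in 1 2 3 4; do
  hash=$(printf '%s' "$key" | cksum | cut -d' ' -f1)
  echo "key=$key -> partition $((hash % PARTITIONS))"
done
```

The takeaway is that messages with the same key always land in the same partition, while different keys are spread across partitions.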
From Confluent Cloud, select CLI and tools from the lower left-hand corner of the screen. From here, you’ll find instructions on how to download and update the command line tools that we’ll be using.
Paste the curl command into a terminal to install the CLI:
curl -L --http1.1 https://cnfl.io/cli | sh -s -- -b /usr/local/bin
Update the CLI to the latest version:
confluent update
Then log in to Confluent Cloud. (The --save flag saves your login details locally so that you don't have to reenter your credentials so frequently.)
confluent login --save
List your environments:
confluent environment list
If your account is new, you should expect to see only one environment. Observe the output from this command, particularly the ID field.
Using the ID value from the previous step, run:
confluent environment use {ID}
List your Kafka clusters:
confluent kafka cluster list
Again, observe the ID that's output. Then set the Kafka cluster by running:
confluent kafka cluster use {ID}
Using that same cluster ID, create an API key by running:
confluent api-key create --resource {ID}
This command will output an API key and secret; save these securely somewhere. To tell the CLI to use the API key, gather the cluster ID along with the API key and execute:
confluent api-key use {API Key} --resource {ID}
Now your CLI is set up and ready to use!
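Putting the steps above together, the whole CLI setup after `confluent login --save` can be sketched as a short script. The IDs and API key shown here are placeholders you'd replace with the values from your own account:

```shell
# Sketch of the full CLI setup from the steps above (assumes you have
# already run `confluent login --save`). Replace the placeholder IDs
# with the values printed by the corresponding list commands.
ENV_ID="{ID}"        # from `confluent environment list`
CLUSTER_ID="{ID}"    # from `confluent kafka cluster list`

confluent environment use "$ENV_ID"
confluent kafka cluster use "$CLUSTER_ID"
confluent api-key create --resource "$CLUSTER_ID"   # note the key and secret it prints
confluent api-key use "{API Key}" --resource "$CLUSTER_ID"
```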
If you have followed this exercise chronologically, you now have a topic with events on Confluent Cloud, as well as an authenticated CLI. Now you can consume from Confluent Cloud in your CLI and also produce to Confluent Cloud from your CLI.
1. List the topics in your cluster.
confluent kafka topic list
You should see the poems topic that we created earlier.
2. Consume messages from the poems topic.
confluent kafka topic consume --from-beginning poems
The --from-beginning flag tells the consumer to start from the earliest known offset on the topic, i.e., the earliest message. Leave this consumer running in the terminal window.
3. From another terminal window, begin to produce more messages to the topic. Execute the produce command with the --parse-key flag to automatically read both keys and values, separated by the ":" symbol.
confluent kafka topic produce poems --parse-key
When prompted, enter the following strings as written:
5:"From the ashes a fire shall awaken"
6:"A light from the shadows shall spring"
7:"Renewed shall be blade that was broken"
8:"The crownless again shall be king"
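Since the producer reads key:value lines from standard input, the same four messages could also be piped in non-interactively, for example with a here-doc. This is a sketch, assuming your CLI session from the earlier steps is still authenticated:

```shell
# Sketch: pipe the four key:value messages into the producer via stdin
# instead of typing them at the interactive prompt.
confluent kafka topic produce poems --parse-key <<'EOF'
5:"From the ashes a fire shall awaken"
6:"A light from the shadows shall spring"
7:"Renewed shall be blade that was broken"
8:"The crownless again shall be king"
EOF
```

Back in the consumer terminal, you should see each message appear as it is produced.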
And that's it! If you've followed this exercise all the way through, you accomplished a number of things: You signed up for Confluent Cloud and created a cluster, created a topic, added messages to the topic using the web console, installed the CLI and created an API key, and finally used the CLI producer and consumer. Phew!
With that, you are well on your way to building great things with Confluent Cloud. Continue learning by following along in the remaining modules and exercises.