Course: Apache Kafka® Security

Hands On: Creating a Secure Connection to Your Kafka Cluster

7 min

Dan Weston

Senior Curriculum Developer

Creating a Secure Connection to Your Kafka Cluster

To get started quickly, let’s use Confluent Cloud to run Kafka for us.

  1. If you already have a Confluent Cloud account, feel free to use it; if not, sign up for a free account. Head to Confluent Cloud and sign in with your existing account, or click on Sign up and try it for free. On the sign-up page, enter your name, email, company, and password. Be sure to remember these sign-in details, as you'll need them to access your account later.

  2. Click the START FREE button and wait to receive a confirmation email in your inbox.

  3. The link in the confirmation email will lead you to the next step, where you'll be prompted to set up your cluster; you can choose between a Basic, Standard, or Dedicated cluster. Basic and Standard clusters are serverless offerings, so your free Confluent Cloud usage is only consumed by what you actually use, which is perfect for what we need today. You should have plenty of credit with your free promo code, but if you're not sure, usage costs are clearly shown at the bottom of the screen.

Note

Once we're done with this first example, don't forget to stop and delete any resources you created to avoid exhausting your free usage. After this class, you're encouraged to come back and experiment with your cluster to test things out.

  4. Select the cloud platform you want to run your cluster on and the region that is closest to you, and click Continue.

  5. Take one last look at the choices you've made and give your cluster a name (this simulation uses the name "purchases"), then click Launch cluster! It may take a few minutes for your cluster to be provisioned. And that's it!

  6. You'll receive an email once your cluster is provisioned, but in the meantime, let's go ahead and use the promo code we saw earlier: from the settings menu, choose Billing & payment. You'll see that you have $400 of free Confluent Cloud usage. If you select the Payment details & contacts tab, you can either add a credit card or enter a promo code. Enter 101SECURITY to get an additional $25 of free usage, which will be more than enough for this quick example. You can also use the promo code CONFLUENTDEV1 to delay entering a credit card for 30 days.

Now that our cluster is ready to go, let's see what it takes to connect securely, produce and consume some messages, and export the configuration you can use to connect a client.

  1. You’ll need to clone the exercises repository into a local directory. Run the following command:
git clone https://github.com/confluentinc/learn-kafka-courses.git

And change to the directory for this course:

cd learn-kafka-courses/fund-kafka-security
  2. You'll also need to pull the Docker images for Confluent cp-server and ZooKeeper:
docker pull confluentinc/cp-zookeeper:7.1.1-1-ubi8
docker pull confluentinc/cp-server:7.1.1-1-ubi8
  3. Next, you'll head over to your Confluent Cloud cluster to get the information you need to connect and to produce and consume messages. Once there, click on Data integration, then Clients.

  4. Click on Java under the heading Choose your language.

  5. Once you select your client of choice, you will be presented with the configuration needed to connect. If you already created an API key, you can copy the text and fill in the values yourself. If you don't have one, you can use the Create Kafka cluster API key button on the right side of the screen. Since this is a new cluster, that's what we're going to do. Give your API key a description so you know why it was created; for example, ours is titled "Fund-Kafka-Security-Course." Then click Download and continue.

This will create the key, download it to your machine, and will also fill in the correct fields in the configuration snippet on the left.

  6. Copy the configuration and create a new file in your code editor. We already have the working directory we cloned from GitHub open, so all we need to do is create a new file named getting-started.properties, paste in the values, and save it. (A sketch of what this file typically contains is shown below.)
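For reference, the client configuration for Confluent Cloud generally looks something like the following. This is only an illustrative sketch with placeholder values; the snippet you download already contains your actual bootstrap server, API key, and API secret, so use that instead.

# Connection settings for the Confluent Cloud cluster (placeholder values shown)
bootstrap.servers=<YOUR_BOOTSTRAP_SERVER>:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
# The API key and secret act as the username and password for SASL/PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='<YOUR_API_KEY>' password='<YOUR_API_SECRET>';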

  7. Head back to your Confluent Cloud cluster to create a new topic by clicking on Topics, then Create topic. We'll name it "purchases," set the number of partitions to one, and leave the remaining defaults by clicking Create with defaults.

  8. Head back to your code editor and open up the terminal. You'll also want to make sure that you are in the fund-kafka-security directory.

  9. Rather than standing up a long-running local deployment, we'll simply run the console producer from the cp-kafka image in a one-off container:

docker run -it -v $PWD/getting-started.properties:/share/getting-started.properties confluentinc/cp-kafka:7.1.1-1-ubi8 kafka-console-producer --bootstrap-server <your Confluent Cloud bootstrap server>:9092 --topic purchases --producer.config /share/getting-started.properties

This will open up the console producer, where you can begin to produce messages directly to your Confluent Cloud cluster. And since our configuration file uses SASL_SSL, which we'll talk about later in the course, all of our messages are encrypted in transit, away from any prying eyes.
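If you'd rather connect with a Java client than the console tools, the same properties file works there too. The following is a minimal sketch, not part of the exercise; the class name, file path, and record values are illustrative:

import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SecureProducerExample {
    public static void main(String[] args) throws IOException {
        // Load the same SASL_SSL connection settings used by the console producer
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("getting-started.properties")) {
            props.load(in);
        }
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // Send a single record to the "purchases" topic over the encrypted connection
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("purchases", "demo-key", "Kafka Rocks"));
            producer.flush();
        }
    }
}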

  10. Produce some messages:
Kafka
Rocks
  11. Now let's go ahead and run our console consumer to see the messages we produced:
docker run -it -v $PWD/getting-started.properties:/share/getting-started.properties confluentinc/cp-kafka:7.1.1-1-ubi8 kafka-console-consumer --bootstrap-server <your Confluent Cloud bootstrap server>:9092 --topic purchases --from-beginning --consumer.config /share/getting-started.properties
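Similarly, here is a minimal sketch of the consumer side in Java, again reusing the same properties file (the class name and consumer group id are illustrative, and this is not part of the exercise):

import java.io.FileInputStream;
import java.io.IOException;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class SecureConsumerExample {
    public static void main(String[] args) throws IOException {
        // Reuse the SASL_SSL connection settings from getting-started.properties
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("getting-started.properties")) {
            props.load(in);
        }
        props.put("group.id", "getting-started-consumer"); // illustrative consumer group
        props.put("auto.offset.reset", "earliest");        // same idea as --from-beginning
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("purchases"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(10));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("key=%s value=%s%n", record.key(), record.value());
            }
        }
    }
}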

That's it! You were able to create a new cluster, create a new topic, and produce and consume messages, all over a secure, encrypted connection. As you'll see throughout this course, things get a lot more complex when you take on securing Kafka yourself.

