course: Governing Data Streams

Hands-On: Setup Confluent Cloud

Wade Waldron

Staff Software Practice Lead

Hands-On: Setup Confluent Cloud and Produce Data

In this exercise, you will set up a Confluent Cloud cluster that you will use for the rest of the course.

Note: To ensure isolation and easy cleanup, we do not recommend using an existing cluster.

Register for Confluent Cloud

Note: If you already have a Confluent Cloud account, you can skip ahead to Create a New Environment.

  1. Head over to the Confluent Cloud signup page and enter your name, email address, and password.

  2. Click the Start Free button. (Make sure to keep track of your password, as you'll need it to log in to Confluent Cloud later on.)

    Sign up menu
  3. Watch your inbox for a confirmation email. Once you get the email, follow the link to proceed.

  4. At this point, you will be asked to create a cluster. You can proceed if you like, but since we'll be creating a new cluster in a later step, you can safely skip this.

Create a New Environment

Now that you have Confluent Cloud set up, we are going to create an environment specifically for this course. This will ensure your environment is isolated from any other work you might be doing, and it will make cleanup easier.

Note: If you prefer using the Command Line, you can refer to the Command Line Reference sections that follow many of the instructions.

  1. From the left-hand navigation menu, select "Environments".

    Environments Menu
  2. Click + Add cloud environment.

    Add Cloud Environment Button
  3. Name your environment "governing-streams".

  4. When offered a choice on which Stream Governance Package to choose, click Begin Configuration under the Essentials option.

    Choosing a Stream Governance Package

    You'll have an opportunity to experiment with the Advanced Governance package in a later exercise, but for now, we'll stick with the essentials.

  5. Select the cloud and region where you want to create your Schema Registry and Stream Catalog (i.e., where the metadata will be stored).

Command Line Reference

If you prefer, you can also do this from the command line interface by running:

confluent environment create governing-streams

Once your environment is created, you will need to make it the active environment.

First, list the environments and locate the ID for the "governing-streams" environment.

confluent environment list

Next, set it as the active environment:

confluent environment use <environment id>
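The list-then-use steps above can also be scripted. The sketch below parses the table that `confluent environment list` prints; the sample output shape is an assumption based on the CLI's default table format, so verify the column layout against your CLI version before relying on it:

```shell
# Sample of what `confluent environment list` prints (layout assumed;
# verify against your CLI version):
ENV_TABLE='  Current |     ID     |       Name
------------+------------+-------------------
            | env-abc123 | governing-streams
      *     | env-def456 | default'

# Pull the ID column for the governing-streams row.
ENV_ID=$(printf '%s\n' "$ENV_TABLE" | awk -F'|' '/governing-streams/ { gsub(/[* ]/, "", $2); print $2 }')
echo "$ENV_ID"

# Then make it the active environment:
# confluent environment use "$ENV_ID"
```

In practice you would replace the `ENV_TABLE` sample with the live command output, e.g. `ENV_TABLE=$(confluent environment list)`.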

Create a Cluster

Next, we need to create a Kafka Cluster for the course.

  1. Inside the governing-streams environment click Create cluster on my own. You'll be given a choice of what kind of cluster to create.

    Create a Cluster Menu

    Basic clusters used in this exercise won't incur much cost. The free usage you receive, along with the promo code GOVERNINGSTREAMS101 for $25 of free Confluent Cloud usage, will be more than enough to cover it.

  2. On the next page, choose your cloud provider, region, and availability (zone). Costs vary with these choices, but they are clearly shown in the dropdown, so you'll know what you're getting.

    Choose a Cloud Provider Menu
  3. Next, you will be asked to enter your credit card information. Feel free to choose the "Skip Payment" option at the bottom of the screen.

    Skip Payment Button
  4. On the next screen, click Review to get one last look at the choices you've made. If everything checks out, give your cluster a name, and select Launch cluster.

    Launch Cluster Menu
  5. While your cluster is being provisioned, set up the GOVERNINGSTREAMS101 promo code by navigating to Billing & payment from the settings menu in the upper right. On that screen, go to the Payment details & contacts tab to enter the promo code.

    Billing and Payment Menu

Command Line Reference

If you prefer, you can do this from the command line interface by running the following (adjust the cluster name, cloud, and region settings as appropriate):

confluent kafka cluster create <Your Cluster Name> --cloud gcp --region us-central1 --type basic
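Later commands need the new cluster's ID. A sketch of pulling it out of the create command's JSON output with standard tools; the `id` field name and JSON shape are assumptions, so inspect the real `-o json` output before relying on them:

```shell
# Sample JSON from `confluent kafka cluster create ... -o json`
# (shape assumed; inspect the real output before relying on it):
CREATE_OUT='{"id":"lkc-xyz789","name":"governing-streams-cluster","type":"BASIC"}'

# Extract the "id" field without extra dependencies, using grep and sed.
CLUSTER_ID=$(printf '%s' "$CREATE_OUT" | grep -o '"id":"[^"]*"' | sed 's/.*:"//; s/"$//')
echo "$CLUSTER_ID"

# Then make it the active cluster for later commands:
# confluent kafka cluster use "$CLUSTER_ID"
```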

Add an API Key

We will need an API Key to allow applications to access our cluster, so let's create one.

  1. From the left-hand navigation in your cluster, navigate to Cluster Overview > API keys.

    API Keys Menu
  2. Create a key with Global access.

    Note: In a secure production environment, you would want to select Granular access and scope the key's permissions more narrowly.

  3. Download and save the key somewhere for future use.
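The downloaded key and secret are what a Kafka client uses to authenticate against the cluster. Below is a sketch of the client configuration they feed into, written as a shell heredoc; the bootstrap server is a placeholder you'd copy from your Cluster Settings, and the SASL/PLAIN settings follow the standard Kafka client pattern for Confluent Cloud:

```shell
# Placeholders only: substitute your downloaded key and secret.
API_KEY="<your api key>"
API_SECRET="<your api secret>"

# Write a minimal Kafka client config using the key for SASL/PLAIN auth.
cat > client.properties <<EOF
bootstrap.servers=<your bootstrap server>
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='${API_KEY}' password='${API_SECRET}';
EOF
```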

Command Line Reference

If you prefer, you can do this from the command line interface by running the following (adjust the cluster name and ID as appropriate):

confluent api-key create --resource <Your Cluster Id> --description <Your Cluster Name>-key -o json > creds/<Your Cluster Name>-key.json
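The saved JSON file contains the key and secret your applications will need. A sketch of loading them into variables with standard tools; the `api_key` and `api_secret` field names are assumptions based on the CLI's JSON output, so check them against your actual file:

```shell
# Sample of the saved creds file (field names assumed; check your actual file).
# In practice, read it from disk instead:
#   CREDS=$(cat "creds/<Your Cluster Name>-key.json")
CREDS='{"api_key":"ABCDEF123456","api_secret":"s3cr3t+value"}'

# Pull out the key and secret without extra dependencies.
API_KEY=$(printf '%s' "$CREDS" | grep -o '"api_key":"[^"]*"' | sed 's/.*:"//; s/"$//')
API_SECRET=$(printf '%s' "$CREDS" | grep -o '"api_secret":"[^"]*"' | sed 's/.*:"//; s/"$//')
echo "$API_KEY"
```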

Add an API Key for the Schema Registry

We'll also need an API Key for accessing the Schema Registry.

  1. From the main menu (top right) or the breadcrumb navigation (top) select Environments.

  2. Select the governing-streams environment.

  3. In the right-hand menu there should be an option to Add key. Select it and create a new API Key.

    Add Key Button
  4. Download and save the key somewhere for future use.

Command Line Reference

If you prefer, you can do this from the command line interface by running the following (adjust the resource and description as necessary):

confluent api-key create --resource <Your Schema Registry Id> --description <Description of this Key> -o json

Note: Make sure you save the key and secret somewhere.

You can obtain your Schema Registry ID by running the following and looking for the "Cluster ID" field:

confluent schema-registry cluster describe
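If you want to script the lookup, the sketch below pulls the "Cluster ID" value out of the describe command's table output; the table layout shown is an assumption, so verify it against what your CLI version actually prints:

```shell
# Sample output from `confluent schema-registry cluster describe`
# (table layout assumed; verify with your CLI version):
SR_DESCRIBE='+----------------+-------------+
| Name           | Stream Gov  |
| Cluster ID     | lsrc-abc123 |
+----------------+-------------+'

# Pull the value in the column next to "Cluster ID".
SR_ID=$(printf '%s\n' "$SR_DESCRIBE" | awk -F'|' '/Cluster ID/ { gsub(/ /, "", $3); print $3 }')
echo "$SR_ID"

# Then pass it to the api-key command above:
# confluent api-key create --resource "$SR_ID" --description sr-key -o json
```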

Finish

This brings us to the end of this exercise.
