Course: Apache Flink® 101

Confluent Cloud Setup for the Hands-on Exercises

10 min
David Anderson

Principal Software Practice Lead


The hands-on exercises for this course come in two variants: one that uses Confluent Cloud, and one that uses open source Apache Flink and Apache Kafka running in Docker.

To get started with Confluent Cloud, follow the instructions below. For Docker, see the instructions here.

The instructions below create the resources you will need in later exercises. Specifically, you will set up:

  • a Confluent Cloud account
  • a Confluent Cloud environment
  • a Kafka cluster for the streaming data
  • a Flink compute pool to run Flink SQL queries

Create an account on Confluent Cloud

If you already have an account on Confluent Cloud, then go ahead and sign in.

Otherwise:

  • Sign up for a new account.
  • Click the "Verify email address" link in the email you will receive.
  • Continue through the signup process until you've completed the survey.
  • After completing the survey, there's no need to create a cluster; instead, follow the instructions in the section below.

Add Promo Codes

Navigate to the Billing & payment page.

  1. Click on the Payment details & contacts tab.
  2. At the very bottom, add both of these promo codes:
     FLINK101
     CONFLUENTDEV1

The $25 credit will be more than sufficient for the exercises in this course.

You can now close your browser, or if you wish to explore the UI, navigate to Confluent Cloud's Home page.

Rather than using the web UI to create the resources used in this course one by one, you can use the Confluent CLI to create them all at once.

Install the CLI

Bring up a command line terminal and install the Confluent CLI. Instructions can be found on the Confluent CLI documentation page. If you are on a Mac, you can install it via Homebrew:

brew install confluentinc/tap/cli
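
To confirm the installation succeeded, you can check the version (a quick sanity check; the exact output depends on the release you installed):

confluent version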

Save your login credentials to avoid logging in each time:

confluent login --save

Next, install the Flink quickstart plugin and use it to provision everything at once:

confluent plugin install confluent-flink-quickstart
confluent flink quickstart \
    --name flink101 \
    --max-cfu 10 \
    --region us-central1 \
    --cloud gcp

This will create a new Confluent Cloud environment named flink101_environment with the following resources:

  • Schema Registry enabled
  • a Kafka cluster named flink101_kafka-cluster
  • a Flink compute pool named flink101

The --max-cfu 10 flag puts an upper bound on how many resources the compute pool can consume, measured in CFUs, the logical unit of processing power that Confluent Cloud uses for billing.
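
If you want to double-check what was provisioned, you can list the new resources from another terminal (a quick check; this assumes the quickstart left the new environment selected as your active environment):

confluent kafka cluster list
confluent flink compute-pool list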

After a few moments, you will enter an interactive Flink shell where you can start running queries:

Welcome!
To exit, press Ctrl-Q or type "exit".

[Ctrl-Q] Quit [Ctrl-S] Toggle Completions [Ctrl-G] Toggle Diagnostics
>

Note: after exiting the Flink shell, you can return to it later by running confluent flink shell.
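
To confirm that the compute pool is working, you can run a trivial query from the shell (a minimal smoke test; any simple SELECT will do):

SELECT CURRENT_TIMESTAMP;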

Finish

You now have a fully functioning Kafka and Flink environment.

You should eventually delete all of the resources you have created, but you may want to wait until you have finished all of the exercises.

These resources live in the flink101_environment environment, so a straightforward way to ensure that everything has been cleaned up is to delete that environment; doing so deletes both the Kafka cluster and the Flink compute pool you've been using. Run the following command in your terminal to get the ID of the environment named flink101_environment, which will be of the form env-123456:

confluent environment list

Now delete that environment:

confluent environment delete {Environment ID}
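
If you prefer a single command, you can extract the ID from the CLI's JSON output (a sketch, assuming jq is installed and that the environment still has its default name; the delete command may prompt you to confirm):

confluent environment delete $(confluent environment list --output json | jq -r '.[] | select(.name == "flink101_environment") | .id')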

You can also manage these resources in your browser, at https://confluent.cloud/environments, if you prefer.


Do you have questions or comments? Join us in the #confluent-developer community Slack channel to engage in discussions with the creators of this content.
