course: Apache Flink® Table API: Processing Data Streams In Java

Exercise: Connecting the Flink Table API to Confluent Cloud

30 min
Wade Waldron

Principal Software Practice Lead

Connecting the Apache Flink Table API to Confluent Cloud

In this exercise, we will set up Confluent Cloud and establish a connection from a Flink application to the cluster. In future exercises, we'll expand on this framework to produce queries focused on an online marketplace.

Download the code

These exercises use the code provided in the following GitHub repo. To start, clone the repo.

Review the README.md file

Inside the cloned repository, locate the README and review it before continuing.

Set up your development environment

Dev Container

If you are using the Dev Container, you can skip this step.

You will need a suitable Java development environment, including:

  • Java 21
  • Maven
  • An IDE such as IntelliJ, Eclipse, or VS Code

To easily switch between Java versions, you can use SDKMAN.
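For example, with SDKMAN installed, you can list the available Java distributions and switch to a Java 21 build. The exact version identifier below is illustrative; pick one from the list output:

sdk list java                 # browse available Java distributions
sdk install java 21.0.2-tem   # install a Java 21 build (identifier is illustrative)
sdk use java 21.0.2-tem       # use that version in the current shell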

NOTE

This project already has many of the settings required to work with the Table API. For details on how to set up your own project, check out the Confluent Flink Table API documentation.

Create a Confluent Cloud account and log in

NOTE

If you already have a Confluent Cloud account, you can skip this step.

  • Go to the Confluent Cloud signup page and create a new account.
  • Watch your inbox for a confirmation email and follow the link to proceed.
  • You will be asked to create a cluster. Feel free to ignore this. We'll create one shortly.

Install the Confluent CLI

Dev Container

If you are using the Dev Container, you can skip this step.

The easiest way to create an environment for this course is using the Confluent CLI.

If you use Homebrew, the CLI can be installed using the following command:

brew install confluentinc/tap/cli

Otherwise, review the following instructions to install the CLI on your machine.
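Either way, you can verify that the CLI is installed and on your path by checking its version:

confluent version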

Install the confluent-flink-quickstart plugin

Dev Container

If you are using the Dev Container, you can skip this step.

The confluent-flink-quickstart plugin will create the required resources for the course.

Install the plugin using the following command:

confluent plugin install confluent-flink-quickstart
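The plugin registers a new quickstart subcommand under confluent flink. To confirm it installed correctly, you can ask for its help text:

confluent flink quickstart --help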

Log in with the CLI

You will need to log in to your Confluent Cloud account with the CLI. You can do that using the following command:

confluent login

If you are working in the Dev Container, it may not be able to open a browser to complete the login. In that case, you can use the following command instead:

confluent login --no-browser

Create an environment

For this course, you need the following resources:

  • A Confluent Cloud Environment named flink-table-api-java
  • A Kafka Cluster named marketplace
  • A Flink Compute Pool
  • A Flink API Key

These resources can all be created by running the Confluent CLI command below.

NOTE

Feel free to change the cloud and region to something more appropriate for your location. If you do, make sure you use the same settings throughout the course.

confluent flink quickstart \
    --name flink-table-api-java \
    --environment-name flink-table-api-java \
    --kafka-cluster-name marketplace \
    --max-cfu 10 \
    --region us-central1 \
    --cloud gcp \
    --table-api-client-config-file ./cloud.properties

When you execute the command, it will generate a cloud.properties file containing the parameters needed to connect your application to the cluster. Hold on to that file.

Once you have finished creating your environment, you might want to take a moment to explore it. In Confluent Cloud you should be able to see your environment named flink-table-api-java. It should contain a Kafka cluster named marketplace and a Flink compute pool named flink-table-api-java.

Stage the exercise

Stage the exercise by executing:

cd exercises
./exercise.sh stage 01

Import the project

Dev Container

If you are using the Dev Container, you can skip this step.

Import the project (Maven POM file) from the exercises folder into your IDE.

Copy configuration settings

Copy the cloud.properties file you created above into src/main/resources/cloud.properties.

A cloud-template.properties file has been provided for reference.
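For orientation, the generated file contains entries along these lines. The keys and values below are illustrative; treat your generated cloud.properties and the cloud-template.properties file as authoritative:

# Illustrative sketch only - see cloud-template.properties for the authoritative keys.
client.cloud=gcp
client.region=us-central1
client.flink-api-key=<your-api-key>
client.flink-api-secret=<your-api-secret>
client.organization-id=<your-organization-id>
client.environment-id=<your-environment-id>
client.compute-pool-id=<your-compute-pool-id>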

Build the application

Confluent Cloud includes a read-only set of Flink tables in a sandbox-link environment. These tables can be used for experimentation and testing. For a simple test of the connection parameters, we can ask the Table API to list those tables.

In the src/main/java/marketplace/Marketplace class, implement the main method as follows:

  • Use the ConfluentSettings class to load the configuration from the cloud.properties file:

    EnvironmentSettings settings = ConfluentSettings.fromResource("/YOUR.PROPERTIES.FILE");

    HINT

    You must prefix your properties file with the / as shown above.

  • Create a new table environment using the settings:

    TableEnvironment env = TableEnvironment.create(settings);
  • Set the catalog to examples and the database to marketplace.

    env.useCatalog(<Catalog Name>);
    env.useDatabase(<Database Name>);
  • Use env.listTables() to produce a list of tables in the database and print the results. A complete sketch of the finished method follows below.
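Putting these steps together, here is a minimal sketch of what the finished main method might look like, assuming the properties file was copied to src/main/resources/cloud.properties:

package marketplace;

import io.confluent.flink.plugin.ConfluentSettings;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class Marketplace {
    public static void main(String[] args) {
        // Load the connection settings from the classpath.
        // The leading slash is required, as noted in the hint above.
        EnvironmentSettings settings = ConfluentSettings.fromResource("/cloud.properties");

        // Create a table environment that talks to Confluent Cloud.
        TableEnvironment env = TableEnvironment.create(settings);

        // Point at the read-only example tables.
        env.useCatalog("examples");
        env.useDatabase("marketplace");

        // Print each table in the current database.
        for (String table : env.listTables()) {
            System.out.println(table);
        }
    }
}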

Run the application

Finally, we'll run the application to verify it works as expected.

  • In a terminal, execute the application by running the commands:

    mvn clean package
    java -jar target/flink-table-api-marketplace-0.1.jar
  • Assuming you have done everything correctly, you should see the following tables printed:

    • clicks
    • customers
    • orders
    • products

Finish

This brings us to the end of this exercise.

Do you have questions or comments? Join us in the #confluent-developer community Slack channel to engage in discussions with the creators of this content.

Use the promo codes FLINKTABLEAPIJAVA & CONFLUENTDEV1 to get $25 of free Confluent Cloud usage and skip credit card entry.
