Wade Waldron

Staff Software Practice Lead

Apache Flink® Table API: Processing Data Streams in Java

Apache Flink is a powerful stream-processing engine designed for modern software needs. However, the DataStream API can be daunting for new users. Meanwhile, the SQL API provides a high-level relational model that is easier to use but is restricted to predefined queries. This can be limiting for those used to building complex applications in Java or Python. This is where the Flink Table API comes in. It provides a middle ground between the DataStream API and the SQL API, giving you the power to programmatically create queries while offering an easy-to-use syntax similar to SQL.

An interesting aspect of the Table API is that, while it is often presented as a layer between SQL and the DataStream API, on Confluent Cloud it actually builds on top of SQL. Queries written with the Table API are translated into SQL queries before being executed by Confluent Cloud. However, even though it ultimately produces SQL, writing the code in a language such as Java or Python allows for more complex query generation. We don’t have to limit ourselves to predefined queries but can define them programmatically. This provides flexibility that you won’t get using the SQL API alone.
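
For example, a minimal Table API query in Java might look like the sketch below. It assumes a table named "orders" is already registered in the environment, and it uses generic environment settings; on Confluent Cloud the TableEnvironment would instead be created from Confluent-provided configuration.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import static org.apache.flink.table.api.Expressions.$;

public class OrdersOver100 {
    public static void main(String[] args) {
        // Create a streaming TableEnvironment using generic settings.
        // On Confluent Cloud this would come from Confluent-provided configuration.
        TableEnvironment env = TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Programmatically build a query against a hypothetical "orders" table.
        Table largeOrders = env.from("orders")
                .filter($("amount").isGreater(100))
                .select($("customer_id"), $("amount"));

        // Print the query plan to see how the call chain will be executed.
        System.out.println(largeOrders.explain());
    }
}

Even though the call chain reads much like SQL, every step returns a Java object, so queries can be composed, reused, and generated dynamically in code.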

This course will introduce students to the Apache Flink Table API through hands-on exercises. Students will use Java to build Flink queries that are executed by the Flink SQL engine, and they will inspect the SQL generated from the Table API to understand what is happening behind the scenes. All of this is done on Confluent Cloud, a production-grade data streaming platform.

Intended audience

  • Software Developers with experience in Java and SQL.
    • While this is not a course on SQL, prior knowledge of SQL will help in understanding the concepts covered.

Prerequisites

  • Required knowledge:
    • Java
    • Enough SQL knowledge to understand basic relational database concepts such as select statements, inserts, joins, etc.
  • Required setup:
    • Local Development:
      • A machine capable of running Bash commands.
      • A Java 21 development environment.
      • Access to Confluent Cloud.
    • Gitpod Development:
      • A browser compatible with Gitpod (not blocked by a firewall).
      • Access to Confluent Cloud.

Staff

Wade Waldron (Course Author)

Wade has been a Software Developer since 2005. He has worked on video games, backend microservices, ETL pipelines, IoT systems, and more. He is an advocate for Test-Driven Development, Domain-Driven Design, Microservice Architecture, and Event-Driven Systems. Today, Wade works as a Staff Software Practice Lead at Confluent, showing people how to build modern data streaming applications.

Use the promo codes FLINKTABLEAPIJAVA & CONFLUENTDEV1 to get $25 of free Confluent Cloud usage and skip credit card entry.
