Start Stream Processing! An Apache Flink® SQL Course

October 17, 2024

Newsletter from the Desk of Confluent Developer,

Today we interview David Anderson, Software Practice Lead at Confluent, about his new Apache Flink® SQL course:

What do you find to be the most interesting concept in the course?

I think the way that watermarks connect the passage of time to state management is really interesting. Whenever you have something that can run forever, this comes with the risk that the process will continuously build up internal state, and eventually crash because it's run out of room to store it all. Flink SQL has watermarks largely so that it can solve this problem in a clean, systematic way.
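To make this concrete, here is a minimal sketch of how a watermark is declared in a Flink SQL table definition. The table, columns, and connector here are hypothetical, chosen for illustration, not taken from the course:

```sql
CREATE TABLE orders (
  order_id STRING,
  amount DECIMAL(10, 2),
  order_time TIMESTAMP(3),
  -- Declare order_time as the event-time attribute. Watermarks trail
  -- 5 seconds behind the maximum observed order_time, telling the
  -- runtime how long to wait for out-of-order rows before it can
  -- finalize time-based results and discard the associated state.
  WATERMARK FOR order_time AS order_time - INTERVAL '5' SECOND
) WITH (
  'connector' = 'kafka'  -- remaining connector options elided
);
```

Without that watermark declaration, time-based operations would have no way of knowing when it is safe to stop waiting for late data, and their state could grow without bound.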

What will learners build/know by the end?

My goal is to highlight the areas where Flink SQL adds something to vanilla SQL because it's doing stream processing, rather than batch. And to showcase the techniques that get used over and over in real-world applications, like windowing, enrichment, and pattern matching.
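As a taste of one of those techniques, windowing in Flink SQL is typically expressed with a windowing table-valued function. A sketch, assuming a hypothetical orders table with an event-time column order_time and watermarks defined on it:

```sql
-- Hourly revenue per tumbling window; each window closes (and its
-- state is freed) once the watermark passes window_end.
SELECT window_start, window_end, SUM(amount) AS hourly_revenue
FROM TABLE(
  TUMBLE(TABLE orders, DESCRIPTOR(order_time), INTERVAL '1' HOUR))
GROUP BY window_start, window_end;
```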

What common stumbling blocks will students sidestep by taking the course?

There are two common stumbling blocks with Flink SQL, which are (1) troubleshooting problems with watermarks, and (2) understanding the implications of working with changelog streams. Both the lectures and hands-on exercises in this course are designed to teach enough about the internals of Flink's SQL runtime so that this will all make sense.
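To illustrate the changelog point: an unbounded, non-windowed aggregation can never emit a final answer, so Flink SQL produces an updating (changelog) stream instead. A sketch, with hypothetical table and column names:

```sql
-- Each incoming order revises its customer's running count, so the
-- result is a changelog stream of insert, retract, and update rows
-- rather than an append-only stream. Downstream sinks must be able
-- to handle these updates (e.g., via upsert semantics).
SELECT customer_id, COUNT(*) AS order_count
FROM orders
GROUP BY customer_id;
```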

What did you most enjoy about building it?

I've tried to up my game with this course, in terms of using more motion graphics in the videos in the service of better explaining some of the core concepts. For example, I came up with a new approach to visualizing how watermarks work that I'm really pleased with. I hope it helps some folks more easily understand what can be a very abstract concept.

Data Streaming Resources:

  • Level up your Flink SQL knowledge with this course from David Anderson (featured in the interview above)

  • Tim Berglund drops another lightboard video to teach us about a unique BYOC (Bring Your Own Cloud) technology! What Is WarpStream by Confluent?

  • David Anderson explains why Flink SQL needs watermarks in a new educational video

  • How do you test Kafka Streams windowed applications? Bill Bejeck breaks it down in the next blog post in his Mastering Stream Processing series

  • Blog alert: Sandon Jacobs teaches us how to use Spring with Confluent, in particular, Kafka Streams topologies

A Droplet From Stack Overflow:

A Flink SQL temporal join critically depends on accurate watermarks: Flink relies on them to know which rows can safely be dropped from the state it maintains. Learn more about how to run a Flink SQL temporal join effectively from David Anderson’s answer on Stack Overflow.
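In outline, a temporal join uses FOR SYSTEM_TIME AS OF against a versioned table. A sketch with hypothetical tables (the versioned side needs a primary key and a watermark, and both sides need event-time attributes):

```sql
-- Join each order to the exchange rate that was in effect at the
-- order's event time; the watermarks tell Flink when older versions
-- of currency_rates can safely be dropped from state.
SELECT o.order_id, o.amount * r.rate AS amount_usd
FROM orders AS o
  JOIN currency_rates FOR SYSTEM_TIME AS OF o.order_time AS r
    ON o.currency = r.currency;
```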

Got your own favorite Stack Overflow answer related to Flink or Kafka? Send it in to devx_newsletter@confluent.io!

Terminal Tip of the Week:

In a multi-developer environment, it can be difficult to keep track of all API keys in one place. Thankfully, the Confluent CLI has some nifty tools!

Use the following to list all API keys:

confluent api-key list

This command provides you with a list of all keys for the environment.

To list only the keys belonging to the current user, add the --current-user flag:

confluent api-key list --current-user

To filter further, add the --resource flag to list keys created for a specific resource:

confluent api-key list --resource cloud --current-user

Finally, add the --output json flag to get the output pretty-printed as JSON (human and yaml are the other output options)! Here’s what the output looks like:

[
  {
    "key": "<KEY>",
    "description": "Cloud Key",
    "owner_id": "OWNER_ID",
    "owner_email": "diptiman@diptiman.com",
    "resource_type": "cloud",
    "resource_id": "",
    "created": "2024-09-05T11:18:55Z"
  },
  {
    "key": "<KEY>",
    "description": "diptiman-flink-cloud-key",
    "owner_id": "OWNER_ID",
    "owner_email": "diptiman@diptiman.com",
    "resource_type": "cloud",
    "resource_id": "",
    "created": "2024-08-13T16:31:55Z"
  }
]

Links From Around the Web:

Julia Evans’s comic about vim sessions!

The story of water use through a scrollytale

Animated sport results show us how to visualize data

Upcoming Events:

You can view all events organized by Confluent here! In November, we will have in-person events in the San Francisco Bay Area, USA; Cologne, Germany; and Zurich, Switzerland!

Stay up to date with all Confluent-run meetup events. To make meetups appear in your personal calendar, copy the following link into your calendar platform: https://airtable.com/app8KVpxxlmhTbfcL/shr4RggrWm4l243fu (instructions for GCal, iCal, Outlook, etc.)

Join the Confluent Insights Hub!

Thank you for being an invaluable part of the Community—your contribution and involvement make Apache Kafka® a leading choice for data streaming. We’re excited to invite you to the Confluent Insights Hub, a tactical advisory board of developers and architects like you. Get exclusive access, make a direct impact on our product experience, and enjoy rewards for your insights—every study is incentivized! Share your expertise to shape the future of data streaming.

By the way…

We hope you enjoyed our curated assortment of resources! If you’d like to provide feedback, suggest ideas for content you’d like to see, or you want to submit your own resource for consideration, email us at devx_newsletter@confluent.io!

If you’d like to view previous editions of the newsletter, visit our archive.

If you’re viewing this newsletter online, know that we appreciate your readership and that you can get this newsletter delivered directly to your inbox by filling out the sign-up form on the left-hand side.

P.S. If you want to learn more about Kafka, Flink, or Confluent Cloud, visit our developer site at Confluent Developer.

Subscribe Now

We will only share developer content and updates, including notifications when new content is added. We will never send you sales emails. 🙂 By subscribing, you understand we will process your personal information in accordance with our Privacy Statement.
