Current 2024 is coming!

September 5, 2024

Current 2024 is just around the corner! It’s in Austin, Texas, USA this season and will be on Sep 17-18. You’ll hear from data streaming professionals like Senior Product Manager Marta Paes, who will be giving an overview of the Change Data Capture ecosystem, and Olena Kutsenko, Sr. Developer Advocate, who will be discussing sentiment analysis with Apache Flink®. Sandon Jacobs, Senior Developer Advocate, will be teaching the fundamentals of Apache Kafka® in his informative talk for Kafka beginners, and Yuan Mei, Director of Engineering, will tell us about disaggregated state in Flink 2.0. There will also be the much-anticipated Current Developer Keynote on Day 2 with the theme of the Rise of the Data Streaming Engineer.

We don’t have room here to list the entire agenda, but we wish we did! We hope to see you there! If you have questions about the conference, please visit the Current FAQ page for answers, or to find out how to contact us.

We’ll let you know when the talks get posted online. In the meantime, here is the latest roundup of streaming resources…

Data Streaming Resources:

  • Wait, what? Shareable state stores in Kafka? Matthias J. Sax breaks down the true meaning of KIP-813 in his latest blog post.

  • What is watermark alignment? Dan Weston introduces us to the concept and explains how it can help us manage Flink integrations in this new video.

  • What are your favorite Kafka Connectors? Sheryl Li investigates using a data-driven approach in this blog post. Here’s a sample graph from the post:

[Graph from the post: Confluent Cloud connectors ranked by throughput, 2023-2024]

  • Learn how to produce messages to Confluent Cloud, and then consume them using the Spring Boot client, in part 1 of Sandon Jacobs’s blog series on the topic.

  • Learn the essentials: what’s a data streaming platform? The answer’s in Adam Bellemare’s latest data streaming video.

A Droplet From Stack Overflow:

Data transformation operations can be written using the PyFlink DataStream API or the more modern PyFlink Table API.

Learn about the performance difference between running the same transformations with the PyFlink DataStream API vs. the PyFlink Table API in this post, answered by David Anderson.

Got your own favorite Stack Overflow answer related to Flink or Kafka? Send it in to devx_newsletter@confluent.io!

Terminal Tip of the Week:

Can you add Kafka topic-level metadata to a Flink dynamic table on Confluent Cloud? Yes, absolutely!

When you create a Flink dynamic table on Confluent Cloud, the DESCRIBE command shows the schema in a tabular form:

CREATE TABLE orders (
  `user` BIGINT NOT NULL,
  product STRING,
  amount INT,
  ts TIMESTAMP(3),
  PRIMARY KEY(`user`) NOT ENFORCED
);

Let’s run the DESCRIBE command:

DESCRIBE orders;

You get the following output:

+-------------+--------------+----------+-------------+
| Column Name |  Data Type   | Nullable |   Extras    |
+-------------+--------------+----------+-------------+
| user        | BIGINT       | NOT NULL | PRIMARY KEY |
| product     | STRING       | NULL     |             |
| amount      | INT          | NULL     |             |
| ts          | TIMESTAMP(3) | NULL     |             |
+-------------+--------------+----------+-------------+

You can use ALTER TABLE to add metadata columns to your table schema:

ALTER TABLE `orders` ADD (
   `partition` BIGINT METADATA VIRTUAL);

This is a handy way to select topic-level metadata from the Flink dynamic table.

Now, if you run DESCRIBE again:

DESCRIBE `orders`;

You get the result with the topic-level partition metadata included:

+-------------+--------------+----------+------------------+
| Column Name |  Data Type   | Nullable |      Extras      |
+-------------+--------------+----------+------------------+
| user        | BIGINT       | NOT NULL | PRIMARY KEY      |
| product     | STRING       | NULL     |                  |
| amount      | INT          | NULL     |                  |
| ts          | TIMESTAMP(3) | NULL     |                  |
| partition   | BIGINT       | NULL     | METADATA VIRTUAL |
+-------------+--------------+----------+------------------+

Notice that the partition metadata column is added as a VIRTUAL column.

Metadata fields are either read-only or readable/writable. Read-only columns must be declared VIRTUAL to exclude them from INSERT INTO operations.
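
For example, because `partition` is declared VIRTUAL, it is excluded from the table’s insert schema, so an INSERT INTO only needs to supply the four physical columns. A minimal sketch with made-up values:

INSERT INTO `orders`
VALUES (1001, 'widget', 3, TIMESTAMP '2024-09-05 10:15:00.000');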

Metadata columns are not registered in Schema Registry.

So, in this case, the partition column is a read-only column that provides topic-level metadata to any Flink transformation operator, which helps with data consistency checks.
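
As a quick sketch (the grouping shown is just one hypothetical use), you can read the metadata column like any other column, for example to check how records are spread across the topic’s partitions:

SELECT `partition`, COUNT(*) AS record_count
FROM `orders`
GROUP BY `partition`;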

Read more about this in the Confluent docs.

Upcoming Events:

In-person:

  • Copenhagen, Denmark (Sep 5): Get insights into creating real-time customer experiences with Kafka, and learn about collecting Kafka metrics with OpenTelemetry.

  • Budapest, Hungary (Sep 11): Dive deep into interactive queries with Kafka Streams, then learn how a bank implemented a robust data platform with Kafka.

  • Paris, France (Sep 11): Go back to school and learn about Kafka observability at this meetup!

  • Austin, TX, USA (Sep 11): Join Sandon Jacobs, Developer Advocate, for a special version of the monthly Python meetup, Kafka-style!

  • Austin, TX, USA (Sep 12): Learn Apache Kafka with Danica Fine, Staff Developer Advocate, and Apache Pinot with Barkha Herman, Developer Advocate!

  • São Paulo, Brazil (Sep 25): Apache Kafka and Apache Pinot join forces for a powerful and technically packed meetup!

  • Mons, Belgium (Sep 26): Join Gilles Philippart, Software Practice Lead, for a special intro-to-Kafka workshop!

By the way…

We hope you enjoyed our curated assortment of resources!

If you'd like to provide feedback, suggest ideas for content you'd like to see, or you want to submit your own resource for consideration, email us at devx_newsletter@confluent.io!

If you'd like to view previous editions of the newsletter, visit our archive.

If you're viewing this newsletter online, know that we appreciate your readership and that you can get this newsletter delivered directly to your inbox by filling out the sign-up form on the left-hand side.

P.S. If you want to learn more about Kafka, Flink, or Confluent Cloud, visit our developer site at Confluent Developer.
