Enhance your career, get your certificate as a Data Streaming Engineer | Get your Certificate
The biggest data streaming event of the year is back! Register now for Current London 2026 and see 100+ eminent speakers deliver 80+ sessions on the cutting edge of data and AI technologies!
The grid is set for London.
In the world of data, "lights out" isn't just a start time; it's the standard for latency. We're bringing Current26 to the ExCeL London on May 19–20, and the stakes have never been higher.
Just like an F1 race, successful data streaming requires more than just a fast engine (Kafka). It needs the right strategy (Governance), real-time adjustments (Flink), and a world-class pit crew (you).
Don't get stuck in the pits while the industry moves to real-time.
Secure your pass: https://bit.ly/4tQbXdD
Register Now - Current London 2026
We’re excited to announce Confluent Platform 8.2!
The latest Confluent Platform release, built on Apache Kafka® 4.2, extends and simplifies what you can do with Apache Kafka® and Apache Flink®, whether that's handling task queues natively, processing streams with SQL, or managing cluster migration, upgrades, or disaster recovery without the usual operational pain.
The latest release of Confluent Platform enables organizations to:
Available with Confluent Private Cloud
Confluent Private Cloud (CPC) Gateway allows you to decouple applications from your infrastructure. The latest release of CPC Gateway 1.2 includes SCRAM auth swapping and client fencing, making it easier to manage migrations without touching application code.
Data Streaming Resources
• Confluent Cloud for Apache Flink now supports configurable late data handling rules. Use the new late-handling.mode table property to drop or pass through late-arriving events, and/or use the $late system table to capture these late events for reconciliation.
What's New?
In streaming systems, "late data" refers to events that arrive after the system's watermark has advanced past their timestamp. Until this change, late events were mostly silently dropped.
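To make "late" concrete, here is a minimal Python sketch of watermark-based lateness. This illustrates the general streaming concept only, not Flink's implementation; the `allowed_delay` parameter and function names are hypothetical.

```python
# Toy watermark model: the watermark trails the maximum event time seen
# so far by an allowed delay. An event is "late" if its timestamp is
# already below the current watermark when it arrives.

def split_late_events(events, allowed_delay):
    watermark = float("-inf")
    on_time, late = [], []
    for ts, payload in events:
        if ts < watermark:
            late.append((ts, payload))    # arrived after the watermark passed it
        else:
            on_time.append((ts, payload))
            watermark = max(watermark, ts - allowed_delay)
    return on_time, late

# Event "c" (timestamp 5) arrives after the watermark has advanced to 10,
# so it is classified as late.
events = [(10, "a"), (12, "b"), (5, "c"), (13, "d"), (11, "e")]
on_time, late = split_late_events(events, allowed_delay=2)
```

Flink's actual watermark generation is configurable and distributed, but the classification idea is the same: once the watermark moves past an event's timestamp, that event is late.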
The Custom Late Data Handling feature gives users full control over those records: set the late-handling.mode table property to drop late events at the source or pass them through, and use the $late system table to inspect or reconcile them.
Why is this important?
Getting Started
Query the late records at any time:
SELECT * FROM `my_table$late`;
Enable filtering on an existing table to drop them at the source:
ALTER TABLE my_table SET ('late-handling.mode' = 'filter');
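As a rough mental model of the two behaviors (plain Python, not the Flink implementation; mode names and the capture flag are illustrative), the modes differ only in what happens to a record once it is flagged late:

```python
# Toy model of the late-handling behaviors described above:
# "filter" drops late records at the source, while "pass-through"
# lets them reach the main table. Either way, captured late records
# land in a side bucket, analogous to the $late system table.

def handle_late(records, is_late, mode, capture_late=True):
    main, late_bucket = [], []
    for rec in records:
        if is_late(rec):
            if capture_late:
                late_bucket.append(rec)  # visible via a $late-style side table
            if mode == "pass-through":
                main.append(rec)         # late rows still reach the main table
            # mode == "filter": late rows are dropped from the main table
        else:
            main.append(rec)
    return main, late_bucket

recs = [("a", False), ("b", True), ("c", False)]  # second field: is-late flag
filtered_main, side = handle_late(recs, lambda r: r[1], mode="filter")
passed_main, side2 = handle_late(recs, lambda r: r[1], mode="pass-through")
```

In either mode, the side bucket is what makes reprocessing possible: late rows remain queryable instead of vanishing.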
Check out the docs for full configuration options, reprocessing patterns, and monitoring guidance.
• Read this fantastic blog from the Notion engineering team on how they implemented data residency for their customers using a modular, agile multi-region infrastructure, with Apache Kafka as the foundation for event logging.
Links From Around the Web:
In-Person Meetups
We hope you enjoyed our curated assortment of resources! If you’d like to provide feedback, suggest ideas for content you’d like to see, or you want to submit your own resource for consideration, send us an email at devx_newsletter@confluent.io!
If you’d like to view previous editions of the newsletter, visit our archive.
If you’re viewing this newsletter online, know that we appreciate your readership and that you can get this newsletter delivered directly to your inbox by filling out the signup form on the left-hand side.
P.S. If you want to learn more about Kafka, Flink, or Confluent Cloud, visit our developer site at Confluent Developer.
We will only share developer content and updates, including notifications when new content is added. We will never send you sales emails. 🙂 By subscribing, you understand we will process your personal information in accordance with our Privacy Statement.