Among the improvements are two featured FLIPs that will simplify your interactions with Flink tables:
Expected to be the last minor release before Flink 2.0, version 1.20 also includes several deprecations as the community makes its final preparations for 2.0:
Get all of the details in the release announcement.
Data Streaming Resources:
A Droplet From Stack Overflow:
Unknown magic byte!
If you've ever seen this dreaded error when trying to consume from a Kafka topic, the solution, as Robin Moffatt suggests in today's answer, lies in making sure the data was produced with the Schema Registry serializer.
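For context, the Confluent wire format prepends a single zero "magic byte" and a four-byte, big-endian schema ID to every serialized message; when a Schema Registry-aware deserializer finds anything else in the first byte, it raises exactly this error. Here's a minimal sketch of that framing (the helper names are illustrative, not a Confluent API):

```python
import struct

MAGIC_BYTE = 0  # Confluent wire format: 1 magic byte + 4-byte schema ID + payload


def frame(schema_id: int, payload: bytes) -> bytes:
    """Prepend the Schema Registry wire-format header to a serialized payload."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + payload


def parse(message: bytes) -> tuple[int, bytes]:
    """Mimic the first thing a deserializer does: check the magic byte."""
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != MAGIC_BYTE:
        # This is the failure you hit when the producer didn't use the
        # Schema Registry serializer (e.g. it sent plain JSON).
        raise ValueError("Unknown magic byte!")
    return schema_id, message[5:]
```

So a message produced without the serializer, say raw JSON starting with `{`, fails the magic-byte check immediately on the consumer side.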
Got your own favorite Stack Overflow answer related to Flink or Kafka? Send it in to devx_newsletter@confluent.io!
Terminal Tip of the Week:
It can be hard to remember operator precedence when it comes to Flink SQL syntax. Say you had some sort of query involving basic math:
SELECT user FROM ROW WHERE user_id = 5 * 3 + 2
(Why you wouldn't just write SELECT user FROM ROW WHERE user_id = 17 is beyond me, but this is just for the sake of example.)
Now, in order to make this query clearer for other SQL wranglers to read, you could add parentheses to emphasize that the multiplication operation will be executed first:
SELECT user FROM ROW WHERE user_id = (5 * 3) + 2
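Either way the predicate compares against 17, because multiplication binds more tightly than addition, the same arithmetic precedence as standard SQL. A quick way to convince yourself, using SQLite as a stand-in for the precedence rules (illustrative only, not Flink SQL itself):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Multiplication binds tighter than addition, so both forms evaluate the same.
(plain,) = conn.execute("SELECT 5 * 3 + 2").fetchone()
(parenthesized,) = conn.execute("SELECT (5 * 3) + 2").fetchone()
print(plain, parenthesized)  # 17 17
```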
Read more about this in the Confluent docs.
Links From Around the Web:
Upcoming Events: Hybrid:
In-person:
By the way…
We hope you enjoyed our curated assortment of resources!
If you'd like to provide feedback, suggest ideas for content you'd like to see, or you want to submit your own resource for consideration, email us at devx_newsletter@confluent.io!
If you'd like to view previous editions of the newsletter, visit our archive.
If you're viewing this newsletter online, know that we appreciate your readership and that you can get this newsletter delivered directly to your inbox by filling out the sign-up form on the left-hand side.
P.S. If you want to learn more about Kafka, Flink, or Confluent Cloud, visit our developer site at Confluent Developer.
We will only share developer content and updates, including notifications when new content is added. We will never send you sales emails. 🙂 By subscribing, you understand we will process your personal information in accordance with our Privacy Statement.