Learn stream processing the simple way. Use this cookbook of recipes to easily get started at any level.
Using Apache Kafka and Confluent CLIs to produce and consume events
Building event-driven applications with best practices like callbacks and exception handling
Working with streams, tables, data formats, and other event-processing operations
Using connectors to stream data into Kafka topics and to write data out to external systems
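As a taste of the stream and table operations listed above, here is a minimal ksqlDB sketch. The stream name `orders`, its columns, and the backing topic are illustrative assumptions, not part of this page:

```sql
-- Hypothetical example: declare a stream over a Kafka topic
CREATE STREAM orders (order_id VARCHAR KEY, amount DOUBLE, status VARCHAR)
  WITH (KAFKA_TOPIC = 'orders', VALUE_FORMAT = 'JSON', PARTITIONS = 1);

-- Derive a continuously updated table by aggregating the stream
CREATE TABLE order_totals AS
  SELECT order_id, SUM(amount) AS total
  FROM orders
  GROUP BY order_id
  EMIT CHANGES;
```

Streams model unbounded event sequences, while tables hold the latest aggregated state per key; the recipes build on both.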
Combine data sources for a unified 360-degree view of the customer across channels
Protect critical systems and sensitive information by detecting threats in real time
Train machine-learning models to anticipate future outcomes
Extract real-time insights by performing analytics on streaming data as it’s generated
Build real-time alerts and notifications to power digital customer experiences
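A use case like real-time alerting can be expressed as a windowed streaming query. This is a hedged sketch only; the `login_events` stream, its columns, and the thresholds are invented for illustration:

```sql
-- Hypothetical alerting query: flag users with more than 3 failed
-- logins inside any 5-minute tumbling window
CREATE TABLE suspicious_logins AS
  SELECT user_id, COUNT(*) AS failures
  FROM login_events
  WINDOW TUMBLING (SIZE 5 MINUTES)
  WHERE success = false
  GROUP BY user_id
  HAVING COUNT(*) > 3
  EMIT CHANGES;
```

A downstream consumer or connector could read the resulting table's changelog topic to trigger notifications.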
To try any of these recipes, first sign up for Confluent Cloud and provision a ksqlDB application.