CLI
Using Apache Kafka and Confluent CLIs to produce and consume events
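As a quick taste of what this category covers, here is a minimal produce-and-consume round trip; the topic name orders and the local broker address are assumptions for illustration:

```
# Produce events with the Apache Kafka console tool: each line typed on
# stdin becomes one record (script names end in .sh in the Apache download)
kafka-console-producer --bootstrap-server localhost:9092 --topic orders

# In another terminal, read the events back from the start of the topic
kafka-console-consumer --bootstrap-server localhost:9092 --topic orders --from-beginning

# A rough equivalent with the Confluent CLI, after running confluent login
# and selecting a Confluent Cloud cluster
confluent kafka topic produce orders
confluent kafka topic consume orders --from-beginning
```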
Learn stream processing the simple way. Use this cookbook of recipes to easily get started at any level.
Building event-driven applications with best practices like callbacks and exception handling
Working with Apache Kafka topics and related configurations (see the first sketch after this list)
Building event streaming applications with Flink SQL
Deciding where to send events, splitting, merging, and filtering streams
Working with streams, tables, data formats, and other event-processing operations
Implementing stateful operations on events
Combining data from two or more sources based on common keys (see the ksqlDB sketch after this list)
Putting the results of stateful operations into time-based groups (also sketched below)
Using connectors to read data into Kafka topics and to write data back out to external systems (see the REST API sketch after this list)
Reading fields within a record, and working with JSON
Implementing stateless operations on events
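For the topic-management entry above, a minimal sketch using the command-line tools that ship with Apache Kafka; the topic name, partition count, and retention value are illustrative assumptions:

```
# Create a topic with explicit partition and replication settings
kafka-topics --bootstrap-server localhost:9092 --create --topic orders \
  --partitions 6 --replication-factor 3

# Inspect the partition layout and current leadership
kafka-topics --bootstrap-server localhost:9092 --describe --topic orders

# Alter a per-topic configuration, here keeping data for one day (86400000 ms)
kafka-configs --bootstrap-server localhost:9092 --alter \
  --entity-type topics --entity-name orders \
  --add-config retention.ms=86400000
```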
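For the joining and windowing entries, a hedged ksqlDB sketch you could run in the ksql CLI or the Confluent Cloud SQL editor; the orders stream, the customers table (assumed to be keyed by customer_id), their columns, and the one-minute window size are all assumptions:

```
-- Enrich each order with customer details via a stream-table join
CREATE STREAM enriched_orders AS
  SELECT o.order_id, o.item, o.price, c.name, c.region
  FROM orders o
  JOIN customers c ON o.customer_id = c.customer_id
  EMIT CHANGES;

-- Count orders per item in one-minute tumbling windows, keeping only larger orders
CREATE TABLE orders_per_minute AS
  SELECT item, COUNT(*) AS order_count
  FROM orders
  WINDOW TUMBLING (SIZE 1 MINUTE)
  WHERE price > 10
  GROUP BY item
  EMIT CHANGES;
```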
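And for the connectors entry, one common way to submit a connector to a self-managed Kafka Connect worker is its REST API; the JDBC source class and every setting below are illustrative assumptions:

```
# Register a JDBC source connector that copies new database rows into Kafka
curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors \
  -d '{
        "name": "orders-jdbc-source",
        "config": {
          "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
          "connection.url": "jdbc:postgresql://db:5432/shop",
          "mode": "incrementing",
          "incrementing.column.name": "id",
          "topic.prefix": "jdbc-"
        }
      }'

# Confirm the connector and its tasks are RUNNING
curl http://localhost:8083/connectors/orders-jdbc-source/status
```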
Identify patterns and unusual activity to detect fraud and monitor network health
Combine data sources for a unified 360-degree view of the customer across channels
Protect critical systems and sensitive information by detecting threats in real time
Train machine learning models to anticipate future outcomes
Extract real-time insights by performing analytics on streaming data as it’s generated
Optimize supply chain operations and anticipate failures before they happen with predictive maintenance
Build real-time alerts and notifications to power digital customer experiences
Modernize legacy technologies and rationalize infrastructure footprint with modern systems
To try any of these out, make sure you first sign up for Confluent Cloud and provision a ksqlDB application.