Confluent Developer LIVE is a series of regular events in which professional trainers guide you, in real time, through exercises featured in Confluent Developer courses. To top it all off, each session is a Confluent Community Event, which means it is free, purely technical, and educational.
Every Confluent Developer LIVE event uses Confluent Cloud. However, you will not be charged for your usage if you follow the instructions below.
Once the session is complete, please destroy all Confluent Cloud resources so that you are not billed for unused ones. Toward the end of each event, we will walk through the steps to confirm that everything created during your session has been removed.
Prepare by logging in to, or signing up for, Confluent Cloud before your event!
Below are all of the interactive exercises you'll follow along with during Confluent Developer LIVE events. Every exercise requires Confluent Cloud, though you will not be charged for your usage (provided you follow the important note above).
Start Apache Kafka® the easiest, fastest way possible using Confluent Cloud in this hello-world-style, absolute beginner's quick start tutorial.
For this exercise, you will use Confluent Cloud to provide a managed Kafka service, connectors, and stream processing.
The Basic Operations exercise demonstrates how to use Kafka Streams stateless operations such as "filter" and "mapValues".
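Kafka Streams is a Java DSL, but the stateless semantics of filter and mapValues are easy to see in plain Python before the exercise. The sketch below is an illustration only, not the Kafka Streams API; the (key, value) record shape and the "electronics" category are assumptions made for the example.

```python
# Plain-Python sketch of Kafka Streams' stateless operators.
# filter keeps records whose value satisfies a predicate;
# mapValues transforms the value while leaving the key untouched.

orders = [
    ("order-1", {"category": "electronics", "price": 299.0}),
    ("order-2", {"category": "books", "price": 12.0}),
    ("order-3", {"category": "electronics", "price": 59.0}),
]

# filter: keep only electronics orders
electronics = [(k, v) for k, v in orders if v["category"] == "electronics"]

# mapValues: extract the price, keys are preserved
prices = [(k, v["price"]) for k, v in electronics]

print(prices)  # [('order-1', 299.0), ('order-3', 59.0)]
```

Because both operators act on one record at a time with no state, they can be chained freely, which is exactly what the Kafka Streams DSL does under the hood.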
In this exercise, you will try out the different user interfaces to ksqlDB. The first step is to provision a ksqlDB application on Confluent Cloud.
In this exercise, you will build and observe the behavior of a stream-table join in ksqlDB.
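Before running the exercise in ksqlDB, it can help to see the join's semantics in miniature. In this hedged plain-Python sketch, the table is a keyed lookup holding the latest value per key, and each stream event is enriched by that lookup; the names and data are illustrative assumptions, not the exercise's actual schema.

```python
# Plain-Python sketch of a ksqlDB stream-table (inner) join.

# TABLE: latest state per key, modeled as a dict
users_table = {"u1": "Alice", "u2": "Bob"}

# STREAM: an unbounded sequence of events; a list stands in here
clicks_stream = [("u1", "/home"), ("u2", "/cart"), ("u3", "/home")]

# Inner join: emit only events whose key exists in the table,
# enriched with the table's current value for that key
joined = [
    (key, page, users_table[key])
    for key, page in clicks_stream
    if key in users_table
]

print(joined)  # [('u1', '/home', 'Alice'), ('u2', '/cart', 'Bob')]
```

Note that the event for "u3" is dropped, mirroring an inner join's behavior when the table has no row for a stream event's key.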
This exercise sets up a new Spring Boot project with Spring for Apache Kafka and connects it with Confluent Cloud.
The example streaming data pipeline that we’re going to build is driven by a stream of events carrying product ratings that users submit to our fictional company’s website.
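The heart of such a pipeline is grouping rating events by product and maintaining an aggregate. The following is a minimal plain-Python sketch of that step; the field names (product_id, stars) are assumptions made for illustration, and the live exercise performs this with managed connectors and stream processing in Confluent Cloud rather than local code.

```python
# Plain-Python sketch: fold a stream of rating events into a
# per-product running average.
from collections import defaultdict

ratings = [
    {"product_id": "p1", "stars": 4},
    {"product_id": "p2", "stars": 2},
    {"product_id": "p1", "stars": 5},
]

totals = defaultdict(lambda: [0, 0])  # product_id -> [sum, count]
for event in ratings:
    totals[event["product_id"]][0] += event["stars"]
    totals[event["product_id"]][1] += 1

averages = {pid: s / n for pid, (s, n) in totals.items()}
print(averages)  # {'p1': 4.5, 'p2': 2.0}
```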
Simulate event sourcing by completing this exercise, which uses ksqlDB on Confluent Cloud. You'll begin by creating a stream of e-commerce events backed by an Apache Kafka topic.
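The core idea of event sourcing is that the event log is the source of truth and current state is derived by replaying it in order. Here is a hedged plain-Python sketch of that replay; the event names and the shopping-cart model are illustrative assumptions, while in the exercise the log is the Kafka topic backing your ksqlDB stream.

```python
# Plain-Python sketch of event sourcing: derive current state
# by folding over an ordered log of immutable events.

events = [
    {"type": "item_added", "item": "keyboard"},
    {"type": "item_added", "item": "mouse"},
    {"type": "item_removed", "item": "keyboard"},
]

def replay(events):
    """Fold the event log into the current cart contents."""
    cart = []
    for e in events:
        if e["type"] == "item_added":
            cart.append(e["item"])
        elif e["type"] == "item_removed":
            cart.remove(e["item"])
    return cart

print(replay(events))  # ['mouse']
```

Because state is a pure function of the log, replaying the same events always reproduces the same state, which is what makes an append-only Kafka topic a natural fit for this pattern.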
Whether you’re just getting started or have already built stream processing applications, you will find actionable insights that will enable you to further derive business value from your data systems.