Become an expert on the Apache Kafka ecosystem! View video courses to learn Kafka basics, advanced concepts, setup, use cases, and everything in between.
Apache Kafka is a distributed event streaming system used for real-time data pipelines, integration, and analytics. Learn how Kafka works, how to use it, and how to get started.
What is Kafka Streams, and how does it work? Learn about the Streams API, architecture, stream topologies, and how to get started by completing this introductory course.
Learn about Kafka Connect, the Connect API, and how connectors work for simple streaming data integration between systems. Find Kafka Connect tutorials, code examples, and demos.
Get the most out of Apache Kafka and Spring with Confluent Cloud. This hands-on course will show you how to build event-driven applications with Spring Boot and Kafka Streams.
In this tutorial, learn about Apache Kafka storage and internal architecture, with an overview of Kafka producers, consumers, topics, partitions, clients, and more.
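The key-to-partition mapping at the heart of Kafka topics can be sketched in a few lines. This is a toy illustration only: Kafka's default partitioner actually uses murmur2 hashing, and `zlib.crc32` here is just a deterministic stand-in.

```python
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Toy stand-in for Kafka's default partitioner: hash the record key,
    then take it modulo the partition count. Kafka itself uses murmur2;
    zlib.crc32 is used here only to keep the demo deterministic."""
    return zlib.crc32(key) % num_partitions

# Records with the same key always land in the same partition,
# which is what gives Kafka its per-key ordering guarantee.
assert partition_for(b"user-42", 6) == partition_for(b"user-42", 6)
```

Because the mapping depends only on the key and the partition count, any producer computes the same placement independently, with no coordination.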
Any production-grade Kafka cluster must be properly secured. In this course, learn about Kafka authentication, authorization, encryption, audit logs, and more.
Data mesh is a framework for decentralized, domain-driven data architectures that treat data as a product, with self-service access and strong governance models. Learn the benefits of data mesh and how it works.
ksqlDB is a streaming database for building stream processing applications with Apache Kafka. This course covers its architecture, how ksqlDB works, and typical use cases, with examples.
This advanced ksqlDB course will show you how to read, write, process, and transform data using common queries and functions in ksqlDB.
Build a scalable, streaming data pipeline in under 20 minutes using Kafka and Confluent. Operationalize data in motion using real-time event streams and change data capture.
What is an event, and how does event streaming work? Get a full introduction to event streams and learn how to design your event-driven architecture with best practices.
Event sourcing allows applications to leverage real-time and historical data as a sequence of events. Learn about event sourcing patterns, CQRS, and how Kafka is used as a backbone for event streams.
Schema Registry stores a versioned history of all schemas, provides multiple compatibility settings, and allows schemas to evolve according to the configured compatibility setting.
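To make "compatibility settings" concrete, here is a toy backward-compatibility check. It is not Schema Registry's actual algorithm (which operates on full Avro, JSON Schema, or Protobuf definitions); it only illustrates the rule that a new reader schema stays backward compatible when every field it adds has a default.

```python
def is_backward_compatible(old_fields: dict, new_fields: dict) -> bool:
    """Toy check: every field in the new schema must either already exist
    in the old schema or declare a default, so a consumer using the new
    schema can still read data written with the old one.
    Fields map name -> {"type": ..., "default": ... (optional)}."""
    for name, spec in new_fields.items():
        if name not in old_fields and "default" not in spec:
            return False
    return True

old = {"id": {"type": "int"}}
ok  = {"id": {"type": "int"}, "email": {"type": "string", "default": ""}}
bad = {"id": {"type": "int"}, "email": {"type": "string"}}

assert is_backward_compatible(old, ok)       # added field has a default
assert not is_backward_compatible(old, bad)  # new required field breaks old data
```

Under a `BACKWARD` setting, Schema Registry rejects the second change at registration time, before any producer can write data that existing consumers cannot read.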
Learn how to integrate Confluent Cloud with on-prem, public, and private cloud data streaming applications to meet connectivity, privacy, and security requirements.
From Apache Kafka security, authentication, and RBAC, to cloud data security and monitoring, learn how to use Confluent Cloud's security features to meet all your security and compliance needs.
In this course, we will look at some of the specific Stream Governance features in Confluent Cloud. You will have an opportunity to try out many of these features in the hands-on exercises.
This course aims to get developers started writing simple Python applications that stream events to and from a Kafka cluster.
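The produce/consume loop such an application is built around can be mimicked with an in-memory stand-in. This is a conceptual sketch only; a real application would use the confluent-kafka client against a running broker, with consumer groups and committed offsets.

```python
from collections import deque

class ToyTopic:
    """In-memory stand-in for a Kafka topic, illustrating the shape of a
    produce/consume loop. A real Python client (confluent-kafka) would
    send records to a broker and track offsets per consumer group."""
    def __init__(self):
        self._log = deque()

    def produce(self, key: str, value: str) -> None:
        self._log.append((key, value))  # append-only, like a Kafka log

    def poll(self):
        """Return the next record, or None when the topic is drained."""
        return self._log.popleft() if self._log else None

topic = ToyTopic()
topic.produce("user-1", "page_view")
topic.produce("user-2", "click")

# Consumer loop: poll until no records remain.
while (msg := topic.poll()) is not None:
    print(msg)
```

Records are read in the order they were appended, mirroring Kafka's within-partition ordering guarantee.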