Learn Apache Kafka® & Flink®

Step 1

Fundamentals

Apache Kafka® 101

Business events occur all the time, and Apache Kafka is the leading technology for storing and processing these events in real time.

In this series of courses, we will guide you through the data-in-motion universe, starting with what Apache Kafka is, the components it comprises, and how to use them effectively to build real-time, event streaming applications.
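
To make that concrete, here is a minimal sketch of producing a business event with the plain Java client; the broker address, topic name, and record contents are placeholders, not anything prescribed by the course.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class HelloKafka {
    public static void main(String[] args) {
        // Minimal producer configuration; the broker address is a placeholder.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Each business event becomes an immutable record appended to a topic.
            producer.send(new ProducerRecord<>("orders", "order-42", "{\"status\":\"created\"}"));
        }
    }
}
```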

Apache Flink® 101

Why is Flink useful, and how does it work? This course uses Flink SQL to introduce the core concepts of stream processing with Flink and shows how they fit together to solve real problems.

Kafka Streams 101

Creating and storing events is useful, but being able to react to those events and process them is what is truly transformative for how businesses operate today.

When you want full control over the application logic, Kafka Streams is your go-to stream processing framework.
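
As a rough illustration of that control, here is a minimal Kafka Streams topology in Java that reads a hypothetical orders topic, filters the events, and writes the matches to another topic; the topic names and filter logic are placeholders.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class OrderFilterApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-filter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Build a topology: read events, transform them, write the result back to Kafka.
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> orders = builder.stream("orders");
        orders.filter((key, value) -> value.contains("\"status\":\"created\""))
              .to("created-orders");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
    }
}
```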

If Python or .NET is your thing, we have dedicated courses for you: Kafka for Python Developers and Kafka for .NET Developers.

Schema Registry 101

In the event-driven world, schemas are essential as they can enforce the shape of the data and act as contracts between producers and consumers. The Schema Registry can manage those contracts and allow their evolution in a safe way. Let’s see how it works.
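
As a sketch of how such a contract is used in practice, the snippet below configures a Java producer with Confluent's Avro serializer so every record is checked against a registered schema; the schema, topic name, and URLs are illustrative assumptions.

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroProducerSketch {
    // The schema is the "contract" both producers and consumers agree on.
    private static final String ORDER_SCHEMA =
        "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
      + "{\"name\":\"id\",\"type\":\"string\"},"
      + "{\"name\":\"amount\",\"type\":\"double\"}]}";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // The Avro serializer registers the schema and validates records against it.
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        Schema schema = new Schema.Parser().parse(ORDER_SCHEMA);
        GenericRecord order = new GenericData.Record(schema);
        order.put("id", "order-42");
        order.put("amount", 19.99);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "order-42", order));
        }
    }
}
```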

Kafka Connect 101

In large organizations, applications usually don’t live in a vacuum. You often need to pull data from an extensive range of data stores across the company and also push some data into other systems.

Fortunately, thanks to the strong community behind Kafka, hundreds of connectors are readily available for you to choose from so you don’t have to reinvent the ‘data integration’ wheel.

ksqlDB 101

If you want to get instant insights from your data streams but don’t want to go through the hassle of writing a full-fledged application and worrying about the infrastructure, then ksqlDB is for you. Just write some good old SQL, press the button, and voilà, it’s processing your business events in real time.
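
For a rough idea of what that looks like, here is a sketch using the ksqlDB Java client against a local server; the stream, topic, and column names are made up for illustration.

```java
import io.confluent.ksql.api.client.Client;
import io.confluent.ksql.api.client.ClientOptions;

public class KsqlDbSketch {
    public static void main(String[] args) throws Exception {
        // Connect to a ksqlDB server (host and port are placeholders).
        ClientOptions options = ClientOptions.create()
            .setHost("localhost")
            .setPort(8088);
        Client client = Client.create(options);

        // "Good old SQL": declare a stream over a Kafka topic,
        // then a continuously updated table derived from it.
        client.executeStatement(
            "CREATE STREAM orders (id VARCHAR, amount DOUBLE) "
          + "WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');").get();
        client.executeStatement(
            "CREATE TABLE order_totals AS "
          + "SELECT id, SUM(amount) AS total FROM orders GROUP BY id EMIT CHANGES;").get();

        client.close();
    }
}
```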

Apache Flink® SQL

This is a course about Flink SQL, which is part of the Apache Flink project. Exploring what Flink SQL can do is a great way to get started with Apache Flink and stream processing.
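
To keep the examples in Java, here is a minimal sketch of running Flink SQL through a TableEnvironment; the source table uses Flink's built-in datagen connector, and the column names are placeholders.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class FlinkSqlSketch {
    public static void main(String[] args) {
        // A streaming TableEnvironment is the entry point for Flink SQL from Java.
        TableEnvironment tableEnv =
            TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Define a source table backed by Flink's built-in data generator,
        // then run a continuous aggregation over it.
        tableEnv.executeSql(
            "CREATE TABLE orders (id STRING, amount DOUBLE) "
          + "WITH ('connector' = 'datagen')");
        tableEnv.executeSql(
            "SELECT id, SUM(amount) AS total FROM orders GROUP BY id")
            .print();
    }
}
```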

Step 2

Intermediate

Mastering Production Data Streaming Systems with Apache Kafka®

How do you move from a simple proof of concept to a bulletproof data streaming system that is ready for production deployment? In this course, you will learn how to avoid pitfalls when scaling your data streaming platform. Additionally, you'll delve deep into the GitOps framework, allowing you to deliver changes swiftly and securely, not only to your platform but also to the streaming applications built on it.

Building Apache Flink® Applications in Java

Learn to build Apache Flink Jobs in Java through video lectures and hands-on exercises, including the creation of a set of Flink jobs that interact with Apache Kafka.
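
For a flavor of what such a job looks like, here is a minimal DataStream sketch; it uses a tiny in-memory source in place of the Kafka sources and sinks covered in the course.

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class HelloFlinkJob {
    public static void main(String[] args) throws Exception {
        // Every Flink job starts from an execution environment.
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // A small bounded stream stands in for a Kafka source here.
        env.fromElements("skates", "helmet", "snowboard")
           .map(item -> "ordered: " + item)
           .print();

        env.execute("hello-flink");
    }
}
```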

Apache Flink® Table API: Processing Data Streams in Java

The Apache Flink® Table API offers a high-level, relational API for both stream and batch processing, blending the DataStream API's power with the SQL API's simplicity, in Java or Python.
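
Here is a minimal sketch of that fluent style in Java, using literal rows instead of a real stream; the column names and values are placeholders.

```java
import static org.apache.flink.table.api.Expressions.$;
import static org.apache.flink.table.api.Expressions.row;

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;

public class TableApiSketch {
    public static void main(String[] args) {
        TableEnvironment tableEnv =
            TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Build a table from literal rows, then express the query with the
        // fluent Table API instead of a SQL string.
        Table orders = tableEnv.fromValues(
            row("order-1", 19.99),
            row("order-2", 5.00));
        orders.as("id", "amount")
              .groupBy($("id"))
              .select($("id"), $("amount").sum().as("total"))
              .execute()
              .print();
    }
}
```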


Spring Framework and Apache Kafka®

How about we get our hands dirty and write some code now?

Get ready to learn how to develop a robust event streaming application using one of the most widely used Java frameworks. Spring Boot meets Kafka Streams; let the magic begin!
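
As a small taste, here is a sketch of a Spring Kafka listener; it assumes a Spring Boot application with spring-kafka on the classpath, and the topic and group id are placeholders.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class OrderListener {

    // Spring Boot auto-configures the underlying Kafka consumer from
    // application properties; topic and group id here are placeholders.
    @KafkaListener(topics = "orders", groupId = "order-service")
    public void onOrder(String order) {
        System.out.println("received: " + order);
    }
}
```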

Apache Kafka® for .NET Developers

Apache Kafka is a great fit for .NET applications due to its powerful real-time data processing features. Learn how to use Kafka to build a streaming application with .NET.

Apache Kafka® for Python Developers

This course aims to get developers started writing simple Python applications that stream events to and from a Kafka cluster.

Building Data Pipelines with Apache Kafka® and Confluent

Build a scalable, streaming data pipeline in under 20 minutes using Kafka and Confluent. Operationalize data in motion using real-time event streams and change data capture.

Designing Event-Driven Microservices

Microservices are the foundation of Cloud-Native Distributed Systems. Learn the principles of microservices and how they can be extended to build Event-Driven Architectures.

Event Sourcing and Event Storage with Apache Kafka®

You learned earlier that events are immutable and are an exact record of what happened in your system. Storing data as events, rather than as traditional rows and columns in a relational database, brings many advantages: you can recover your system quickly by replaying events, you get an audit log for free, and you can derive better business insights. In this course, you’ll learn how events can be used as the storage model for your applications, what event sourcing is, how it works, and how it relates to approaches like CQRS and CDC.
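
As a toy illustration of the idea, independent of any particular library, the sketch below rebuilds current state by replaying a list of hypothetical account events.

```java
import java.util.List;

public class AccountProjection {

    // Hypothetical events; in an event-sourced system these would live in a Kafka topic.
    sealed interface AccountEvent permits Deposited, Withdrawn {}
    record Deposited(long cents) implements AccountEvent {}
    record Withdrawn(long cents) implements AccountEvent {}

    // Current state is not stored directly: it is rebuilt by replaying the event log.
    static long replay(List<AccountEvent> events) {
        long balance = 0;
        for (AccountEvent event : events) {
            if (event instanceof Deposited d) balance += d.cents();
            else if (event instanceof Withdrawn w) balance -= w.cents();
        }
        return balance;
    }

    public static void main(String[] args) {
        List<AccountEvent> log = List.of(new Deposited(10_00), new Withdrawn(3_50));
        System.out.println("balance = " + replay(log)); // prints: balance = 650
    }
}
```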

Designing Events and Event Streams

Earlier, you learned that events are the very foundation of real-time event-streaming applications. This course gives you design tips to craft top-notch events that are uniquely tailored to your business needs.

Practical Event Modeling

Event Modeling is a simple visual method for designing event-driven systems. In this course, learn how to build an event model and implement the modeled system.

Apache Kafka® Security

Imagine if anyone had access to the keys to your house. Well, it’s the same if you don’t secure your IT infrastructure and services. Without proper security measures, you risk going out of business in the event of a data leak or hacking attack.

Kafka takes security seriously and has incorporated robust security features into all components. If you use Confluent Cloud, check out our Confluent Cloud Security course.
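
For a rough sense of what the client side involves, here is a sketch of typical SASL/SSL settings passed to a Kafka producer or consumer; the broker address, mechanism, and credentials are placeholders and will differ per deployment.

```java
import java.util.Properties;

public class SecureClientConfig {
    public static void main(String[] args) {
        // Client-side security settings for a SASL/SSL-protected cluster (placeholders).
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker.example.com:9092");
        props.put("security.protocol", "SASL_SSL");   // encrypt traffic with TLS
        props.put("sasl.mechanism", "PLAIN");         // authenticate the client
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
          + "username=\"<api-key>\" password=\"<api-secret>\";");
        // These properties would be passed to a KafkaProducer or KafkaConsumer;
        // authorization (what the principal may do) is enforced broker-side via ACLs or RBAC.
        System.out.println(props);
    }
}
```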

Confluent Cloud Networking

Learn how to integrate Confluent Cloud with on-prem, public, and private cloud data streaming applications to meet connectivity, privacy, and security requirements.

Confluent Cloud Security

From Apache Kafka security, authentication, and RBAC, to cloud data security and monitoring, learn how to use Confluent Cloud's security features to meet all your security and compliance needs.

Hybrid and Multicloud Architecture with Apache Kafka

A hybrid cloud is now an essential element of overall data architecture. This course explores its advantages and prepares learners for disaster recovery, cloud migration, and data sharing.

Step 3

Advanced

Streaming Data Governance

In this course, we look at some of the specific Stream Governance features in Confluent Cloud. You will have an opportunity to try out many of these features in the hands-on exercises.

Data Mesh 101

Data mesh is a framework for decentralized, domain-driven architectures that treat data as a product, with self-service access and strong governance models. Learn the benefits of data mesh and how it works.

Apache Kafka® Internal Architecture

We believe it’s important to know how a piece of technology works under the hood to understand its consistency, performance, security, and scalability characteristics.

This course describes the architecture of Apache Kafka with lectures from Jun Rao, one of its original authors and a Confluent co-founder. Over to you, Jun!

Inside ksqlDB

ksqlDB looks like magic when you first use it, so let’s take a tour and discover how it’s built with ease of use in mind, as well as security, scalability, and performance.