
Demos & In-depth Examples

Resources to get you started building Apache Kafka applications

Give these basic event-driven applications a spin to learn how Flink and Kafka work and see streaming in action. All use a command line interface to stream data into Confluent Cloud, where you can see the results.

There is no shortage of HTTP-based REST APIs that return information as of the time of the request. However, these APIs sometimes return messy data and typically don't offer historical replay or a way to perform stateful streaming operations. In this demo, you'll launch a Kafka connector in Confluent Cloud to scrape live air traffic data, and then use Flink SQL to create a clean, governed, shareable data stream in Kafka.
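The "clean, governed stream" step typically boils down to a Flink SQL statement that filters and reshapes the raw connector output. A minimal sketch, assuming a hypothetical `raw_flights` table and illustrative column names (not the demo's actual schema):

```sql
-- Illustrative only: derive a cleaned stream from raw scraped flight data,
-- dropping records with missing coordinates.
CREATE TABLE clean_flights AS
SELECT icao24, callsign, latitude, longitude, altitude, `timestamp`
FROM raw_flights
WHERE latitude IS NOT NULL
  AND longitude IS NOT NULL;
```

The cleaned table is backed by its own Kafka topic, which is what makes the result shareable with downstream consumers.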

Real-time AIS Vessel Tracking

Follow along with this tutorial-style demo to learn how to set up Confluent Cloud and analyze data using ksqlDB. We'll use AIS, an automatic tracking system used by ships, and pull live public data from it, including each ship's speed, location, and other details. We'll then feed that data into an Apache Kafka® topic via a connection to Confluent Cloud. Finally, we'll build streams using ksqlDB and run analyses on that data.
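The "build streams and run analyses" step looks roughly like the following ksqlDB sketch. The topic name, field names, and window size here are assumptions for illustration, not the demo's actual schema:

```sql
-- Illustrative only: declare a stream over the AIS topic, then compute
-- each vessel's average speed in 5-minute tumbling windows.
CREATE STREAM ais_positions (mmsi VARCHAR, lat DOUBLE, lon DOUBLE, speed DOUBLE)
  WITH (KAFKA_TOPIC='ais', VALUE_FORMAT='JSON');

CREATE TABLE avg_speed AS
  SELECT mmsi, AVG(speed) AS avg_speed
  FROM ais_positions
  WINDOW TUMBLING (SIZE 5 MINUTES)
  GROUP BY mmsi;
```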

Streaming Games

Build a real-time 2048 game powered by Confluent and AWS. User data from the interface deployed on AWS is stored in Kafka topics, and high scores are aggregated in ksqlDB.
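A high-score aggregation of this kind can be sketched in one ksqlDB statement. This assumes a hypothetical `game_events` stream with `user_id` and `score` fields; the names are illustrative:

```sql
-- Illustrative only: a continuously updated table of each user's best score.
CREATE TABLE high_scores AS
  SELECT user_id, MAX(score) AS high_score
  FROM game_events
  GROUP BY user_id;
```

Because the result is a table, each new game event updates only that user's row rather than appending a record.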

Real-time GitHub Commits

Learn how to set up the Confluent GitHub Connector and then generate your own data with commit events that surface in the Confluent interface!

Real-time Bicycle Rentals

You'll use live data from the British government on bicycle hires and pipe it into a Kafka topic!

Sources of Real-time Data

A list of easy-to-access sources of real-time data to play with:
  • DataGen
  • AIS
  • BBS RSS
  • and more...

Kafka in Action

These demos feature more in-depth use cases around data pipelining.

Stream Data to Cloud Databases with Confluent Cloud

This demo walks you through building streaming data pipelines with Confluent Cloud. It uses ksqlDB to combine data from an RDBMS and a message queue, writing the output to MongoDB Atlas.
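Combining the two sources is typically a ksqlDB join whose output topic a sink connector (here, MongoDB Atlas) then drains. A minimal sketch, assuming a hypothetical `orders` stream from the message queue and a `customers` table captured from the RDBMS; all names are illustrative:

```sql
-- Illustrative only: enrich order events with customer records, writing
-- the joined result to a topic a sink connector can consume.
CREATE STREAM enriched_orders AS
  SELECT o.order_id, o.amount, c.name, c.email
  FROM orders o
  JOIN customers c ON o.customer_id = c.id
  EMIT CHANGES;
```

A stream-table join like this keys each order event against the latest customer record, which is the usual pattern for enriching queue data with database state.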

Real-time Data Warehousing

Similar to the previous demo, this one walks you through building streaming data pipelines between an RDBMS and either Snowflake or Databricks using Confluent Cloud.

Using Kafka For Microservices

This demo shows by example how to incrementally break a monolithic application into microservices.

cp-demo

A full-platform, feature-rich demo of a hybrid Confluent deployment spanning on-premises and cloud, with stream processing applications, security, monitoring, and data governance features.

Stream Designer Demo

This demo uses Stream Designer to join CDC data from an RDBMS with sample clickstream data.

Even More

Want more? Check out these other collections of examples.

Confluent Cloud is a fully managed Apache Kafka service available on all three major clouds. Try it for free today.
