Yaroslav Tkachenko explains how matchmaking services, microtransactions, and telemetry statistics all play a role in Activision's challenging (but fun) event streaming use cases. Learn how Activision ingests huge amounts of data, what the backend of their massive distributed system looks like, and which automated services collect data from each pipeline.
A quick summary of new features, updates, and improvements in Confluent Platform 5.4, including Role-Based Access Control (RBAC), Structured Audit Logs, Multi-Region Clusters, Confluent Control Center enhancements, Schema Validation, and the preview for Tiered Storage. This release also includes previews of pull queries and embedded connectors in KSQL.
Learn what connectors are, how they simplify data integration, and how they're built for Confluent Cloud on major cloud providers like GCP, Azure, and AWS, making it easier to integrate Apache Kafka with existing systems.
One way to put Apache Kafka into action is geofencing and tracking the location data of objects, barges, and cars in real time. Guido Schmutz shares one such use case involving a German steel company and the development project he worked on for them.
Dustin Cote (Product Manager for Observability, Confluent Cloud) talks about making Apache Kafka® serverless and how, beyond just the brokers, Confluent Cloud focuses on fitting into customers' systems rather than building monitoring silos.
If there's something you want to know about Apache Kafka, Confluent, or event streaming, please send us an email with your question and we'll aim to answer it on the next episode of Ask Confluent.
Email Us