Event Processing Applications may want to consume data from existing data systems, which are not themselves Event Sources.
How can we connect cloud services and traditional systems, like relational databases, to an Event Streaming Platform, converting their data at rest to data in motion with Events?
Generally speaking, we need to find a way to extract data as Events from the origin system. For relational databases, for example, a common technique is to use Change Data Capture (CDC), where changes to database tables—such as INSERTs, UPDATEs, and DELETEs—are captured as Events, which can then be ingested into another system. The components that perform this extraction and ingestion are typically called "connectors". A connector turns the origin system into an Event Source, generates Events from its data, and sends those Events to the Event Streaming Platform.
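As a concrete illustration, a CDC connector for a relational database might be configured along the following lines. This is a minimal sketch using Debezium's MySQL connector, one common open-source CDC implementation (not prescribed by this pattern); the hostnames, credentials, and topic names are hypothetical, and exact property names vary somewhat across Debezium versions.

```properties
# Hypothetical Debezium MySQL CDC source connector, in the .properties
# form accepted by a standalone Kafka Connect worker.
name=inventory-cdc-source
connector.class=io.debezium.connector.mysql.MySqlConnector

# Connection details for the origin database (illustrative values)
database.hostname=mysql.example.internal
database.port=3306
database.user=debezium
database.password=secret
database.server.id=184054

# Capture changes only from the "inventory" database
database.include.list=inventory

# Logical name used as the prefix for the per-table change-event topics
topic.prefix=inventory

# Debezium tracks the captured tables' schema history in its own Kafka topic
schema.history.internal.kafka.bootstrap.servers=kafka:9092
schema.history.internal.kafka.topic=schema-changes.inventory
```

With such a configuration, each committed row change in the captured tables is emitted as an Event to a Kafka topic (for example, `inventory.inventory.orders` for an `orders` table), turning the database's data at rest into a stream of data in motion.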
When connecting a cloud services and traditional systems to Apache Kafka®, the most common solution is to use Kafka Connect. There are hundreds of ready-to-use connectors available on Confluent Hub, including blob stores like AWS S3, cloud services like Salesforce and Snowflake, relational databases, data warehouses, traditional message queues, flat files, and more. Confluent also provides many fully managed Kafka connectors in the cloud.
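For instance, a ready-made connector such as Confluent's JDBC source connector can poll a relational table into a Kafka topic without any CDC infrastructure. The sketch below shows one plausible configuration; the connection details, table, and topic names are illustrative. The same configuration, re-encoded as JSON, could be submitted to a distributed Connect cluster via its REST interface.

```properties
# Hypothetical JDBC source connector polling a Postgres table into Kafka.
name=orders-jdbc-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector

# Connection details for the origin database (illustrative values)
connection.url=jdbc:postgresql://db.example.internal:5432/orders
connection.user=connect
connection.password=secret

# Incrementing mode: new rows are detected via a strictly increasing column
mode=incrementing
incrementing.column.name=id

# Restrict polling to the "orders" table
table.whitelist=orders

# Each table becomes a topic named <topic.prefix><table>, e.g. jdbc-orders
topic.prefix=jdbc-
```

Note the trade-off between the two approaches: a polling JDBC connector is simple to set up but only sees rows it can query (missing DELETEs, for example), whereas a log-based CDC connector like the Debezium sketch above captures every change Event at the cost of more moving parts.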