Event Sink Connector
Connecting external systems to the Event Streaming Platform allows for advanced and specialized integrations.
How can we connect applications or external systems, such as databases, to an Event Streaming Platform so that they can receive Events?
Event Sink Connector is a specific implementation of an Event Sink. Use an Event Sink Connector to transfer Events from the Event Stream into the specific external system.
CREATE SINK CONNECTOR JDBC_SINK_POSTGRES_01 WITH (
  'connector.class'     = 'io.confluent.connect.jdbc.JdbcSinkConnector',
  'connection.url'      = 'jdbc:postgresql://postgres:5432/',
  'connection.user'     = 'postgres',
  'connection.password' = 'postgres',
  'topics'              = 'TEMPERATURE_READINGS_TIMESTAMP_MT',
  'auto.create'         = 'true',
  'auto.evolve'         = 'true'
);
When connecting a system such as a relational database to Apache Kafka®, the most common option is to use Kafka Connect. The connector reads Events from the Event Streaming Platform, performs any necessary transformations, and writes the Events to the specified Event Sink.
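If you run Kafka Connect directly rather than through ksqlDB, the equivalent sink can be defined as a JSON payload submitted to the Connect REST API. This is a sketch of the same JDBC sink configuration as above; the connector name and the Connect worker endpoint (`localhost:8083`, the default Connect REST port) are assumptions.

```json
{
  "name": "jdbc-sink-postgres-01",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:postgresql://postgres:5432/",
    "connection.user": "postgres",
    "connection.password": "postgres",
    "topics": "TEMPERATURE_READINGS_TIMESTAMP_MT",
    "auto.create": "true",
    "auto.evolve": "true"
  }
}
```

Saved as `jdbc-sink.json`, this could be submitted with `curl -X POST -H "Content-Type: application/json" --data @jdbc-sink.json http://localhost:8083/connectors`.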
- There are many Event Sink Connectors readily available for Apache Kafka, e.g. connectors for relational databases or object storage systems like AWS S3. See Confluent Hub for available connectors.
- Security policies as well as regulatory compliance may require appropriate settings for encrypted communication, authentication and authorization, etc. between event sink, event sink connector, and the event streaming platform.
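To illustrate the security consideration above, a Connect worker can be pointed at a secured Kafka cluster with standard Kafka client properties. This is a minimal sketch assuming SASL/PLAIN over TLS; the file paths, username, and passwords are placeholders, and the mechanism your platform requires may differ (e.g. SCRAM, OAUTHBEARER, or mTLS).

```properties
# Worker-level connection security (values are placeholders)
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="connect-worker" \
  password="<worker-secret>";
ssl.truststore.location=/etc/kafka/secrets/truststore.jks
ssl.truststore.password=<truststore-secret>
```

Note that the connector's credentials for the Event Sink itself (e.g. `connection.user` and `connection.password` in the JDBC example) are configured separately from the worker's credentials for the Event Streaming Platform.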