Event Sink Connector

Connecting external systems to the Event Streaming Platform allows for advanced and specialized integrations.

Problem

How can we connect applications or external systems, such as databases, to an Event Streaming Platform so that they can receive Events?

Solution

[Diagram: Event Sink Connector receiving Events from the Event Streaming Platform and writing them to an external system]

An Event Sink Connector is a specific implementation of an Event Sink. Use an Event Sink Connector to transfer Events from the Event Stream into a specific external system.

Implementation

CREATE SINK CONNECTOR JDBC_SINK_POSTGRES_01 WITH (
    'connector.class'     = 'io.confluent.connect.jdbc.JdbcSinkConnector',
    'connection.url'      = 'jdbc:postgresql://postgres:5432/',
    'connection.user'     = 'postgres',
    'connection.password' = 'postgres',
    'topics'              = 'TEMPERATURE_READINGS_TIMESTAMP_MT',
    'auto.create'         = 'true',
    'auto.evolve'         = 'true'
);

When connecting a system such as a relational database to Apache Kafka®, the most common option is to use Kafka Connect. The ksqlDB statement above creates a JDBC sink connector that streams Events from the TEMPERATURE_READINGS_TIMESTAMP_MT topic into a PostgreSQL database. The connector reads Events from the Event Streaming Platform, performs any necessary transformations, and writes them to the specified Event Sink.
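If you are not using ksqlDB, the same connector can be registered directly with the Kafka Connect REST API by POSTing its configuration to a Connect worker. The sketch below assumes a worker listening on localhost:8083 and reuses the placeholder PostgreSQL credentials from the example above:

```shell
# Register the JDBC sink connector via the Kafka Connect REST API.
# Assumes a Connect worker at localhost:8083; host and credentials are placeholders.
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "jdbc-sink-postgres-01",
    "config": {
      "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
      "connection.url": "jdbc:postgresql://postgres:5432/",
      "connection.user": "postgres",
      "connection.password": "postgres",
      "topics": "TEMPERATURE_READINGS_TIMESTAMP_MT",
      "auto.create": "true",
      "auto.evolve": "true"
    }
  }'
```

Both routes produce the same running connector; ksqlDB's CREATE SINK CONNECTOR is a convenience wrapper over this REST API.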

Considerations

  • There are many Event Sink Connectors readily available for Apache Kafka, e.g., connectors for relational databases or object storage systems such as Amazon S3. See Confluent Hub for available connectors.
  • Security policies and regulatory compliance may require appropriate settings for encrypted communication, authentication, and authorization between the Event Sink, the Event Sink Connector, and the Event Streaming Platform.
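As an illustration of the security consideration above, a connector talking to a SASL/SSL-secured Kafka cluster typically carries standard Kafka client security properties in its worker or connector configuration. The property names below are standard Kafka client settings; the values are placeholders:

```properties
# Standard Kafka client security settings (values are placeholders)
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="<api-key>" password="<api-secret>";
ssl.truststore.location=/etc/kafka/secrets/truststore.jks
```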

References

  • This pattern is derived from the Channel Adapter pattern in Enterprise Integration Patterns by Gregor Hohpe and Bobby Woolf.
