
Event Sink

Various components in an Event Streaming Platform will read or receive Events. An Event Sink is the generalization of these components, which can include Event Processing Applications, cloud services, databases, IoT sensors, mainframes, and more.

Conceptually, an Event Sink is the opposite of an Event Source. In practice, however, components such as an Event Processing Application can act as both an Event Source and an Event Sink.
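This duality can be sketched without a broker. In the minimal Java example below, in-memory queues stand in for streams on an Event Streaming Platform; all names (`EnrichmentApp`, `"order-created"`) are illustrative and not part of any Kafka API. The component consumes from one stream (acting as an Event Sink) and publishes a derived event to another (acting as an Event Source):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Broker-free sketch: BlockingQueues stand in for event streams.
public class EnrichmentApp {

    // The transformation applied between the sink side and the source side.
    static String enrich(String event) {
        return event + ":enriched";
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> inbound = new ArrayBlockingQueue<>(10);  // this app is an Event Sink of this stream
        BlockingQueue<String> outbound = new ArrayBlockingQueue<>(10); // ...and an Event Source of this one

        inbound.put("order-created");

        // Consume (sink role), transform, publish (source role).
        outbound.put(enrich(inbound.take()));

        System.out.println(outbound.take()); // prints order-created:enriched
    }
}
```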

Problem

How can we read (or consume / subscribe to) Events in an Event Streaming Platform?

Solution

(Diagram: an Event Sink reading Events from the Event Streaming Platform)

Use an Event Sink, which typically acts as a client in an Event Streaming Platform. Examples include an Event Sink Connector (which continuously exports Event Streams from the Event Streaming Platform into an external system such as a cloud service or a relational database) or an Event Processing Application built with Kafka Streams or Apache Flink®.
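An Event Sink Connector is typically configured declaratively rather than coded. As a sketch, the fragment below shows the general shape of a Kafka Connect JDBC sink configuration; the connector name, topic, and connection URL are placeholders, not values from this document:

```json
{
  "name": "orders-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "orders",
    "connection.url": "jdbc:postgresql://localhost:5432/ordersdb",
    "auto.create": "true",
    "tasks.max": "1"
  }
}
```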

Implementation

Flink SQL example: Reading events from an existing Apache Kafka® topic as a Flink table for further processing.

CREATE TABLE orders (
    order_id INT,
    item_id INT,
    quantity INT,
    unit_price DOUBLE,
    ts TIMESTAMP(3),
    WATERMARK FOR ts AS ts
) WITH (
    -- Connector options bind the table to an existing Kafka topic in
    -- open-source Flink; the values below are placeholders.
    'connector' = 'kafka',
    'topic' = 'orders',
    'properties.bootstrap.servers' = 'localhost:9092',
    'scan.startup.mode' = 'earliest-offset',
    'format' = 'json'
);

Generic Kafka Consumer application: See Getting Started with Apache Kafka and Java for a full example:

consumer.subscribe(Collections.singletonList("stream"));
while (keepConsuming) {
    // Poll with a timeout so the loop can re-check keepConsuming regularly.
    final ConsumerRecords<String, EventRecord> consumerRecords = consumer.poll(Duration.ofSeconds(1));
    recordsHandler.process(consumerRecords);
}
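The `keepConsuming` flag in the loop above enables graceful shutdown: because `poll` returns after its timeout even when no records arrive, the loop observes the flag at least once per second. The broker-free sketch below mirrors that loop shape with standard-library types only (a `BlockingQueue` stands in for the Kafka consumer, and all names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

// Sketch of the poll loop's shutdown behavior: queue.poll(timeout) mirrors
// consumer.poll(Duration.ofSeconds(1)), and keepConsuming ends the loop.
public class PollLoop {
    static final AtomicBoolean keepConsuming = new AtomicBoolean(true);

    static void run(BlockingQueue<String> events, List<String> processed) throws InterruptedException {
        while (keepConsuming.get()) {
            // Bounded wait, so the flag is re-checked even when no events arrive.
            String record = events.poll(10, TimeUnit.MILLISECONDS);
            if (record != null) {
                processed.add(record); // stands in for recordsHandler.process(...)
            }
            if (events.isEmpty()) {
                keepConsuming.set(false); // normally flipped by a shutdown hook or another thread
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> events = new LinkedBlockingQueue<>(List.of("a", "b"));
        List<String> processed = new ArrayList<>();
        run(events, processed);
        System.out.println(processed); // prints [a, b]
    }
}
```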

References

  • The Kafka Streams library of Apache Kafka is another popular choice for developers implementing elastic applications and microservices that read, process, and write events. See Filter a stream of events for a first example.

Confluent Cloud is a fully managed Apache Kafka service available on all three major clouds. Try it for free today.
