Various components in an Event Streaming Platform will read or receive Events. An Event Sink is the generalization of these components, which can include Event Processing Applications, cloud services, databases, IoT sensors, mainframes, and more.
Conceptually, an Event Sink is the opposite of an Event Source. In practice, however, components such as an Event Processing Application can act as both an Event Source and an Event Sink.
How can we read (or consume / subscribe to) Events in an Event Streaming Platform?
Use an Event Sink, which typically acts as a client in an Event Streaming Platform. Examples are an Event Sink Connector (which continuously exports Event Streams from the Event Streaming Platform into an external system such as a cloud service or a relational database) or an Event Processing Application, such as one built with Kafka Streams or Apache Flink®.
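Conceptually, the generalization is simple: an Event Sink is anything that can accept events read from the platform. A minimal sketch in Java (the `Event` and `EventSink` names here are illustrative, not part of any Kafka API):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical event type; in practice this would be a deserialized record.
record Event(String key, String payload) {}

// An Event Sink is any component that accepts events from the platform.
interface EventSink {
    void accept(Event event);
}

// Example sink that buffers events in memory. A real sink might instead
// write to a database, a cloud service, or another downstream system.
class InMemorySink implements EventSink {
    final List<Event> received = new ArrayList<>();

    @Override
    public void accept(Event event) {
        received.add(event);
    }
}
```

A consumer loop would then call `sink.accept(...)` for each record it reads, keeping the read path decoupled from where the events ultimately land.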
Flink SQL example: Reading events from an existing Apache Kafka® topic as a Flink table for further processing.
CREATE TABLE orders (
  order_id INT,
  item_id INT,
  quantity INT,
  unit_price DOUBLE,
  ts TIMESTAMP(3),
  WATERMARK FOR ts AS ts
);
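Once declared, the table can be queried like any other Flink table. For example, a continuous query over the stream (column names taken from the DDL above; the aggregation itself is illustrative):

```sql
-- Running revenue per item, updated continuously as new order events arrive.
SELECT
  item_id,
  SUM(quantity * unit_price) AS revenue
FROM orders
GROUP BY item_id;
```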
Generic Kafka consumer application: a plain consumer can also act as an Event Sink. See Getting Started with Apache Kafka and Java for a full example:
consumer.subscribe(Collections.singletonList("stream"));
while (keepConsuming) {
  final ConsumerRecords<String, EventRecord> consumerRecords = consumer.poll(Duration.ofSeconds(1));
  recordsHandler.process(consumerRecords);
}
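The loop above assumes an already configured consumer. A typical construction can be sketched with `java.util.Properties`; the broker address, group id, and deserializer choices below are placeholder assumptions for a local setup, not values from the original example:

```java
import java.util.Properties;

public class ConsumerConfigSketch {
    // Placeholder configuration for a local Kafka setup; adjust for your cluster.
    static Properties consumerProperties() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("group.id", "event-sink-example");      // assumed consumer group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("auto.offset.reset", "earliest");       // start from the beginning
        return props;
    }
}
```

With the kafka-clients dependency on the classpath, the consumer used in the loop would then be created as `new KafkaConsumer<String, String>(consumerProperties())`.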