Various components in an Event Streaming Platform will read or receive Events. An Event Sink is the generalization of these components, which can include Event Processing Applications, cloud services, databases, IoT sensors, mainframes, and more.
Conceptually, an Event Sink is the opposite of an Event Source. In practice, however, components such as an Event Processing Application can act as both an Event Source and an Event Sink.
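For example, a Kafka Streams application plays both roles at once: it acts as an Event Sink by consuming events from one topic and as an Event Source by producing derived events to another. The following minimal sketch illustrates this; the topic names, application id, and broker address are illustrative assumptions, not fixed by this pattern:

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class SinkAndSourceApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Assumed application id and broker address; adjust to your environment
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "sink-and-source-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Event Sink role: consume events from an input topic (hypothetical name)
        KStream<String, String> events = builder.stream("input-topic");
        // Event Source role: produce derived events to an output topic (hypothetical name)
        events.mapValues(value -> value.toUpperCase()).to("output-topic");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}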
How can we read (or consume / subscribe to) Events in an Event Streaming Platform?
Use an Event Sink, which typically acts as a client in an Event Streaming Platform. Examples are an Event Sink Connector (which continuously exports Event Streams from the Event Streaming Platform into an external system such as a cloud service or a relational database) or an Event Processing Application, such as a Kafka Streams application or the streaming database ksqlDB.
ksqlDB example: Reading events from an existing Apache Kafka® topic into a ksqlDB event stream for further processing.
CREATE STREAM clicks (ip_address VARCHAR, url VARCHAR, timestamp VARCHAR)
    WITH (KAFKA_TOPIC = 'clicks-topic',
          VALUE_FORMAT = 'json',
          TIMESTAMP = 'timestamp',
          TIMESTAMP_FORMAT = 'yyyy-MM-dd''T''HH:mm:ssXXX');
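Here the WITH clause binds the new stream to the existing Kafka topic: VALUE_FORMAT tells ksqlDB how to deserialize record values, while TIMESTAMP and TIMESTAMP_FORMAT instruct it to use the event's embedded timestamp field, rather than the record's arrival time, for time-based operations such as windowing.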
Generic Kafka Consumer application: See Getting Started with Apache Kafka and Java for a full example:
consumer.subscribe(Collections.singletonList("stream"));
while (keepConsuming) {
    // Poll for up to one second, then hand any received events to the application's processing logic
    final ConsumerRecords<String, EventRecord> consumerRecords = consumer.poll(Duration.ofSeconds(1));
    recordsHandler.process(consumerRecords);
}
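The fragment above assumes an already-configured consumer. For completeness, here is a minimal, self-contained sketch of what that setup could look like; the broker address, group id, and the use of String values (standing in for the guide's EventRecord type and its recordsHandler) are assumptions for illustration:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class EventSinkConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Assumed broker address and consumer group id; adjust to your environment
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "event-sink-app");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        // String values used here for simplicity; a type like the guide's EventRecord
        // would need a matching custom deserializer instead
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

        try (Consumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("stream"));
            while (true) {
                // Poll for events and print each value as a stand-in for real processing
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                records.forEach(record -> System.out.println(record.value()));
            }
        }
    }
}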