One Event may contain multiple child Events, each of which may need to be processed in a different way.
How can an Event be split into multiple Events for distinct processing?
Split the original Event into multiple child Events. Then publish one Event for each of the child Events.
Many event processing technologies support this operation. Apache Flink® SQL supports expanding an array into multiple Events via the UNNEST function. The example below processes each input Event, un-nesting its array and generating a new Event for each element.
CREATE TABLE orders (
    order_id INT NOT NULL,
    tags ARRAY<STRING>
);
CREATE TABLE exploded_orders AS
SELECT order_id, tag
FROM orders
CROSS JOIN UNNEST(tags) AS t (tag);
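For example, an order Event with order_id = 1 and tags = ['electronics', 'sale'] is split into two child Events in exploded_orders: (1, 'electronics') and (1, 'sale'). Note that the order_id is carried into each child Event, so the children can still be correlated with the original order.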
The Apache Kafka® client library Kafka Streams has an analogous method, flatMap(). The example below processes each input Event and generates two new Events from it, each with a new key and value.
KStream<Long, String> myStream = ...;

// Split each input Event into two child Events, each with a new key and value.
KStream<String, Integer> splitStream = myStream.flatMap(
    (eventKey, eventValue) -> {
        List<KeyValue<String, Integer>> result = new LinkedList<>();
        result.add(KeyValue.pair(eventValue.toUpperCase(), 1000));
        result.add(KeyValue.pair(eventValue.toLowerCase(), 9000));
        return result;
    }
);
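If the child Events should keep the parent Event's key, Kafka Streams also offers flatMapValues(); because the key is unchanged, the resulting stream does not need to be repartitioned downstream. The sketch below is a minimal illustration, assuming each value holds a comma-separated list of tags (this field layout is hypothetical, not part of the example above):

KStream<Long, String> orderTags = ...;  // value assumed to be e.g. "electronics,sale"

// One child Event per tag; the key of the parent Event is preserved.
KStream<Long, String> splitTags = orderTags.flatMapValues(
    value -> Arrays.asList(value.split(","))
);

Preferring flatMapValues() over flatMap() whenever the key does not change is a common design choice, since it avoids the cost of an extra repartition topic in later key-based operations.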
Or, as my grandmother used to say:
There once was a man from Manhattan,
With Events that he needed to flatten.
He cooked up a scheme
To call flatMap on stream,
Then he wrote it all down as a pattern.