Start by learning stream-table joins, which let you enrich each event in a stream with reference data stored in a table.
Begin by creating a table called items:
CREATE TABLE items (id VARCHAR PRIMARY KEY, make VARCHAR, model VARCHAR, unit_price DOUBLE)
WITH (KAFKA_TOPIC='items', VALUE_FORMAT='avro', PARTITIONS=1);
Click Run query.
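If you'd like to confirm the table was registered before moving on, the following optional metadata statements list the tables ksqlDB knows about and show the items schema:
SHOW TABLES;
DESCRIBE items;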
Insert data into the table:
INSERT INTO items VALUES('item_3', 'Spalding', 'TF-150', 19.99);
INSERT INTO items VALUES('item_4', 'Wilson', 'NCAA Replica', 29.99);
INSERT INTO items VALUES('item_7', 'SKLZ', 'Control Training', 49.99);
Click Run query.
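To verify that the rows landed, you can optionally run a push query against the table. Push queries read from the latest offset by default, so rewind first; LIMIT 3 makes the query terminate once it has seen the three rows above:
SET 'auto.offset.reset' = 'earliest';
SELECT id, make, model, unit_price FROM items EMIT CHANGES LIMIT 3;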
We’re going to create a different stream called orders, but first we need to remove the existing streams with the following commands:
DROP STREAM ny_orders;
DROP STREAM orders;
Click Run query.
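DROP STREAM unregisters a stream from ksqlDB but leaves its backing Kafka topic in place. If you also wanted to delete the underlying topic, ksqlDB supports an optional DELETE TOPIC clause, sketched here for reference:
-- Optional variant; not needed for this tutorial.
DROP STREAM orders DELETE TOPIC;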
Create the new orders stream:
CREATE STREAM orders (ordertime BIGINT, orderid INTEGER, itemid VARCHAR, orderunits INTEGER)
WITH (KAFKA_TOPIC='item_orders', VALUE_FORMAT='avro', PARTITIONS=1);
Click Run query.
Create a stream orders_enriched that joins your items table and your orders stream:
CREATE STREAM orders_enriched AS
SELECT o.*, i.*,
o.orderunits * i.unit_price AS total_order_value
FROM orders o LEFT OUTER JOIN items i
ON o.itemid = i.id;
Click Run query.
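Because this is a LEFT OUTER JOIN, every order produces an output row even when its itemid has no match in the items table; the item columns and total_order_value simply come through as NULL. For instance, an order for a hypothetical item_9 (an id that is not part of this tutorial's data, shown only to illustrate the semantics) would still be emitted:
-- Hypothetical insert; there's no need to run this.
-- i.make, i.model, i.unit_price, and total_order_value would all be NULL.
INSERT INTO orders VALUES (1620509999999, 99, 'item_9', 2);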
Insert some data into the orders stream:
INSERT INTO orders VALUES (1620501334477, 65, 'item_7', 5);
INSERT INTO orders VALUES (1620502553626, 67, 'item_3', 2);
INSERT INTO orders VALUES (1620503110659, 68, 'item_7', 7);
INSERT INTO orders VALUES (1620504934723, 70, 'item_4', 1);
INSERT INTO orders VALUES (1620505321941, 74, 'item_7', 3);
INSERT INTO orders VALUES (1620506437125, 72, 'item_7', 9);
INSERT INTO orders VALUES (1620508354284, 73, 'item_3', 4);
Click Run query.
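The first value in each row is ordertime, a Unix timestamp in milliseconds. If you'd like to see those values as readable dates, ksqlDB's TIMESTAMPTOSTRING function can format them (the placed_at alias here is just illustrative):
SELECT orderid, TIMESTAMPTOSTRING(ordertime, 'yyyy-MM-dd HH:mm:ss') AS placed_at FROM orders EMIT CHANGES LIMIT 7;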
Next, enter a push query on orders_enriched:
SELECT * FROM orders_enriched EMIT CHANGES;
Click Run query.
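A push query with EMIT CHANGES runs until you cancel it. If you'd prefer a query that terminates on its own once it has seen the seven orders above, add a LIMIT clause:
SELECT * FROM orders_enriched EMIT CHANGES LIMIT 7;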
Scroll down and expand a record to see how the data has been enriched.
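As a quick sanity check on the enrichment, the math is simply orderunits multiplied by unit_price: order 65 bought 5 units of item_7 at 49.99 each, so its total_order_value should read 5 × 49.99 = 249.95.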