Identify Offline Devices via IoT Data

Whether monitoring a fleet of vehicles, manufacturing equipment, or a network of smart home appliances, Internet of Things (IoT) use cases involve collecting and analyzing device telemetry data. Proactively identifying when devices stop sharing telemetry data is a particularly important function of an IoT asset tracking system, as it helps to determine when a device might need maintenance. Streaming is a natural fit for these use cases given their real-time requirements, as well as the high volume and velocity of data being generated and processed at a given time. This recipe demonstrates how to leverage ksqlDB to determine which devices’ telemetry data has gone dark.

To see this tutorial in action, click here to launch it now. It will pre-populate the ksqlDB code in the Confluent Cloud Console and provide mock data or stubbed-out code for connecting to a real data source. For more detailed instructions, follow the steps below.

Run it

1. Set up your environment

Provision a Kafka cluster in Confluent Cloud.

Once your Confluent Cloud cluster is available, create a ksqlDB application and navigate to the ksqlDB editor to execute this tutorial. ksqlDB provides a SQL syntax for extracting, transforming, and loading events within your Kafka cluster.
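
Before running the recipe, you can sanity-check that the editor is talking to your cluster with a couple of standard ksqlDB commands (a minimal sketch; both simply list what already exists and return empty results on a fresh cluster):

SHOW TOPICS;
SHOW STREAMS;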

2. Execute ksqlDB code

ksqlDB processes data in real time, and you can also use it to import data from and export data to popular data sources and end systems in the cloud. This tutorial shows you how to run the recipe in one of two ways: using connector(s) to any supported data source, or using ksqlDB’s INSERT INTO functionality to mock the data.

If you cannot connect to a real data source with properly formatted data, or if you just want to execute this tutorial without external dependencies, no worries! Remove the CREATE SOURCE CONNECTOR commands and insert mock data into the streams.

When creating the initial STREAM or TABLE, if the backing Kafka topic already exists, then the PARTITIONS property may be omitted.
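
For example, if the iot_telemetry topic already exists in your cluster, the stream defined below could omit PARTITIONS and otherwise stay the same (a sketch of that variant):

CREATE STREAM iot_telemetry (
  device_id INT,
  ts BIGINT
) WITH (
  KAFKA_TOPIC = 'iot_telemetry',
  VALUE_FORMAT = 'JSON',
  TIMESTAMP = 'ts'
);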

-- Substitute your parameter values in the connector configurations below.
-- If you do not want to connect to a real data source, remove the CREATE SOURCE CONNECTOR commands,
-- and add the INSERT INTO commands to insert mock data into the streams

CREATE SOURCE CONNECTOR IF NOT EXISTS iot_telemetry WITH (
  'connector.class'          = 'PostgresSource',
  'name'                     = 'recipe-postgres-iot_telemetry',
  'kafka.api.key'            = '<my-kafka-api-key>',
  'kafka.api.secret'         = '<my-kafka-api-secret>',
  'connection.host'          = '<database-host>',
  'connection.port'          = '5432',
  'connection.user'          = 'postgres',
  'connection.password'      = '<database-password>',
  'db.name'                  = '<db-name>',
  'table.whitelist'          = 'iot_telemetry',
  'timestamp.column.name'    = 'created_at',
  'output.data.format'       = 'JSON',
  'db.timezone'              = 'UTC',
  'tasks.max'                = '1'
);
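
-- (Optional) Sanity-check the connector before moving on. This is a sketch: the connector is
-- referenced by the name given in the CREATE SOURCE CONNECTOR statement above.
SHOW CONNECTORS;
DESCRIBE CONNECTOR iot_telemetry;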

-- Process data from the beginning of each Kafka topic
SET 'auto.offset.reset' = 'earliest';

-- Create a stream for the raw telemetry data; the TIMESTAMP property tells ksqlDB
-- to use the ts column (epoch milliseconds) as the event time for windowing
CREATE STREAM iot_telemetry (
  device_id INT,
  ts BIGINT
) WITH (
  KAFKA_TOPIC = 'iot_telemetry',
  VALUE_FORMAT = 'JSON',
  PARTITIONS = 6,
  TIMESTAMP = 'ts'
);

-- Create a table that computes, per device and per two-minute tumbling window, how far
-- the device's most recent event lags behind the end of the window
CREATE TABLE iot_telemetry_lags WITH (KAFKA_TOPIC = 'iot_telemetry_lags') AS
SELECT
  device_id,
  WINDOWEND - LATEST_BY_OFFSET(ts) AS lag_ms,
  TIMESTAMPTOSTRING(WINDOWSTART, 'yyyy-MM-dd HH:mm:ss') AS window_start,
  TIMESTAMPTOSTRING(WINDOWEND, 'yyyy-MM-dd HH:mm:ss') AS window_end
FROM iot_telemetry
WINDOW TUMBLING (SIZE 120 SECONDS)
GROUP BY device_id;
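
A device that stops reporting shows a growing lag_ms in each new window, which is what this recipe keys on. If you would rather watch the lags update continuously than run one-off queries, a push query over the table also works (a sketch; EMIT CHANGES keeps the query running until you cancel it):

SELECT device_id, lag_ms, window_start, window_end
FROM iot_telemetry_lags
EMIT CHANGES;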

3. Test with mock data

If you are not running source connectors to produce events, you can use ksqlDB INSERT INTO statements to insert mock data into the source topics:

INSERT INTO iot_telemetry (device_id, ts) VALUES (1, 1655144403000);
INSERT INTO iot_telemetry (device_id, ts) VALUES (0, 1655144403000);
INSERT INTO iot_telemetry (device_id, ts) VALUES (0, 1655144423000);
INSERT INTO iot_telemetry (device_id, ts) VALUES (0, 1655144443000);
INSERT INTO iot_telemetry (device_id, ts) VALUES (0, 1655144463000);
INSERT INTO iot_telemetry (device_id, ts) VALUES (0, 1655144483000);
INSERT INTO iot_telemetry (device_id, ts) VALUES (0, 1655144503000);
INSERT INTO iot_telemetry (device_id, ts) VALUES (0, 1655144523000);
INSERT INTO iot_telemetry (device_id, ts) VALUES (0, 1655144543000);
INSERT INTO iot_telemetry (device_id, ts) VALUES (0, 1655144563000);
INSERT INTO iot_telemetry (device_id, ts) VALUES (0, 1655144583000);
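
In this mock data, device 0 keeps reporting every 20 seconds while device 1 sends only a single event, so device 1 is the one that should surface as lagging. To eyeball the raw events before querying the lag table, something like the following works (a sketch; the LIMIT simply stops the push query after the eleven mock rows):

SELECT device_id, TIMESTAMPTOSTRING(ts, 'yyyy-MM-dd HH:mm:ss') AS event_time
FROM iot_telemetry
EMIT CHANGES
LIMIT 11;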

To validate that this recipe is working, you can run the following query:

SELECT device_id, lag_ms, window_start, window_end FROM iot_telemetry_lags WHERE lag_ms > 60000;

Your output should resemble:

+-------------------+-------------+------------------------+-----------------------+
|DEVICE_ID          |LAG_MS       |WINDOW_START            |WINDOW_END             |
+-------------------+-------------+------------------------+-----------------------+
|1                  |117000       |2022-06-13 18:20:00     |2022-06-13 18:22:00    |
Query terminated
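
In this example, device 1's only event arrived at 2022-06-13 18:20:03 (1655144403000 ms), so by the end of the 18:20:00 to 18:22:00 window it is 117000 ms behind, while device 0 keeps reporting every 20 seconds and never exceeds the 60000 ms threshold.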

4. Cleanup

To clean up the ksqlDB resources created by this tutorial, use the ksqlDB commands shown below (substitute the stream or table name, as appropriate). By including the DELETE TOPIC clause, the topic backing the stream or table is asynchronously deleted as well.

DROP STREAM IF EXISTS <stream_name> DELETE TOPIC;
DROP TABLE IF EXISTS <table_name> DELETE TOPIC;

If you also created connectors, remove those as well (substitute connector name).

DROP CONNECTOR IF EXISTS <connector_name>;
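
For this recipe, the concrete statements would look like the following (a sketch assuming the names used above; the connector is dropped by the name given in its CREATE SOURCE CONNECTOR statement, and if ksqlDB reports that a persistent query still reads from a stream or table, terminate the query it names and retry):

DROP TABLE IF EXISTS iot_telemetry_lags DELETE TOPIC;
DROP STREAM IF EXISTS iot_telemetry DELETE TOPIC;
DROP CONNECTOR IF EXISTS iot_telemetry;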