IoT Edge Processing flowchart diagram

IoT edge processing is the execution of computation — filtering, inference, aggregation, and decision-making — directly on or near the device that produces sensor data, reducing bandwidth consumption and latency compared to routing all data to a central cloud platform first.

The motivation for edge processing is practical: a factory floor with 500 vibration sensors generating samples at 1 kHz each would saturate most WAN links if every sample were forwarded to the cloud. Running a lightweight anomaly-detection model on a local edge node means that only events and exceptions — not raw waveforms — traverse the expensive uplink.
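The saturation claim is easy to check with back-of-envelope arithmetic. The sketch below assumes a 4-byte sample payload and roughly 30% protocol overhead; both figures are illustrative, not from the original scenario.

```python
# Uplink estimate for 500 vibration sensors sampling at 1 kHz each.
# BYTES_PER_SAMPLE and PROTOCOL_OVERHEAD are assumed values.
SENSORS = 500
SAMPLE_RATE_HZ = 1_000        # samples per sensor per second
BYTES_PER_SAMPLE = 4          # assume a 32-bit float per sample
PROTOCOL_OVERHEAD = 1.3       # assume ~30% framing/header overhead

raw_bps = SENSORS * SAMPLE_RATE_HZ * BYTES_PER_SAMPLE * 8
uplink_mbps = raw_bps * PROTOCOL_OVERHEAD / 1e6

print(f"Raw payload:   {raw_bps / 1e6:.1f} Mbit/s")
print(f"With overhead: {uplink_mbps:.1f} Mbit/s")
```

Sixteen megabits per second of raw waveform, before overhead, is a sustained load many industrial WAN uplinks cannot absorb, which is exactly why only events and exceptions should traverse them.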

The edge node receives raw sensor streams over a local protocol such as MQTT, OPC-UA, or Modbus. An ingestion buffer absorbs burst traffic. From there, a pre-processing pipeline applies unit conversion, timestamp normalisation, and a configurable band-pass filter to remove DC offset and out-of-band noise. The cleaned stream feeds a local ML inference engine — often a TensorFlow Lite or ONNX Runtime model deployed as a container on the edge node. The model outputs an anomaly score or a classification label.
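The pre-processing and inference stages can be sketched in a few lines. This is a minimal illustration, not the production pipeline: the band-pass filter is reduced to simple DC-offset removal, the `scale` factor is a hypothetical ADC-counts-to-m/s² conversion, and the "model" is an RMS-energy stub standing in for a TFLite or ONNX Runtime session.

```python
import numpy as np

def preprocess(window_raw, scale=9.81 / 512):
    """Unit conversion plus DC-offset removal.

    `scale` is a hypothetical factor converting raw ADC counts to m/s^2;
    the mean subtraction is a crude stand-in for the band-pass filter
    described above (it removes the DC component only).
    """
    accel = window_raw * scale       # counts -> m/s^2
    return accel - accel.mean()      # strip DC offset

def infer(window):
    """Stub for the local ML model: RMS energy as an anomaly score."""
    return float(np.sqrt(np.mean(window ** 2)))

# One raw window of ADC counts; the spike at index 3 raises the score.
window = np.array([510, 515, 508, 700, 512, 509], dtype=float)
score = infer(preprocess(window))
```

A real deployment would replace `infer` with a call into the loaded model runtime, but the shape of the pipeline — convert, filter, score — is the same.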

A decision router examines the inference result. For normal operating conditions, the result is aggregated into a summary metric and batched for periodic upload. For anomalous conditions, the full raw window around the event is stored locally in a short-term edge store (NVMe SSD), and an immediate alert is sent to the cloud. This design ensures the forensic raw data is available without requiring continuous raw-data upload.
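The routing logic above amounts to a single threshold comparison with two sinks. The sketch below uses assumed names and an in-memory deque standing in for the NVMe edge store; the threshold value is illustrative.

```python
from collections import deque

THRESHOLD = 0.8                    # assumed anomaly-score cutoff

summary_batch = []                 # aggregated metrics awaiting periodic upload
edge_store = deque(maxlen=1000)    # stand-in for the short-term NVMe store
alerts = []                        # immediate notifications for the cloud

def route(score, raw_window, timestamp):
    """Mirror the decision router: summarise normal results; persist the
    raw window locally and alert immediately on anomalies."""
    if score < THRESHOLD:
        summary_batch.append({"ts": timestamp, "score": score})
    else:
        edge_store.append({"ts": timestamp, "raw": raw_window})
        alerts.append({"ts": timestamp, "score": score})

route(0.2, [0.1, 0.2, 0.1], timestamp=1)    # normal -> summary only
route(0.95, [3.1, 2.8, 3.4], timestamp=2)   # anomaly -> store + alert
```

Note that the raw window never leaves the node in the normal branch; only the anomalous branch pays for an immediate uplink transfer.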

The edge node also caches its model registry: when the cloud publishes a new model version, the edge node downloads, validates, and hot-swaps it without interrupting the running inference pipeline. For the broader device-to-cloud context, see IoT Device Data Flow. For gateway-level protocol translation, see IoT Gateway Architecture. For the cloud-side complement to edge processing, see cloud/edge-computing-architecture.
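The validate-then-hot-swap step can be sketched as a guarded model slot: inference always reads the current model, and an updater replaces it atomically only after the new artifact passes a validation check. All names here are illustrative, and callables stand in for loaded model artifacts.

```python
import threading

class ModelSlot:
    """Hold the live model; swap in a validated replacement without
    interrupting callers of predict()."""

    def __init__(self, model):
        self._lock = threading.Lock()
        self._model = model

    def predict(self, x):
        with self._lock:
            model = self._model    # take a reference under the lock
        return model(x)            # run inference outside the lock

    def swap(self, new_model, validation_input, expected_range):
        """Reject the downloaded model if a known input falls outside
        the expected output range; otherwise make it live atomically."""
        lo, hi = expected_range
        if not (lo <= new_model(validation_input) <= hi):
            raise ValueError("new model failed validation; keeping old one")
        with self._lock:
            self._model = new_model

slot = ModelSlot(lambda x: x * 0.5)              # version 1 serving traffic
slot.swap(lambda x: x * 0.6, 1.0, (0.0, 1.0))    # version 2 passes validation
```

Because the swap holds the lock only for a pointer assignment, in-flight inferences finish on the old model while new requests pick up the new one.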


Frequently asked questions

What is IoT edge processing?
IoT edge processing is the execution of computation — filtering, aggregation, inference, and decision-making — directly on or near the device that produces sensor data rather than in a remote cloud. It reduces the volume of data transmitted upstream and enables real-time responses that do not depend on WAN connectivity.

How does an edge processing pipeline work?
Raw sensor streams arrive at an edge node over a local protocol such as MQTT or OPC-UA. A preprocessing pipeline applies unit conversion and noise filtering. A local ML model evaluates the cleaned data and produces an anomaly score or classification. A decision router sends summaries to the cloud during normal operation and stores full raw windows locally when anomalies are detected, uploading them asynchronously.

When should you use edge processing?
Edge processing is appropriate when WAN bandwidth is limited or expensive, when decisions must be made in milliseconds rather than seconds, when the device intermittently disconnects from the cloud, or when data privacy rules require that raw sensor data never leave the local network.

What is the difference between edge and cloud processing?
Edge computing runs workloads close to the data source on constrained hardware, minimising latency and bandwidth but limiting compute capacity. Cloud processing runs on elastic infrastructure with abundant compute, enabling complex analytics and long-term storage, but adding round-trip latency and WAN costs. Most production IoT systems combine both: the edge filters and summarises, and the cloud analyses and stores at scale.
```mermaid
flowchart TD
    Sensors[Sensor array\nVibration / Temp / Pressure] --> Local[Local protocol\nMQTT / OPC-UA / Modbus]
    Local --> Buffer[Ingestion buffer\nEdge node memory]
    Buffer --> PreProcess[Pre-processing pipeline\nUnit conversion / band-pass filter]
    PreProcess --> Inference[Local ML inference\nTFLite / ONNX Runtime]
    Inference --> Score{Anomaly score\nabove threshold?}
    Score -->|No| Summarise[Aggregate to summary metric]
    Score -->|Yes| RawStore[Store raw event window\nEdge NVMe SSD]
    RawStore --> Alert[Send immediate alert\nto cloud platform]
    Summarise --> Batch[Batch summary records]
    Batch --> Upload[Periodic cloud upload\nHTTPS / MQTT-TLS]
    Alert --> Cloud[Cloud IoT Platform]
    Upload --> Cloud
    Cloud --> ModelRegistry[Model registry\nNew model available?]
    ModelRegistry -->|Yes| Download[Download and validate\nnew model artifact]
    Download --> HotSwap[Hot-swap model\non edge node]
    HotSwap --> Inference
```