IoT Device Data Flow flowchart diagram

An IoT device data flow diagram shows the end-to-end journey that a raw sensor reading takes from physical measurement to actionable insight, covering collection, local processing, transmission, cloud ingestion, storage, and visualization.

In a typical deployment, a sensor — such as a temperature probe, vibration detector, or motion sensor — continuously captures physical measurements at a configured sampling rate. The embedded firmware on the microcontroller reads these values and performs immediate sanity checks: is the value within the expected physical range? Is the reading consistent with the previous sample? Invalid readings are dropped or flagged before anything else happens.
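The firmware-level sanity checks described above can be sketched as a small validation function. This is an illustrative example, not firmware from any specific device: the temperature limits, the maximum plausible sample-to-sample delta, and the function name are all assumptions chosen for a hypothetical temperature probe.

```python
# Illustrative firmware-style sanity checks; limits are hypothetical values
# for a temperature probe, not taken from any real datasheet.

MIN_TEMP_C = -40.0   # lower bound of the expected physical range
MAX_TEMP_C = 125.0   # upper bound of the expected physical range
MAX_DELTA_C = 5.0    # largest plausible change between consecutive samples

def validate_reading(value, previous):
    """Return True if the reading passes range and consistency checks."""
    if not (MIN_TEMP_C <= value <= MAX_TEMP_C):
        return False                      # outside physical range: drop
    if previous is not None and abs(value - previous) > MAX_DELTA_C:
        return False                      # implausible jump: drop or flag
    return True

print(validate_reading(22.5, 22.1))   # True:  in range, small delta
print(validate_reading(300.0, 22.1))  # False: outside physical range
print(validate_reading(30.0, 22.1))   # False: jump exceeds MAX_DELTA_C
```

Keeping both checks in firmware means a stuck or shorted sensor is caught before it consumes any transmission bandwidth.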

Readings that pass validation reach the edge device layer, where local preprocessing runs. This step may apply calibration offsets, convert raw ADC counts to SI units, or run a simple threshold comparison. The threshold check is critical for bandwidth management: only readings that exceed a meaningful delta or cross a configured alert boundary need to be forwarded upstream. Routine stable readings can be aggregated locally and sent as a summary at a lower frequency.
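A minimal sketch of this edge step, under assumed values: the ADC scale factor, calibration offset, delta, and alert boundary below are hypothetical, and the function names are invented for illustration.

```python
# Illustrative edge preprocessing: calibrate, then forward only significant
# readings and aggregate the rest for a lower-frequency summary.

ADC_SCALE = 0.0625   # hypothetical: raw ADC counts -> degrees C
OFFSET_C = -2.0      # hypothetical calibration offset
DELTA_C = 0.5        # minimum change worth forwarding immediately
ALERT_C = 80.0       # configured alert boundary

def process_sample(raw_adc, last_sent, buffer):
    """Return a value to forward upstream, or None if aggregated locally."""
    value = raw_adc * ADC_SCALE + OFFSET_C            # counts -> SI units
    if value >= ALERT_C or last_sent is None or abs(value - last_sent) >= DELTA_C:
        return value                                  # forward immediately
    buffer.append(value)                              # hold for the summary
    return None

def summarise(buffer):
    """Periodic batch sent at lower frequency instead of every raw sample."""
    return {"min": min(buffer), "mean": sum(buffer) / len(buffer), "max": max(buffer)}
```

The min/mean/max summary preserves enough shape for trend dashboards while cutting upstream traffic for stable readings.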

Data that must be forwarded travels to an IoT gateway — often a local hub running on a Raspberry Pi, industrial PC, or dedicated gateway appliance. The gateway consolidates readings from many devices, applies a second layer of filtering, and translates protocols. Devices on the local network may speak MQTT over TCP, CoAP over UDP, or even a proprietary serial protocol; the gateway normalises these into a single HTTPS or MQTT-over-TLS stream toward the cloud.
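The gateway's normalisation step can be sketched as mapping each device payload, whatever its source protocol, into one canonical envelope before upstream transmission. The envelope fields, payload formats, and function name below are illustrative assumptions, not a standard schema.

```python
# Sketch of gateway-side normalisation: heterogeneous device payloads are
# mapped into one canonical JSON envelope. Payload formats are hypothetical.
import json
import time

def normalise(device_id, protocol, payload):
    """Wrap a device reading in a canonical envelope regardless of protocol."""
    if protocol == "mqtt":
        value = json.loads(payload)["temp"]     # assumed JSON body over MQTT
    elif protocol == "coap":
        value = float(payload)                  # assumed plain-text CoAP payload
    elif protocol == "serial":
        _, raw = payload.split(":")             # assumed "T:23.4" serial framing
        value = float(raw)
    else:
        raise ValueError(f"unknown protocol: {protocol}")
    return {
        "device_id": device_id,
        "metric": "temperature_c",
        "value": value,
        "received_at": time.time(),             # gateway-assigned timestamp
    }
```

Because every downstream consumer sees the same envelope, new device protocols can be added by extending only this one translation point.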

The cloud IoT platform (AWS IoT Core, Azure IoT Hub, Google Cloud IoT, or a self-hosted broker) receives the stream, authenticates device identity via mutual TLS or pre-shared keys, and fans the data out to downstream consumers. Time-series data lands in a purpose-built store such as InfluxDB, TimescaleDB, or a managed service. Simultaneously, a rules engine evaluates incoming messages and fires alerts — notifications to on-call engineers, commands back to the device, or entries into an incident management system.
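The rules-engine pattern can be sketched as a list of predicates, each paired with an action. This is a minimal illustration, not any platform's actual rules API; the `notify` callback stands in for a real alert transport such as PagerDuty or SNS.

```python
# Minimal rules-engine sketch: each rule is a predicate plus an action.
# The notify callback is a stand-in for a real alert transport.

def make_threshold_rule(metric, limit, notify):
    """Return a rule that fires notify() when metric exceeds limit."""
    def rule(message):
        if message["metric"] == metric and message["value"] > limit:
            notify(f"{message['device_id']}: {metric}={message['value']} exceeds {limit}")
            return True
        return False
    return rule

alerts = []
rules = [make_threshold_rule("temperature_c", 80.0, alerts.append)]

messages = [
    {"device_id": "dev-1", "metric": "temperature_c", "value": 85.2},
    {"device_id": "dev-2", "metric": "temperature_c", "value": 21.0},
]
for msg in messages:
    for rule in rules:
        rule(msg)

print(alerts)  # one alert, for dev-1 only
```

Evaluating rules on the ingestion path, rather than by querying the time-series store, keeps alert latency independent of storage write throughput.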

Finally, the stored data feeds an analytics dashboard where operators visualise trends, configure thresholds, and investigate anomalies. For a more detailed look at the ingestion side of this pipeline, see IoT Sensor Data Pipeline. For what happens at the edge before data leaves the device, see IoT Edge Processing. For the gateway layer specifically, see IoT Gateway Architecture. For a broader cloud-side analytics view, see IoT Telemetry Pipeline.


Frequently asked questions

What is an IoT device data flow?
An IoT device data flow is the end-to-end path a measurement takes from a physical sensor through edge firmware, a gateway, a cloud ingestion service, a time-series database, and finally a visualisation or alerting layer. Mapping this flow explicitly helps teams identify where data can be lost, delayed, or corrupted.
How does data move from the sensor to the cloud?
A sensor produces a raw reading that firmware validates and preprocesses locally. The edge device forwards significant readings to a gateway, which normalises protocols and buffers messages. The cloud platform authenticates the gateway, ingests the stream, fans data out to storage and a rules engine, and surfaces results on dashboards. Each hop adds latency and introduces failure modes that must be handled.
When should you use an IoT device data flow diagram?
Use this diagram type during system design to align stakeholders on the end-to-end architecture, during incident investigation to trace where a missing reading was dropped, and when onboarding engineers who need to understand the full pipeline before working on a specific component.
What are common mistakes when designing an IoT data flow?
Common mistakes include sending every raw sample to the cloud instead of aggregating or filtering at the edge, omitting backpressure handling so a slow consumer blocks ingestion, designing for a single protocol when the device fleet is heterogeneous, and not accounting for devices that go offline and reconnect with a backlog of buffered messages.
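The offline-backlog pitfall can be illustrated with a bounded buffer: the class name and capacity below are hypothetical, and the key design point is that the oldest readings are deliberately discarded when an outage outlasts the buffer.

```python
# Illustrative bounded offline buffer. When the device is offline longer than
# the buffer can cover, the oldest readings drop first: a deliberate trade-off
# that bounds memory use on a constrained device.
from collections import deque

class OfflineBuffer:
    def __init__(self, max_readings):
        self._queue = deque(maxlen=max_readings)  # oldest entries evicted first

    def store(self, reading):
        self._queue.append(reading)

    def drain(self):
        """On reconnect, replay the backlog in order, oldest first."""
        backlog = list(self._queue)
        self._queue.clear()
        return backlog
```

Replaying oldest-first preserves ordering for downstream consumers, but the cloud side still needs to accept timestamped late arrivals rather than assuming real-time delivery.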
How does a device data flow differ from a telemetry pipeline?
Device data flow covers the full journey from physical sensor to dashboard, including firmware-level steps and gateway protocol translation. A telemetry pipeline focuses specifically on the cloud-side infrastructure — broker, stream processor, time-series store, and alerting — that receives and routes already-transmitted messages. The two diagrams are complementary: device data flow ends where the telemetry pipeline begins.
```mermaid
flowchart TD
    Sensor[IoT Sensor\nTemperature / Motion / Vibration] --> Read[Read raw ADC value]
    Read --> Validate{Valid range?}
    Validate -->|No| Drop[Drop invalid reading]
    Validate -->|Yes| Calibrate[Apply calibration offset\nConvert to SI units]
    Calibrate --> Threshold{Exceeds threshold\nor delta?}
    Threshold -->|No| Aggregate[Aggregate locally\nfor periodic summary]
    Threshold -->|Yes| Gateway[IoT Gateway\nLocal hub / broker]
    Aggregate -->|Periodic batch| Gateway
    Gateway --> Protocol[Protocol translation\nMQTT / CoAP → HTTPS]
    Protocol --> Cloud[Cloud IoT Platform\nAWS IoT / Azure IoT Hub]
    Cloud --> Auth{Device authenticated?}
    Auth -->|No| Reject[Reject connection]
    Auth -->|Yes| Ingest[Ingest message stream]
    Ingest --> TSDB[(Time-series database\nInfluxDB / TimescaleDB)]
    Ingest --> Rules[Rules engine\nAlert evaluation]
    Rules -->|Threshold breach| Alert[Send alert\nPagerDuty / SNS]
    TSDB --> Dashboard[Analytics dashboard\nGrafana / custom UI]
```