Key takeaways:

  • 21 billion connected IoT devices are active today. By 2030, the number will reach 39 billion.
  • AI is moving from the cloud to the device. Faster decisions, no network dependency.
  • No single connectivity technology fits all IoT. Mature deployments run Private 5G, LPWAN, RedCap, and satellite in parallel.
  • The EU Cyber Resilience Act is already affecting procurement. The first hard deadline is June 2026.
  • 70% of critical firmware vulnerabilities are memory safety errors. Rust eliminates them at compile time.
  • Predictive maintenance delivers verified results: 25% productivity gains, 70% fewer breakdowns, 25% lower maintenance costs.

21 billion connected devices were active by the end of 2025. By 2030, that number is expected to climb past 39 billion.

If you work in IoT, you’ve been watching these projections inch upward for years. By now, the size of the market is almost beside the point. The more useful question is what this scale actually demands from the people building and deploying connected systems, and whether the industry’s architecture assumptions are keeping up.

They largely aren’t.

For most of the last decade, the dominant pattern was collect-and-send: gather data at the device, ship it to the cloud, analyze it there, send back instructions. That model made sense when cloud compute was cheap and the volumes were manageable. In 2026, both of those conditions have changed. Device compute has become cheap enough to run machine learning models on microcontrollers costing a few dollars. And the volumes of data generated by industrial, healthcare, and logistics deployments have grown to a point where routing everything through a central server first introduces delays that are, depending on the application, operationally unacceptable.

The result is a structural shift that shapes nearly every other IoT technology trend in this analysis: intelligence is moving to the device, and the cloud is repositioning from decision-maker to coordinator. Understanding what drives that shift, and what it demands of firmware teams, connectivity architects, and security officers, is the thread running through the most consequential IoT trends of 2026.

Read on to see where each of these trends leads.

The market in numbers

Before getting into architecture, it’s worth anchoring the conversation in what the IoT market trends actually show. These figures set the context for the emerging trends and IoT technology trends covered below, because the numbers carry their own argument.

The spending picture

Enterprise IoT spending reached $298 billion in 2024, a 10% year-over-year increase, rising to roughly $324 billion in 2025. The global IoT market as a whole is estimated at approximately $1.35 trillion in 2025. Edge computing investment hit $261 billion in 2025 and is projected to reach $380 billion by 2028, growing at 13.8% annually according to IDC. That investment is already committed to manufacturing floors, hospital networks, logistics yards, and smart buildings, all running on interconnected devices operating today.

Where the real pressure shows up

The security numbers tell a different kind of story. The IoT security market is expected to grow from $8.7 billion in 2024 to $34.29 billion by 2029, according to The Business Research Company, nearly quadrupling in five years. That trajectory reflects a dawning recognition, accelerated by the EU Cyber Resilience Act, that security is no longer an optional feature layer. For manufacturers selling into European markets, it’s now a legal obligation with documented timelines and penalties reaching up to €15 million. The companies best positioned in 2026 are those who tracked these key developments early and built for them.

With 39 billion devices expected by 2030, the architecture decisions made now will be expensive to reverse.

We help IoT product teams get them right from the start.

Explore IoT Services

With that context in place, here are the 8 IoT trends shaping every architecture, security, and compliance decision in 2026.

Trend #1: AIoT

The clearest way to explain what AIoT actually means, past the marketing layer, is to start with a problem that most people in industrial settings understand immediately.

A pump bearing starts failing at 2 a.m. on a Saturday. AIoT, the convergence of artificial intelligence and IoT, is the architecture designed for exactly this scenario. In the old collect-and-send architecture, the sensor generates data and ships it to the cloud; the cloud runs a model, identifies the anomaly, and sends an alert back down. Under congested or unreliable network conditions, that round trip can take long enough that the delay becomes the difference between a planned maintenance window and an unplanned production halt.

How AIoT changes the picture

In the AIoT architecture, the model runs on the industrial controller or directly on the sensor node. The inference happens in milliseconds, locally, without touching the network. The device triggers a shutdown or alert on its own. The cloud’s role shifts: it collects logs from the entire fleet, retrains the model on aggregate patterns, and periodically pushes updated model weights back to the devices. The edge handles data processing and real time decision making; the cloud handles analyzing data across the entire fleet and coordinating model updates.

And that’s a fundamental reallocation rather than a minor optimization.

  • TinyML is the technology that makes this possible at the micro end of the spectrum. Running machine learning algorithms directly on edge devices with 256 kilobytes of RAM sounds implausible until you see it work: classifying a vibration pattern, flagging a sensor anomaly, or running computer vision on a production line camera without a single frame leaving the device. These are compact, task-specific models that do one thing well, locally, without connectivity or cloud infrastructure of any kind.

For engineering teams, the practical weight of this shift falls on firmware. An embedded system that needs to carry model artifacts, handle over-the-air model updates, and run inference within its power budget is considerably more complex than one that just collects and transmits data. It also needs to be written in a way that doesn’t introduce the memory vulnerabilities that make firmware a favorite attack vector, which is why the Rust conversation is inseparable from the AIoT conversation, and why both are covered in detail below.
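The division of labor described above can be sketched in a few lines of Rust, the firmware language this article returns to in Trend #4. Everything here is illustrative: the model shape, weights, thresholds, and feature values are invented, and a real TinyML deployment would run a quantized neural network rather than a dot product.

```rust
// Illustrative sketch of the AIoT split: local inference on the device,
// cloud-pushed model updates. All numbers are invented.

/// Tiny linear model whose weights are periodically replaced by
/// cloud-retrained updates delivered over the air.
struct VibrationModel {
    weights: [f32; 3],
    threshold: f32,
}

impl VibrationModel {
    /// Local inference: a dot product over three vibration features.
    fn score(&self, features: [f32; 3]) -> f32 {
        self.weights.iter().zip(features.iter()).map(|(w, x)| w * x).sum()
    }

    /// The device decides on its own, without a cloud round-trip.
    fn is_anomalous(&self, features: [f32; 3]) -> bool {
        self.score(features) > self.threshold
    }

    /// The cloud's new role: push retrained weights back down to the fleet.
    fn apply_update(&mut self, new_weights: [f32; 3], new_threshold: f32) {
        self.weights = new_weights;
        self.threshold = new_threshold;
    }
}

fn main() {
    let mut model = VibrationModel { weights: [0.5, 0.3, 0.2], threshold: 1.0 };
    let normal = [0.4, 0.5, 0.6];
    let failing_bearing = [2.5, 1.8, 1.2];
    println!("normal anomalous? {}", model.is_anomalous(normal));
    println!("failing anomalous? {}", model.is_anomalous(failing_bearing));
    // Periodic OTA model update from the cloud coordinator.
    model.apply_update([0.6, 0.2, 0.2], 1.1);
}
```

The decision fires locally in microseconds; only logs and aggregate patterns need to leave the device.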

Bring intelligence to the edge.

Discover how our artificial intelligence/machine learning services help run predictive models directly on IoT devices to cut cloud costs and reduce latency.

Explore AI/ML Services

Trend #2: Digital Twins

The phrase “digital twin” spent several years as a showcase technology, something that appeared in vendor demonstrations as a photorealistic 3D model rotating slowly on a screen, updated in real time from sensor feeds. It was impressive. For most organizations, it wasn’t particularly actionable.

In 2026, the technology has matured past the demonstration stage, and the more consequential development isn’t in the visualization layer at all.

Asset twins vs. process twins

What the industry calls an “asset twin,” a virtual representation of a single machine updated from its own sensors, is now well-established. What’s changing in 2026 is the expansion toward what Gartner calls a Digital Twin of an Organization (DTO): a dynamic model that uses operational and contextual data to represent how an entire business process executes, not just how a single machine performs.

The difference in practical terms is significant. Knowing that a specific press is running 3% slower than specification is useful. Understanding what that slowdown means for order fulfillment at the end of the week, which downstream processes it affects, and what the optimal response looks like before anyone physically checks the equipment, that’s a different category of capability entirely.

What the numbers actually show

Engineers testing process changes in a virtual environment before touching the factory floor is the concrete version of this. Deloitte’s Analytics Institute documents average downtime reductions of up to 30% and production output improvements of up to 25% in organizations applying predictive analytics and data analytics through digital twin programs.

Siemens’ Electronics Works Amberg facility, one of the most-cited examples of a mature digital factory, operates at approximately 99.999% production quality, sustained through continuous feedback loops between physical systems and their virtual counterparts. Those metrics have been stable for years: the outcomes these smart factories deliver are durable, not a short-lived deployment effect. For industrial IoT teams, that durability is the argument for moving from pilot to fleet-wide deployment.

What makes these outcomes possible is the convergence of real-time IoT sensor data feeding the twin, edge AI ensuring the twin reflects machine state in near-real-time, and cloud platforms handling the simulation layer that makes the twin useful for decisions rather than just monitoring. Remove any one of those and the value of the others degrades.

Organizations still treating digital twins as a monitoring investment are leaving the most valuable capability on the table. The value in 2026 is simulation: analyzing data from the physical system before committing to any change in it.

Trend #3: EU Cyber Resilience Act

The attack surface in IoT grows every time a new device goes online. For years, managing that was left to the market. In Europe, that’s no longer the case.

What the CRA actually requires

The EU Cyber Resilience Act formally entered into force on December 10, 2024. Its effects on IoT procurement are already visible, and manufacturers cannot afford to wait for the full compliance deadline of December 11, 2027: the intermediate deadlines are creating commercial consequences now.

Key CRA deadlines:

  • June 11, 2026 — Conformity-assessment provisions take effect
  • September 11, 2026 — Vulnerability reporting obligations activate
  • December 11, 2027 — Full compliance required

💡 Good to know
Compliance requires: a minimum 5-year vulnerability support period, 10-year security update availability, end-of-support dates specified at point of sale in month and year, mandatory risk assessment, and vulnerability disclosure.

Non-compliance carries penalties of up to €15 million or 2.5% of worldwide annual turnover, whichever is higher.
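The month-and-year end-of-support label can be derived from the sale date and the 5-year support floor with simple calendar arithmetic. A sketch for illustration only, not legal guidance; the function name is invented:

```rust
// Computes the earliest permissible end-of-support date a CRA-style label
// could carry, given the minimum 5-year vulnerability support period
// described above. Illustrative only.

const MIN_SUPPORT_MONTHS: u32 = 60; // 5 years

/// Returns (month, year) of the earliest permissible end-of-support date.
fn min_end_of_support(sale_month: u32, sale_year: u32) -> (u32, u32) {
    // Work in zero-based months to keep the arithmetic simple.
    let total = sale_year * 12 + (sale_month - 1) + MIN_SUPPORT_MONTHS;
    (total % 12 + 1, total / 12)
}

fn main() {
    // A device sold in June 2026 must carry support until at least June 2031.
    let (m, y) = min_end_of_support(6, 2026);
    println!("Earliest end of support: {:02}/{}", m, y);
}
```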

What this means in the procurement room

European buyers (manufacturers, healthcare systems, logistics operators, defense contractors) are beginning to require CRA compliance documentation as a qualification criterion. A vendor that cannot produce a documented 10-year update commitment and a completed security concept does not qualify for bids, regardless of the product’s technical merits.

This creates a visible market separation. Vendors who have invested in compliance infrastructure will find it a competitive advantage. Those who haven’t will find it an exclusionary barrier.

Trend #4: Firmware-level security and the rise of Rust

Zero Trust addresses how devices communicate across a network. But there’s a more foundational layer in IoT technology where a substantial share of vulnerabilities actually originate: the firmware itself.

Approximately 70% of high-severity security vulnerabilities in C and C++ software are the result of memory safety errors: buffer overflows, use-after-free bugs, null pointer dereferences. Chromium’s security team documented this from their own codebase. CISA and the NSA have both cited the figure in formal advisory materials recommending memory-safe languages.

This matters for IoT because embedded systems, including sensors, controllers, edge nodes, and gateways, are overwhelmingly written in C or C++. The firmware on the industrial controller monitoring a production line. The code on the medical wearable tracking cardiac arrhythmias. The software on the automotive ADAS module making steering decisions. Which means roughly 70% of critical vulnerabilities in those systems are, in principle, preventable with a language change.

Why Rust is the answer the industry is converging on

Rust is the language that has made this argument practically actionable rather than theoretically appealing. It’s a systems programming language with performance competitive with C, designed specifically to prevent memory safety vulnerabilities at compile time. The compiler rejects code that would produce a buffer overflow or a use-after-free error. No garbage collector, no runtime overhead, just a compiler that enforces safety constraints that a C programmer has to enforce manually, and frequently doesn’t.

In 2026, Rust for IoT is in production use in safety-critical domains. The Rust project has documented deployments in mobile robotics and medical devices under relevant safety standards. Automotive OEMs and Tier 1 suppliers are adopting Rust for ADAS firmware precisely because a memory safety vulnerability in a lane-keeping or automatic braking system has physical consequences. The NSA has explicitly recommended memory-safe languages, with Rust as the leading candidate for systems programming applications.

For IoT product teams, the decision calculus is clearer than it was three years ago. Migrating existing large C codebases is a genuine engineering investment, and the CRA doesn’t mandate a specific language; it mandates documented security outcomes. But for new firmware projects, particularly in regulated domains, starting in Rust rather than C is increasingly the default choice among teams that have done the analysis. The tooling has matured, the embedded support has expanded, and the argument for accepting preventable vulnerability classes in new code is getting harder to sustain.

Explore our Rust development services for memory-safe, CRA-compliant firmware.

Rust in embedded is our core technical differentiator in 2026, and one of the sharpest ways to demonstrate security posture to enterprise buyers.

Explore Rust Development Services

Trend #5: 5G and Connectivity 2.0

Smart devices still need reliable connections. Choosing the right connectivity technology for each device category is one of the most consequential architecture decisions an IoT team makes, and in 2026 there is no single right answer.

The approach that works is matching each device type to the technology that fits its specific requirements: latency, power consumption, coverage, and data volume.

In 2026, mature deployments run several connectivity options in parallel.

  • Private 5G is getting the most boardroom attention in manufacturing and logistics, and for good reason. The underlying IMT-2020 standard specifies device densities of up to one million devices per square kilometer and latencies as low as 1 millisecond, specifications that matter enormously when coordinating automated guided vehicles, safety interlock systems, and quality control cameras within a single production facility. A factory network running on enterprise Wi-Fi simply wasn’t designed for that kind of density or determinism.
  • 5G RedCap (Reduced Capability, standardized in 3GPP Release 17) is the underappreciated middle of the market. It was designed specifically for devices that need more capability than NB-IoT but can’t accommodate the power consumption of a full 5G modem. Industrial wearables, asset trackers, environmental monitors, any device that needs to run for months on a battery while communicating reliably, are exactly the target profile. In 2026, expect RedCap to appear in product specs the way NB-IoT did four years ago: first in industrial settings, then more broadly.

💡 Good to know
5G RedCap doesn’t replace LPWAN. It fills the gap between low-power wide-area networks and full 5G. If your devices are transmitting small amounts of data infrequently, LPWAN is still almost always the right choice on cost and battery life.

  • LPWAN, covering LoRaWAN and NB-IoT, is still the default technology for massive IoT deployments. GSMA reports that NB-IoT and LTE-M connections surpassed 1 billion worldwide. The reason is simple: most IoT devices don’t need high bandwidth. They need to send small amounts of data reliably for years without a battery change. The LoRa Alliance documents field-proven battery life of 10 to 15 years in appropriate deployment profiles. That’s the reason cold chain sensors, smart meters, and agricultural monitors are built on LPWAN rather than cellular.
  • Satellite IoT has crossed from novelty to practical option. Deutsche Telekom launched what it describes as the world’s first multi-orbit IoT roaming service, combining terrestrial NB-IoT coverage with GEO and LEO satellite backhaul for remote regions. For a container ship crossing the Pacific, or a mining asset operating 200 kilometers from the nearest cell tower, this removes the last remaining argument for accepting dead zones as inevitable. Transportation and logistics already represents 26% of global 5G IoT connections according to IoT Analytics, a figure that reflects how seriously mobility-intensive industries are investing in connectivity infrastructure.
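The tiering logic above can be expressed as a decision table. A hedged sketch in Rust, where the cut-offs are illustrative simplifications rather than standards-derived rules:

```rust
// Decision-table sketch of connectivity tiering. The categories and
// thresholds are illustrative, not derived from any standard.

#[derive(Debug, PartialEq)]
enum Link { Private5G, RedCap, Lpwan, Satellite }

struct Requirements {
    latency_ms: u32,            // worst acceptable latency
    bytes_per_day: u64,         // expected uplink volume
    battery_years: u32,         // required battery life
    terrestrial_coverage: bool, // is cellular/LPWAN coverage available?
}

fn pick_link(r: &Requirements) -> Link {
    if !r.terrestrial_coverage {
        Link::Satellite // ships, remote mining assets
    } else if r.latency_ms < 10 {
        Link::Private5G // AGVs, safety interlocks, QC cameras
    } else if r.bytes_per_day < 10_000 && r.battery_years >= 10 {
        Link::Lpwan // smart meters, cold chain sensors
    } else {
        Link::RedCap // wearables, asset trackers
    }
}

fn main() {
    let smart_meter = Requirements {
        latency_ms: 60_000, bytes_per_day: 500,
        battery_years: 12, terrestrial_coverage: true,
    };
    let agv = Requirements {
        latency_ms: 5, bytes_per_day: 5_000_000,
        battery_years: 0, terrestrial_coverage: true,
    };
    println!("meter -> {:?}", pick_link(&smart_meter));
    println!("AGV   -> {:?}", pick_link(&agv));
}
```

The point of the exercise is the shape of the decision, not the exact thresholds: each device class lands on a different link, which is why mature deployments run several in parallel.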

Trend #6: Green IoT and energy harvesting

In buildings, IoT-enabled management systems that monitor energy usage have demonstrated energy consumption reductions of up to 30%, according to systematic reviews of smart building deployments. More conservative estimates place the range at 10 to 20% for general implementations. At commercial real estate scale, that represents material changes in operating cost and, increasingly, in ESG reporting obligations that are no longer voluntary in major markets.

In agriculture, precision irrigation systems using real-time soil moisture sensors have reduced water use by 40 to 50% compared to traditional fixed-schedule irrigation, depending on deployment context. In regions where water is both scarce and regulated, that’s an operational necessity rather than a sustainability bonus.

  • Energy harvesting is where the Green IoT story becomes architecturally significant. The ability to power a sensor from ambient light, vibration, radio frequency signals, or thermal differentials, without a battery or a wired power connection, changes the economics of dense sensor deployment fundamentally. Logistics pallets, ATEX-rated hazardous zone sensors, agricultural field monitors, and structural health monitoring nodes can all operate indefinitely without maintenance, provided the energy harvesting design is matched to the ambient energy profile of the environment.

    The limiting factor in large-scale sensor deployments has historically been maintenance cost: replacing batteries across thousands of geographically distributed devices. Energy harvesting removes that constraint entirely.

💡 Pro tip
Energy harvesting works best when the harvesting method matches the deployment environment from day one. Vibration harvesting on rotating equipment, solar on outdoor or well-lit assets, RF harvesting near high-power transmitters. Matching the source to the environment is the design decision most often left too late in a project.

Trend #7: Zero Trust for OT/IoT environments

The CRA (covered in Trend #3) defines what manufacturers must commit to over a product’s lifetime. Zero Trust defines how networks should be architected once those devices are deployed (specifically how to protect IoT devices handling sensitive data from lateral movement after a breach). The two are, in a meaningful sense, responses to the same underlying problem.

  • Microsegmentation is the practical mechanism that makes Zero Trust meaningful in IoT environments. The logic is containment: if a device is compromised, it should be unable to communicate with any system outside its defined segment. A compromised smart sensor in a building management system should not have a pathway to the ERP. A breached network camera in a logistics facility should not be able to reach industrial controllers on the same site. These aren’t hypothetical attack chains. They’re documented in real-world incidents where compromised low-value IoT devices served as entry points into high-value operational systems.

💡 Good to know
Implementing Zero Trust in a brownfield OT environment is significantly harder than in a greenfield deployment. Legacy devices often can’t support modern authentication methods, which means the segmentation architecture has to compensate for what the device itself can’t enforce. Plan for this constraint early, because retrofitting network architecture around device limitations is a major source of scope creep in OT security projects.

In 2026, microsegmentation in OT environments is moving from advanced hardening option to baseline expectation in procurement requirements. Organizations with IEC 62443 compliance requirements, which is increasingly every industrial buyer, are specifying network segmentation capabilities as non-negotiable. For new deployments, there’s no longer a reasonable argument for not building it in from the start.

Zero Trust architecture for OT/IoT environments requires implementation expertise that goes beyond framework knowledge.

Explore IoT Security Services

Trend #8: AI-powered threat detection

AI is changing the attacker side of the equation in practical, documented ways. Google Cloud’s threat intelligence team explicitly forecasts that AI will accelerate the attacker-defender dynamic in 2026, compressing the window between vulnerability discovery and exploitation. Parts of the attack workflow that previously required significant human expertise (reconnaissance, vulnerability scanning, initial payload development) can now be partially automated. This doesn’t make every threat actor more capable, but it does mean the time from “vulnerability disclosed” to “active exploitation in the wild” is shorter than it was. Defenders running on periodic review cycles rather than continuous monitoring are increasingly exposed.

ENISA’s 2025 Threat Landscape report identifies AI systems and their infrastructure as introducing new attack surfaces, a concern specifically relevant for AIoT deployments, where on-device models become a potential target alongside the traditional firmware and network attack surfaces. NIST has published draft guidance on securing AI systems and using AI to enhance security operations, which reflects where the defensive capability is going.

Why scale makes human-only monitoring impossible

For IoT environments, AI-powered threat detection addresses a scale problem that simply has no human solution. No security team can establish behavioral baselines for 50,000 devices and monitor for meaningful deviations in real time. Anomaly detection systems that model normal device behavior and flag deviations automatically are the practical answer. Automated response playbooks that can isolate a compromised device faster than a human analyst can open the alert are becoming standard components of mature IoT security architectures in 2026, not advanced implementations.

💡 Pro tip
When evaluating AI threat detection platforms for IoT environments, prioritize behavioral baselining per device category over generic network-level anomaly detection. A smart meter and a SCADA controller have entirely different normal traffic patterns. A platform that baselines them identically will produce too many false positives to be operationally useful.

The budget implication is direct: AI-powered detection tooling is not optional for organizations running large IoT deployments in regulated environments. It’s the only way to maintain meaningful visibility at the device counts that 2026 deployments involve.

IoT in practice: industry breakdowns

With the horizontal trends established (edge AI, connectivity tiering, security architecture, and regulation), it’s worth looking at how the key industrial IoT trends land across the industries furthest along in deploying them.

Manufacturing and industrial IoT

Manufacturing is a good place to start, because the proof points are quantified and verified by organizations with no incentive to overstate them.

As noted earlier, Deloitte’s Analytics Institute reports average benefits from predictive maintenance programs: a 25% increase in productivity, a 70% reduction in unplanned breakdowns, and a 25% reduction in maintenance costs. Equipment uptime improvements of 10 to 20% are the more conservative documented range. These figures are averages across programs of varying maturity, which means they include organizations still in early deployment. For mature programs in facilities with high instrumentation density, the outcomes trend toward the top of those ranges.

  • IEC 62443 compliance deserves a specific note, because its role in manufacturing processes is changing fast. Two years ago, IEC 62443 certification was a best-practice signal, a sign that a vendor took industrial cybersecurity seriously. In 2026, it’s becoming a non-negotiable procurement gate. Industrial buyers, driven by their own insurance requirements and by the CRA’s downstream effects on supply chain expectations, are specifying IEC 62443 compliance as a qualification criterion. Vendors who don’t hold it are being disqualified before technical evaluation even begins.
  • Private 5G on the factory floor, combined with AGVs and robotic systems requiring sub-10-millisecond communication latency, is the connectivity architecture that makes real-time production coordination at that level of precision possible. For smart factories running at the precision Siemens Amberg demonstrates, this connectivity layer is as foundational as the sensor network itself (as discussed in Trend #5).

From predictive maintenance to Private 5G integration, our solutions are built for the compliance and latency requirements manufacturing buyers specify in 2026.

Explore Industrial IoT Solutions

Healthcare IoT (IoMT)

IoT trends in healthcare operate under a specific set of constraints that make the sector distinct from industrial IoT, even when the underlying technologies are similar. The consequences of a system failure are measured not in production downtime but in patient outcomes. And the interoperability expectations have hardened into procurement requirements with real disqualifying power, which shapes every architecture decision in the sector.

Healthcare IoT has the most demanding compliance stack in the industry. We build for it.

See Our Healthcare IoT Work

FHIR and the interoperability baseline

The interoperability story centers on FHIR (Fast Healthcare Interoperability Resources), the modern standard for exchanging health data via REST APIs and common data formats, developed by HL7. U.S. policy through the Cures Act Final Rule has pushed the healthcare ecosystem toward standardized APIs specifically to enable structured patient data access and address information blocking. In practical procurement terms, a connected medical device that cannot integrate with major EHR platforms, Epic, Cerner, and equivalents, via FHIR is at a serious disadvantage in any institutional sales process. FHIR interoperability is table stakes in 2026, not a differentiating feature.

Remote monitoring and the Hospital at Home model

Remote Patient Monitoring has crossed from pilot program into standard care pathway in a growing number of health systems. Wearable technology tracking heart rate, blood pressure, and SpO₂ continuously and feeding data into EHRs via FHIR integrations is enabling Hospital at Home programs and chronic condition management at a scale that was practically impossible with manual monitoring. The reduction in post-discharge hospitalizations from continuous RPM represents both a quality of care improvement and a meaningful capacity benefit for systems under sustained bed pressure.

  • On-device arrhythmia detection is where TinyML’s clinical significance becomes most apparent. A wearable running inference locally can detect a dangerous cardiac event and trigger an alert in milliseconds, without a cloud round-trip. In a scenario where a patient’s atrial fibrillation episode begins while they’re asleep and miles from a clinical facility, that latency difference is clinically meaningful. This is Software as a Medical Device (SaMD) in its most operationally critical form.
    The FDA’s 2026 cybersecurity guidance ties security requirements directly to the premarket submission process for connected devices, not a post-certification add-on. For manufacturers navigating both FDA and EU MDR requirements, the overlap with CRA obligations creates a demanding but coherent compliance framework: both push toward documented update lifecycles, vulnerability disclosure, and security-by-design practices.

Logistics IoT

In healthcare, the primary challenge is compliance. In logistics, the primary challenge is visibility, specifically the persistent problem of assets going dark the moment they leave a monitored facility and enter the broader supply chain.

Closing the coverage gap

An asset that leaves a warehouse can effectively disappear for days or weeks, surfacing only when it reaches a node with connectivity, a scanner, or a manual check-in process. The 2026 state of the technology has closed most of that gap. But the remaining challenges are instructive about where the real complexity lives.

Real-Time Location Systems for warehouse and yard asset tracking are mature infrastructure for large distribution centers. Cold chain monitoring for pharmaceutical and food shipments (FDA- and GxP-compliant sensor networks combining temperature logging with LPWAN connectivity) has been automated to the point where a cold chain sensor can track a medication shipment from manufacturer to final delivery without a battery change, transmitting condition data continuously throughout a multi-week journey.

The satellite IoT story closes the larger coverage gap. LEO constellations have changed the economics of global asset visibility enough that Deutsche Telekom’s multi-orbit IoT roaming service, combining terrestrial NB-IoT with satellite backhaul, is a commercially available, production-grade solution rather than a pilot. For container shipping and any supply chain that includes maritime or remote overland segments, the argument for accepting connectivity dead zones has essentially evaporated.

The integration problem that’s still not solved

What hasn’t been fully solved is the integration challenge, and it’s worth being specific about why. A single shipment passing through international logistics may generate data from 100 or more carrier systems, each with its own data format, API structure, and update frequency. Aggregating that into a coherent, real-time picture requires middleware that doesn’t exist as a commodity product. It requires custom development, ongoing maintenance as carrier systems change, and organizational investment in data standardization that logistics operators frequently underestimate.
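The normalization pattern that middleware implements can be sketched with two hypothetical carrier formats mapped into one common event shape; real feeds are far messier, and the formats and field names below are invented:

```rust
// Sketch of the carrier-data normalization layer described above:
// different wire formats mapped into one common tracking event.

#[derive(Debug, PartialEq)]
struct TrackingEvent {
    shipment_id: String,
    status: String,
    timestamp_utc: String,
}

/// Hypothetical carrier A reports "id|status|timestamp".
fn parse_carrier_a(line: &str) -> Option<TrackingEvent> {
    let mut parts = line.split('|');
    Some(TrackingEvent {
        shipment_id: parts.next()?.to_string(),
        status: parts.next()?.to_string(),
        timestamp_utc: parts.next()?.to_string(),
    })
}

/// Hypothetical carrier B reports "timestamp,id,status" in a different order.
fn parse_carrier_b(line: &str) -> Option<TrackingEvent> {
    let mut parts = line.split(',');
    let timestamp = parts.next()?.to_string();
    Some(TrackingEvent {
        shipment_id: parts.next()?.to_string(),
        status: parts.next()?.to_string(),
        timestamp_utc: timestamp,
    })
}

fn main() {
    let a = parse_carrier_a("SHP-1|in_transit|2026-03-01T08:00:00Z").unwrap();
    let b = parse_carrier_b("2026-03-01T09:30:00Z,SHP-1,arrived").unwrap();
    println!("{:?}", a);
    println!("{:?}", b);
}
```

Multiply this by 100-plus carrier systems, each changing its format on its own schedule, and the maintenance burden behind "a coherent, real-time picture" becomes clear.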

The sensor and connectivity infrastructure is mature. The data integration layer is where deployments consistently hit friction.

Automotive IoT

From logistics, it’s a short conceptual step to automotive. Both industries share the core challenge of managing connected assets moving continuously through varied environments. The difference is that in automotive, the stakes are substantially higher when something goes wrong, because those assets are carrying people.

Connectivity architecture for the connected vehicle

Connected vehicles are, from a systems engineering standpoint, among the most demanding IoT environments in existence. A modern connected car is simultaneously a safety-critical embedded system, a mobile data center, a software product requiring over-the-air update capability for its entire operational life, and, in the case of autonomous vehicles, an independent decision-maker operating in an environment it shares with humans.

5G RedCap is the connectivity architecture that best fits the automotive middle tier. Real-time telematics, infotainment connectivity, and OTA firmware updates all benefit from 5G bandwidth and latency without requiring the power consumption of a full 5G modem in every subsystem. OTA update capability is a baseline expectation in connected vehicles in 2026, both for functional improvements and for security patch delivery under the CRA’s 10-year update availability requirement. A vehicle sold in Europe must have a credible, documented plan for receiving security updates for at least a decade after sale.
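As an illustration of what a "credible, documented plan" implies at the firmware level, the sketch below models the rollback logic of a dual-bank (A/B) OTA update scheme, a common pattern for making updates safely reversible. The slot names, state fields, and boot-attempt limit are assumptions for this example, not any vendor's API; a real implementation would also verify a cryptographic signature on the staged image before trialing it.

```rust
// Illustrative A/B (dual-bank) OTA state machine: a new image is trial-booted
// from the inactive slot, and the bootloader rolls back to the known-good
// slot if the new firmware never confirms itself as healthy.

#[derive(Debug, Clone, Copy, PartialEq)]
enum Slot { A, B }

const MAX_ATTEMPTS: u8 = 3; // assumed boot-attempt limit for this sketch

#[derive(Debug)]
struct BootState {
    active: Slot,          // last known-good slot
    pending: Option<Slot>, // newly written image awaiting confirmation
    attempts: u8,          // failed boot attempts on the pending image
}

impl BootState {
    /// A new image was written to the inactive slot; mark it for trial boot.
    fn stage_update(&mut self) {
        let inactive = if self.active == Slot::A { Slot::B } else { Slot::A };
        self.pending = Some(inactive);
        self.attempts = 0;
    }

    /// Called by the bootloader on each power-up: pick the slot to boot.
    fn next_boot_slot(&mut self) -> Slot {
        match self.pending {
            Some(slot) if self.attempts < MAX_ATTEMPTS => {
                self.attempts += 1;
                slot
            }
            // Too many failed attempts: abandon the update, boot known-good.
            _ => {
                self.pending = None;
                self.active
            }
        }
    }

    /// Called by the new firmware once it has verified itself as healthy.
    fn confirm(&mut self) {
        if let Some(slot) = self.pending.take() {
            self.active = slot;
        }
    }
}
```

The design choice worth noting is that rollback requires no connectivity: a bricked update recovers locally, which is what makes a decade of remote patching operationally survivable.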

Rust in ADAS

Rust’s adoption in automotive firmware development is one of the clearest indicators of how seriously the safety-critical embedded space is taking the memory safety problem. ADAS systems (the software making steering, braking, and collision avoidance decisions) require code that simply does not produce the class of vulnerabilities that memory-unsafe languages routinely generate. The transition from C/C++ to Rust in new ADAS firmware projects is underway at Tier 1 suppliers and OEMs, and the direction is not reversing.
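A minimal example of the guarantee in question: in C, reading past the end of a buffer is undefined behavior, while in Rust the same access is either a checked panic or an explicit `Option`. The function and buffer below are invented for illustration.

```rust
/// Bounds-checked read: `get` returns None for an out-of-range index
/// instead of reading adjacent memory, so the buffer cannot be silently
/// over-read the way a raw C pointer access can.
fn read_register(buf: &[u8], index: usize) -> Option<u8> {
    buf.get(index).copied()
}

// Use-after-free is caught at compile time rather than at runtime. The
// borrow checker rejects code like the following outright:
//
//     let r;
//     {
//         let data = vec![1u8, 2, 3];
//         r = &data;          // error: `data` does not live long enough
//     }
//     println!("{:?}", r);
```

This is what "eliminated at compile time" means in practice: the vulnerable pattern never produces a binary, so it never reaches a vehicle.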

💡 Good to know
V2X (Vehicle-to-Everything) communication, enabling vehicles to exchange data with road infrastructure and other vehicles, is advancing toward commercial standardization via C-V2X, which builds on 5G standards. In 2026, V2X remains in early commercial deployment rather than broad market availability, but automotive IoT platforms being designed today need to account for it as a near-term requirement.

Defence and military IoT

Military IoT market trends follow a different trajectory from commercial IoT. Military IoT concentrates every other challenge in this analysis into its most demanding expression, strips away commercial flexibility, and adds constraints no civilian deployment faces: operating reliably in complex environments where electronics are subject to active interference.

An unusual combination of constraints

The devices need to operate reliably in environments actively hostile to electronics: extreme temperatures, physical shock, jamming, and adversarial exploitation attempts. They need to do this indefinitely, in locations where no maintenance or resupply is available. And every piece of data they generate is subject to sovereignty requirements that preclude commercial cloud infrastructure entirely.

If that sounds like an unusual combination of constraints, it’s because it is.

Edge AI when there’s no connectivity to rely on

Edge AI trends in military applications follow a constraint set unlike anywhere else in this analysis. In disconnected military environments, edge AI for field intelligence is not optional. A sensor network deployed at a forward position needs to process what it observes, classify it, and make decisions (trigger an alert, activate a response system, log an event) without assuming connectivity to a command network. The same edge inference architecture powering industrial predictive maintenance is being adapted for autonomous classification in field sensors, with the added constraints of anti-tamper design, power management under extreme conditions, and hardening against adversarial interference with the model itself.
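The decision pattern described above can be sketched as follows. The confidence thresholds and actions are invented for illustration; the point is that the action is chosen entirely locally, and network availability affects only reporting, never the decision itself.

```rust
// Invented sketch of local decision-making in a disconnected field sensor.
// Thresholds, action names, and the reporting rule are illustrative only.

#[derive(Debug, PartialEq)]
enum Action {
    LogOnly,
    RaiseLocalAlert,
    ActivateResponse,
}

/// `confidence` is the on-device model's score for a detection, in 0.0..=1.0.
/// Returns the chosen action and whether to transmit a report now. Note that
/// `uplink_available` influences reporting only, never the action taken.
fn decide(confidence: f32, uplink_available: bool) -> (Action, bool) {
    let action = match confidence {
        c if c >= 0.9 => Action::ActivateResponse,
        c if c >= 0.6 => Action::RaiseLocalAlert,
        _ => Action::LogOnly,
    };
    (action, uplink_available)
}
```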

Zero Trust and data sovereignty as hard requirements

Zero Trust and microsegmentation in military OT networks carry stakes that make the civilian industrial case look modest. A compromised node in a military tactical network that can reach command-and-control systems is not an operational disruption. It’s a mission failure with physical consequences. The IEC 62443 framework provides a relevant baseline; military networks adapt and extend it with classification-appropriate additional controls. The underlying principle is the same: no device should be trusted by virtue of network location.
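A toy version of the default-deny principle, in the spirit of IEC 62443 zones and conduits: the zone names and allow-list below are invented, and a real deployment would enforce this in network infrastructure rather than application code, but the logic is the same. Anything not explicitly permitted is denied, regardless of where the requester sits on the network.

```rust
// Default-deny conduit check. Zone names and the allow-list are invented
// for illustration; IEC 62443 formalizes this as zones and conduits.
fn may_connect(from_zone: &str, to_zone: &str) -> bool {
    // Explicit allow-list; everything else is denied by default,
    // including traffic that originates "inside" the network perimeter.
    matches!(
        (from_zone, to_zone),
        ("sensors", "historian")
            | ("engineering", "plc")
            | ("historian", "cloud-gateway")
    )
}
```

Under this model, a compromised sensor can reach the historian it feeds but nothing else; the path from a field sensor to a control system simply does not exist in the policy.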

Data sovereignty requirements eliminate most commercial IoT platform options for defence applications. IoT data generated by military assets (location, sensor readings, operational status, communications metadata) cannot transit commercial cloud infrastructure or be stored in systems subject to foreign jurisdiction. This pushes defence IoT deployments toward sovereign cloud environments or fully on-premises processing, with architecture designed from the start to contain data within national infrastructure.

Energy harvesting as an operational enabler

Energy harvesting for forward-deployed sensors is not a cost optimization in this context. It’s an operational enabler. A sensor in a remote surveillance position that requires a battery change every six months is a sensor that requires a human to return to that position every six months, with all the exposure and logistics that implies. A sensor that harvests power from solar, vibration, or thermal differentials and operates indefinitely without intervention is a fundamentally different capability.

What the trends add up to

Taken together, these IoT industry trends (and the industry-by-industry breakdowns above) point to the same underlying tension across all IoT systems: IoT technology has matured faster than the security and governance layer built around it. What the broader trends in IoT confirm is that devices are cheaper, smarter, and more connected than they were five years ago. The infrastructure to deploy them at scale, including connectivity, edge compute, and digital twin platforms, has crossed from experimental to commercial. What hasn’t kept pace, until regulatory pressure began to force the issue, is the security lifecycle management and architectural discipline required to operate billions of connected devices where a compromise has real consequences.

The CRA is the clearest signal that this gap is no longer acceptable. The Zero Trust and firmware security trends are technical responses to the same underlying problem. And the AIoT shift (intelligence moving to the device) concentrates security risk at the firmware level in ways that make Rust and memory-safe development practices directly relevant.

For IoT decision-makers in 2026, the questions worth the most attention are:

  • What is our update lifecycle commitment for every device category we ship?
  • Where does each device category sit in our network segmentation architecture?
  • What is our firmware development practice for the next generation of on-device AI?

These are harder questions than connectivity selection or platform evaluation. But they’re the ones where the answers will determine both regulatory compliance and operational resilience over the next five years.

Don’t risk compliance in 2026

Book an IoT architecture and security strategy session with our R&D team and walk away with a clear picture of where your product roadmap stands against CRA, IEC 62443, and FDA cybersecurity requirements.

Book an IoT Architecture & Security Strategy Session

FAQ

What are the most important IoT trends in 2026?

The most critical IoT trends of 2026 center on three shifts: AI moving to the device, tighter connectivity architecture (Private 5G + LPWAN + satellite running in parallel), and the EU Cyber Resilience Act forcing security from optional to mandatory. Security runs through all of it: Zero Trust for OT networks, Rust for firmware, AI-powered anomaly detection. With 21 billion connected devices today and 39 billion projected by 2030, the decisions made now will shape compliance and operational outcomes for years.

What is the IoT market size in 2025 and where is it heading?

The global IoT market sits at roughly $1.35 trillion in 2025. Enterprise IoT spending reached $298 billion in 2024, up 10% year-over-year. The fastest-growing segment is IoT security, projected to nearly quadruple from $8.7 billion in 2024 to $34.29 billion by 2029. Edge computing investment is on track to hit $380 billion by 2028. The growth is real and already committed, not speculative.

What is the EU Cyber Resilience Act and how does it affect IoT manufacturers?

The CRA entered into force December 10, 2024, and imposes binding security requirements on connected device manufacturers selling in Europe. You need a minimum five-year vulnerability support period, 10-year security update availability, and a documented end-of-support date at point of sale. The first deadlines land in 2026, with full compliance required by December 11, 2027. Penalties reach €15 million or 2.5% of global turnover. European buyers are already treating compliance documentation as a qualification gate, not a bonus.

Why are IoT security trends pointing toward Rust and memory-safe languages?

About 70% of high-severity firmware vulnerabilities are memory safety errors — buffer overflows, use-after-free bugs — that a memory-safe compiler simply won’t allow. Most IoT firmware is still written in C or C++, which means most of that risk is preventable. Rust for IoT gives you C-level performance without the vulnerability class, and the tooling for embedded development has matured considerably. For new firmware projects in regulated domains, starting in Rust is increasingly the default choice.

What are the main IoT connectivity trends in 2026?

No single protocol fits all IoT, and mature deployments reflect that. The standard architecture now combines Private 5G for high-density factory environments, LPWAN (LoRaWAN, NB-IoT) for battery-powered devices sending small data over years, 5G RedCap for mid-tier industrial wearables and trackers, and satellite IoT for assets beyond terrestrial coverage. IoT connectivity trends in 2026 point firmly toward multi-protocol design from the start, not protocol lock-in.

What are the key industrial IoT trends in manufacturing?

Predictive maintenance is delivering verified numbers: 25% productivity gains, 70% fewer unplanned breakdowns, 25% lower maintenance costs (Deloitte Analytics Institute). Digital twins are moving from asset monitoring into process simulation. Private 5G is enabling real-time AGV and robotics coordination at sub-10ms latency. And IEC 62443 compliance is no longer a nice-to-have: many industrial buyers are disqualifying vendors who don’t hold it before technical evaluation even starts.

What are the current IoT trends in healthcare?

Three things define IoT trends in healthcare right now. FHIR interoperability is table stakes: devices that can’t connect to EHR platforms via HL7 FHIR APIs struggle in institutional procurement. Remote Patient Monitoring has moved from pilot to standard care pathway, enabling Hospital at Home programs at real scale. And on-device AI is enabling clinical-grade arrhythmia detection directly on wearables, without a cloud round-trip, which matters when latency is a clinical variable. FDA cybersecurity requirements now make security a premarket submission requirement, not an afterthought.

How does edge AI change IoT architecture?

It shifts where decisions are made. Instead of sending data to the cloud for analysis and waiting for instructions back, the device runs inference locally and acts in milliseconds. TinyML makes this viable even on microcontrollers with 256KB of RAM. The cloud’s role becomes coordination: aggregating fleet data, retraining models, pushing updated weights. For time-sensitive applications — industrial safety, cardiac monitoring, autonomous vehicles — that latency difference is the whole point. Edge AI trends in 2026 make this the default architecture for new deployments, not an advanced option.

What is Zero Trust and why does it matter for OT/IoT environments?

Zero Trust means no device is trusted by default, regardless of where it sits on the network. Every access request gets authenticated and authorized. In OT/IoT environments, where the network might include 20-year-old PLCs alongside modern sensors, that matters: you can’t assume something is safe just because it’s “inside.” Microsegmentation puts each device type in its own zone so a compromised sensor can’t reach a control system. IEC 62443 is the primary framework for implementing this in industrial settings, and it’s increasingly a procurement requirement rather than a best practice.