6 trends that will impact IoT strategies in 2020


This was the year industrial automation took the lead. From data analytics to IoT adoption, enterprises have acknowledged these technologies are essential to not only kick start, but also maintain digital transformation initiatives.

Although the industry has come far since I wrote this year’s predictions, enterprises still have lessons to learn when it comes to IoT projects. And what do they have to look forward to in 2020? A lot of focus on OT/IT, the edge, and even AI will be knocked off its pedestal.

IoT Declares Victory

IoT projects are often portrayed as the quickest way to get yourself fired, with allegedly “close to three fourths of IoT projects failing.”

In reality, success is the norm. A poll conducted by Gartner’s Eric Goodness found that 57% of respondents are achieving outcomes better than anticipated, with only 3% not meeting outcomes. Ian Hughes at 451 Research says around 8% have achieved negative ROI. Either way, failure rates are in the single digits.

Why the turnaround? Better technology for one, but also companies are scoping their projects more appropriately. Rather than boil the ocean, they are knocking off discrete goals like reducing maintenance.

The Pay Per View Business Model Arrives

Some large industrial customers track millions of data streams. Even a single wind turbine can generate over 93,000 signals a day. Traditional “by the gigabyte” cloud models thus don’t work. ARM has launched a program that lets designers experiment with its entire technology portfolio but only obligates them to pay for IP used in a final design. Likewise, some software companies have launched programs where customers can store large amounts of data but only pay for what they use.
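The difference between the two billing models is easy to see in a few lines. The sketch below is purely illustrative: the per-gigabyte rate and the data volumes are invented numbers, not any vendor's actual pricing.

```python
# Illustrative sketch of a "pay only for what you use" billing model.
# The rate and volumes below are invented for illustration; real vendor
# pricing varies widely.

def monthly_bill(stored_gb: float, accessed_gb: float,
                 rate_per_gb: float = 0.02) -> float:
    """Charge only for data actually read back, not for raw storage.

    Under a traditional by-the-gigabyte model the customer pays for
    everything stored; under a usage-based model they pay only for the
    (usually much smaller) slice they touch.
    """
    return round(accessed_gb * rate_per_gb, 2)

# A turbine fleet might land terabytes of signals but analyze a fraction:
traditional = round(50_000 * 0.02, 2)                            # pay for all 50 TB
usage_based = monthly_bill(stored_gb=50_000, accessed_gb=1_200)  # pay for 1.2 TB used
```

For a fleet that touches only a few percent of what it lands, the gap between the two numbers is exactly why the usage-based model is attractive.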

OT & IT Get Equal Focus

IT security has always been a priority, but OT security will require the same level of attention. OT security involves protecting manufacturing robots, refinery equipment, nuclear reactors, power substations and the like from intruders. There are large differences between OT and IT security breaches, and OT needs to improve its approach.

Microsegmentation will enter your vocabulary as well. Break-ins are inevitable. A new wave of research will concentrate on limiting where hackers can go once inside. In a semi-related trend, data diodes, which effectively allow data from machinery to be delivered to IT networks but prevent outside data from reaching machinery, may become more widespread.
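The one-way guarantee a data diode provides can be sketched as a policy. To be clear, real data diodes enforce unidirectionality in hardware (e.g. one-way optical links); the class below only models the rule, with invented method names.

```python
# Minimal software analogue of a data diode: telemetry flows from the
# OT side to the IT side, and any attempt to push data back toward the
# machinery is rejected. Real devices enforce this physically; this
# class only illustrates the policy.

class DataDiode:
    def __init__(self):
        self._it_side_buffer = []

    def ot_to_it(self, reading: dict) -> None:
        """Allowed direction: machinery telemetry out to IT networks."""
        self._it_side_buffer.append(reading)

    def it_to_ot(self, command: dict) -> None:
        """Blocked direction: nothing may reach the machinery."""
        raise PermissionError("data diode: traffic toward OT is not permitted")

    def drain(self) -> list:
        """Hand accumulated readings to the IT side and clear the buffer."""
        readings, self._it_side_buffer = self._it_side_buffer, []
        return readings
```

The point of the model is the asymmetry: telemetry out is a normal call, while any write toward the plant floor fails unconditionally rather than being subject to a firewall rule someone could misconfigure.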

AI Gets Knocked Down a Notch

The new paradigm will be “AI exists for decision support.” Put another way, AI applications will sift data and provide individuals with a choice of options for a course of action. The decision support role fits better with the state of the technology—AI is promising but it’s still not as good as human judgment on many key problems. A supporting role also makes people more comfortable with AI.
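In code, the decision-support pattern is simply the difference between returning a ranked shortlist and auto-executing the top answer. The actions and scores below are invented for illustration.

```python
# Sketch of "AI for decision support": the model ranks candidate actions
# and a person makes the final call. The actions and scores here are
# invented examples, not output from any real model.

def recommend(actions: dict[str, float], top_k: int = 3) -> list[str]:
    """Return the top-k actions by model score for human review,
    rather than automatically executing the highest-scoring one."""
    return sorted(actions, key=actions.get, reverse=True)[:top_k]

options = recommend({
    "schedule bearing replacement": 0.91,
    "increase inspection frequency": 0.74,
    "do nothing": 0.22,
    "shut down line": 0.15,
})
# A maintenance engineer reviews `options` and chooses one of them.
```

The deliberate design choice is that `recommend` never acts: the human stays in the loop, which is exactly the supporting role described above.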

3D Models Get Replaced by 4D Models

A four-dimensional model is a 3D simulation cross-referenced with time. A 4D model of your own life would start at birth and trail, messy parts and all, all the way to the grave.

Digital twins will be essentially the first commercially widespread 4D models. Software from companies like KBC, Hexagon and Element permits you to recreate the lifetime of an offshore oil platform or other complex asset and even extrapolate alternative futures. By 2023, everyone will be working in the fourth dimension.
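A toy version of the idea: store timestamped 3D states, replay the asset's history, and extrapolate past the last snapshot. Real digital twin platforms are far richer; this sketch, with invented field names like `wall_mm`, only shows the time dimension.

```python
# Toy 4D model: an asset's 3D state cross-referenced with time.
# Snapshots replay the recorded history; extrapolation projects a
# numeric field beyond the last snapshot (one possible "future").

class FourDModel:
    def __init__(self):
        self._history = []  # list of (timestamp, state) kept in time order

    def record(self, t: float, state: dict) -> None:
        self._history.append((t, state))
        self._history.sort(key=lambda pair: pair[0])

    def state_at(self, t: float) -> dict:
        """Return the most recent recorded state at or before time t."""
        latest = None
        for ts, state in self._history:
            if ts <= t:
                latest = state
            else:
                break
        if latest is None:
            raise ValueError("no snapshot recorded at or before t")
        return latest

    def extrapolate(self, key: str, t: float) -> float:
        """Linearly project a numeric field from the last two snapshots."""
        (t0, s0), (t1, s1) = self._history[-2], self._history[-1]
        rate = (s1[key] - s0[key]) / (t1 - t0)
        return s1[key] + rate * (t - t1)
```

For example, recording a pipe wall thinning from 20 mm to 19 mm over ten years and extrapolating to year 20 projects 18 mm, i.e. one plausible future the model lets you reason about.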

Edge Gets Sharper

Edge computing has gone from being an obscure way station for cloud data to the next great computing platform. Why? It’s too costly and time consuming to send everything to the cloud. Keeping data on edge devices and facilities can also ameliorate some privacy concerns: why wonder if AWS deleted your private data as requested when you can just take a hammer to your Alexa? IDC and Gartner say that between 40 and 75% of data may exist outside of cloud data centers.

The big issue for 2020 will be “what does the edge look like?” Will microdata centers that serve city blocks become common? Are we going to see “edge clouds”? Will some tasks, such as predictive maintenance, become classic edge applications? Will latency or cost be more important in figuring out where to locate tasks? Four years from now the answer will seem obvious. In 2020, however, the debate will just be beginning.
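The latency-versus-cost question above can be framed as a back-of-envelope placement rule. All the thresholds and prices in this sketch are invented for illustration; a real placement decision involves many more factors.

```python
# Back-of-envelope helper for the latency-vs-cost placement question:
# run a task at the edge when the cloud round trip blows the deadline,
# or when shipping its data costs more than processing it locally.
# Every number here is invented for illustration.

def place_task(deadline_ms: float, cloud_rtt_ms: float,
               data_gb_per_day: float, egress_cost_per_gb: float = 0.09,
               edge_cost_per_day: float = 3.0) -> str:
    if cloud_rtt_ms > deadline_ms:
        return "edge"    # the cloud cannot meet the latency budget
    if data_gb_per_day * egress_cost_per_gb > edge_cost_per_day:
        return "edge"    # cheaper to process on site than to ship the data
    return "cloud"

place_task(deadline_ms=10, cloud_rtt_ms=40, data_gb_per_day=5)   # latency forces edge
place_task(deadline_ms=500, cloud_rtt_ms=40, data_gb_per_day=5)  # cloud is fine
```

Even this crude rule shows why predictive maintenance keeps coming up as a classic edge workload: it often has both a tight control-loop deadline and a heavy raw-data stream.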
