I’ll be posting on various aspects of McKinsey’s new “The Internet of Things: Mapping the Value Beyond the Hype” report for quite some time.
First of all, it’s big: 148 pages in the online edition, making it the longest IoT analysis I’ve seen! Second, it’s exhaustive and insightful. Third, as with several other IoT landmarks, such as Google’s purchase of Nest and GE’s divestiture of its non-industrial internet division, the fact that a leading consulting firm would put such an emphasis on the IoT has tremendous symbolic importance.
My favorite finding:
“Interoperability is critical to maximizing the value of the Internet of Things. On average, 40 percent of the total value that can be unlocked requires different IoT systems to work together. Without these benefits, the maximum value of the applications we size would be only about $7 trillion per year in 2025, rather than $11.1 trillion.” (my emphasis)
This goes along with my most basic IoT Essential Truth, “share data.” I’ve been preaching this mantra since my 2011 book, Data Dynamite (which, if I may toot my own horn, I believe remains the only book to focus on the sweeping benefits of a paradigm shift from hoarding data to sharing it).
I was excited to see that the specific example they zeroed in on was offshore oil rigs, which I focused on in my op-ed on “real-time regulations,” because sharing the data from a rig’s sensors could both boost operating efficiency and reduce the chance of catastrophic failure. The report points out that there can be 30,000 sensors on a rig, but that most of them function in isolation, monitoring a single machine or system:
“Interoperability would significantly improve performance by combining sensor data from different machines and systems to provide decision makers with an integrated view of performance across an entire factory or oil rig. Our research shows that more than half of the potential issues that can be identified by predictive analysis in such environments require data from multiple IoT systems. Oil and gas experts interviewed for this research estimate that interoperability could improve the effectiveness of equipment maintenance in their industry by 100 to 200 percent.”
Yet the researchers found that only about 1 percent of the rig data was actually being used, because it was rarely shared off the rig with others in the company and its ecosystem!
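To make the report’s cross-system point concrete, here is a minimal Python sketch of the kind of fusion it describes: three monitoring systems that would normally be watched in isolation are read together, and a maintenance warning is triggered only by the combined trend. All of the readings, field names, and thresholds are invented for illustration; none of them come from the report.

```python
from statistics import mean

# Hypothetical readings from three separate rig systems that normally
# operate in isolation: a mud-pump monitor, a vibration monitor, and a
# bearing-temperature monitor. Values and limits are illustrative only.
pump_pressure_psi = [3100, 3150, 3400, 3650, 3900]
vibration_mm_s = [2.1, 2.3, 2.2, 4.8, 5.1]
bearing_temp_c = [71, 72, 74, 88, 93]

def rising(series, window=3):
    """Crude trend check: is the recent average well above the earlier one?"""
    return mean(series[-window:]) > 1.15 * mean(series[:-window])

# Any single series might still sit inside its own alarm limits, but the
# combined picture across systems points to a developing problem.
signals = {
    "pump pressure climbing": rising(pump_pressure_psi),
    "vibration climbing": rising(vibration_mm_s),
    "bearing temperature climbing": rising(bearing_temp_c),
}

if sum(signals.values()) >= 2:
    print("Cross-system warning: schedule maintenance before failure")
    print([name for name, fired in signals.items() if fired])
```

Nothing here is sophisticated; the point is simply that the warning depends on data crossing system boundaries, which is exactly what the isolated, single-machine instrumentation on most rigs does not allow.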
The section on interoperability goes on to discuss the benefits — and challenges — of linking sensor systems, using examples such as urban traffic management, which could link not only data from stationary sensors and cameras but also thousands of real-time feeds from individual cars, trucks, and parking meters — and even non-traffic data with a huge impact on performance, such as weather forecasts.
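One way to picture the interoperability challenge in that traffic example is the unglamorous work of getting feeds that were never designed for each other into a form that can be analyzed together. The Python sketch below normalizes three imaginary sources (a loop sensor, a vehicle probe, and a weather feed) into one common record type; every field name and value is hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# A shared record format is one simple way to let unrelated feeds be
# analyzed together. Every source-specific field name below is made up;
# real feeds would each need their own adapter.
@dataclass
class Observation:
    source: str
    kind: str        # e.g. "vehicle_count", "speed_kph", "rain_mm"
    value: float
    when: datetime

def from_loop_sensor(raw: dict) -> Observation:
    return Observation("loop_sensor", "vehicle_count",
                       float(raw["veh_cnt"]),
                       datetime.fromtimestamp(raw["ts"], tz=timezone.utc))

def from_vehicle_probe(raw: dict) -> Observation:
    return Observation("vehicle_probe", "speed_kph",
                       float(raw["speedKmh"]),
                       datetime.fromisoformat(raw["time"]))

def from_weather_feed(raw: dict) -> Observation:
    return Observation("weather", "rain_mm",
                       float(raw["precip_mm"]),
                       datetime.fromisoformat(raw["obs_time"]))

# Once normalized, feeds that never "knew" about each other can be combined,
# e.g. to adjust signal timing when rain is slowing traffic.
observations = [
    from_loop_sensor({"veh_cnt": 42, "ts": 1717000000}),
    from_vehicle_probe({"speedKmh": 18.5, "time": "2024-05-29T16:30:00+00:00"}),
    from_weather_feed({"precip_mm": 6.2, "obs_time": "2024-05-29T16:25:00+00:00"}),
]
for obs in observations:
    print(obs.source, obs.kind, obs.value, obs.when.isoformat())
```

The interface standards and middleware mentioned below are, in essence, industrial-strength versions of this adapter step.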
While more work needs to be done on the technical side to make interoperability easier, whether through the growing number of interface standards or through middleware, it seems to me that a shift in management mindset is just as critical as sensor and analytics technology for taking advantage of this huge increase in data:
“A critical challenge is to use the flood of big data generated by IoT devices for prediction and optimization. Where IoT data are being used, they are often used only for anomaly detection or real-time control, rather than for optimization or prediction, which we know from our study of big data is where much additional value can be derived. For example, in manufacturing, an increasing number of machines are ‘wired,’ but this instrumentation is used primarily to control the tools or to send alarms when it detects something out of tolerance. The data from these tools are often not analyzed (or even collected in a place where they could be analyzed), even though the data could be used to optimize processes and head off disruptions.”
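The distinction the quote draws is easy to see in code. Below is a toy Python sketch, with made-up readings and limits, contrasting an out-of-tolerance alarm (anomaly detection) with a simple trend projection (prediction) over the same data: the alarm stays silent until the limit is breached, while the projection gives maintenance staff hours of warning.

```python
# Invented data: one tool reading per hour, trending upward toward a limit.
TOLERANCE = 100.0
readings = [62.0, 65.5, 69.0, 72.5, 76.0]

# 1) Anomaly detection / real-time control: nothing happens until too late.
latest = readings[-1]
if latest > TOLERANCE:
    print("ALARM: reading out of tolerance")
else:
    print("No alarm; latest reading =", latest)

# 2) Prediction: fit a straight line to the trend and project forward.
n = len(readings)
xs = range(n)
x_mean = sum(xs) / n
y_mean = sum(readings) / n
slope = (
    sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, readings))
    / sum((x - x_mean) ** 2 for x in xs)
)

if slope > 0:
    hours_to_limit = (TOLERANCE - latest) / slope
    print(f"Projected to exceed tolerance in about {hours_to_limit:.1f} hours")
```

Real predictive-maintenance models are far more elaborate than a straight-line fit, but the organizational point stands: the same instrumented data supports both uses, and most companies stop at the first.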
I urge you to download the whole report. I’ll blog more about it in coming weeks.