The Internet of Things (IoT) refers to a world where aircraft engines, fridges, thermostats and medical devices are connected to the internet with a view to automating more of our lives, saving energy or making industrial processes run more efficiently.
It is a world where an aeroplane can detect a fault in one of its engines in mid-flight and radio ahead to order spare parts without any pilot intervention, and where your fridge can detect you are running out of milk and order it for you while you are asleep. Or where your home thermostat automatically lowers the temperature when you leave the house and raises it the moment you return. Or where a patient can be forewarned of an imminent heart attack by his smartphone without any kind of medical examination by a doctor.
IoT has six broad market segments: connected cars, automated homes, the industrial internet, wearable technology, smart metering, and connected stores.
Why does the Internet of Things matter for business?
The Internet of Things is one of the most significant investment themes of our generation. It already affects every industry and almost every person in the developed world. Cisco estimates the market for Internet-connected consumer gadgets to be worth $19 trillion by 2020 as more than 50 billion smart “things” come into operation over the next six years. On top of that, the Industrial Internet could add another $10tn to $15tn to global GDP over the next 20 years, according to General Electric.
But the market for the Internet of Things can only take off once all connected devices speak the same language. This has kicked off an intense battle for domination of the communications protocol that will govern the IoT. This communications protocol is likely to have two components: a single interconnection standard and a common language.
The interconnection standard must be low cost, low power and wireless if it is to be readily accepted worldwide. Range is also an issue: Wi-Fi and Bluetooth are both short-range wireless technologies suitable for the home or office, but they may not be able to provide sufficient coverage for applications in the automotive, healthcare and logistics industries.
For all the world’s connected devices to understand each other they must be able to interpret different operating systems. As a result, the world’s leading technology and telecoms players are all vying for influence in the standards debate.
As long as there is no clear standard, a manufacturer of a connected device, say a washing machine, will find it difficult to determine which communications chip to build in. When the dust has settled, we expect two common IoT standards to emerge: one for the consumer internet and another for the industrial internet.
What are some of the big themes around the Internet of Things?
In the past, component suppliers tended to be squeezed while the branded manufacturers of physical “things” such as aircraft engines, cars, fridges, washing machines and watches took the cream of industry profits, because they controlled the entire production process. As we move towards IoT, even the leading manufacturers will get squeezed at both ends of the value chain. At the input end, much of the value is moving towards “smart” component makers who make the sensors and microcontrollers that enable connectivity. At the output end, much of the value is moving towards the cloud services companies who control the apps that provide after-sales services and the smart hubs that monitor the device.
Much of the streaming data that powers IoT comes from sensors. Connected things will carry a host of sensors, including accelerometers, heat and humidity sensors, pressure sensors, cameras and microphones. Some industries, such as automotive and manufacturing, will move faster than others, although retail, utilities and logistics are catching up rapidly. In the early years of this transition, the sensor-makers may see competitive power shift to them, as it has done in the automobile sector.
There is a trend towards multiple sensor capabilities in a single connected device, so microcontroller units, which tell these sensors what to do, are getting more complex.
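The coordinating role a microcontroller plays over multiple sensors can be sketched in a few lines of Python. This is a toy illustration only: the sensor names and ranges are hypothetical, and the readings are simulated with random numbers rather than real hardware drivers.

```python
import random

class Sensor:
    """Hypothetical sensor returning a simulated reading within a fixed range."""
    def __init__(self, name, lo, hi):
        self.name, self.lo, self.hi = name, lo, hi

    def read(self):
        return random.uniform(self.lo, self.hi)

class Microcontroller:
    """Polls each attached sensor and bundles the readings into one message,
    the kind of fused payload a connected device would push to the cloud."""
    def __init__(self, sensors):
        self.sensors = sensors

    def sample(self):
        return {s.name: round(s.read(), 2) for s in self.sensors}

mcu = Microcontroller([
    Sensor("accelerometer_g", -2.0, 2.0),
    Sensor("temperature_c", 15.0, 45.0),
    Sensor("humidity_pct", 30.0, 70.0),
])
print(mcu.sample())  # one combined reading from all three sensors
```

As more sensor types are packed into one device, the `sample` step above grows into scheduling, calibration and power management, which is why the microcontroller units get more complex.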
There is no common interconnection standard for IoT. As a result, manufacturers who wish to connect their products are forced to use communications chips with several wireless technologies, including 3G, 4G, Wi-Fi, Bluetooth, and iBeacon. Low Power Wide Area (LPWA) technologies also play an important role in opening up new low-bandwidth IoT applications across all major verticals.
AI and machine learning
Data collection, processing, storage and analytics are only the first steps in making data usable and actionable, and IoT projects transformational. AI, or cognitive processing engines, offer natural language processing (NLP), machine learning (ML), and image and text analytics to enrich IoT apps. IoT data can be used to understand current conditions and trends, comprehend unstructured data from videos and images, and mine unstructured textual data for insights. Advanced AI capabilities also enhance the levels of automation offered to industrial IoT, increasing the flexibility of manufacturing processes and safeguarding against unwanted costs.
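A minimal sketch of the kind of analytics applied to IoT telemetry is anomaly detection: flagging sensor readings that deviate sharply from recent history, for example a temperature spike on a machine. The rolling z-score approach, window size and threshold below are illustrative assumptions, not any specific vendor's method.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=10, threshold=3.0):
    """Flag readings more than `threshold` standard deviations away
    from the rolling mean of the previous `window` readings."""
    history = deque(maxlen=window)
    anomalies = []
    for i, x in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(x - mu) / sigma > threshold:
                anomalies.append((i, x))
        history.append(x)
    return anomalies

# Steady temperature stream with one spike injected at index 15.
stream = [21.0 + 0.1 * (i % 3) for i in range(30)]
stream[15] = 35.0
print(detect_anomalies(stream))  # → [(15, 35.0)]
```

Production systems replace the rolling statistics with trained ML models, but the shape of the task, streaming data in, alerts out, is the same.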
While traditional relational databases will continue to hold their place at the heart of most business systems, they face competition in new IoT application domains. Many organizations use open source frameworks to process Big Data and a crop of high-performance time series databases are emerging.
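The appeal of a time-series database over a relational one is a workload of append-heavy, timestamp-ordered writes plus fast range queries over a time window. A toy in-memory sketch of that access pattern (not any particular product's API) is:

```python
import bisect

class TimeSeries:
    """Toy append-only time-series store: timestamps arrive in order,
    and range queries use binary search over the sorted timestamps."""
    def __init__(self):
        self.timestamps = []
        self.values = []

    def append(self, ts, value):
        if self.timestamps and ts < self.timestamps[-1]:
            raise ValueError("timestamps must be non-decreasing")
        self.timestamps.append(ts)
        self.values.append(value)

    def window(self, start, end):
        """Return (ts, value) pairs with start <= ts < end."""
        lo = bisect.bisect_left(self.timestamps, start)
        hi = bisect.bisect_left(self.timestamps, end)
        return list(zip(self.timestamps[lo:hi], self.values[lo:hi]))

series = TimeSeries()
for t in range(0, 60, 10):          # one reading every 10 seconds
    series.append(t, 20.0 + t / 100)
print(series.window(10, 40))        # → [(10, 20.1), (20, 20.2), (30, 20.3)]
```

Real time-series engines add compression, downsampling and retention policies on top of this basic ordered-append-plus-range-scan core.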
Much of the hard-core computer processing behind IoT apps is orchestrated in the cloud. However, latency and reliability constraints mean that not all time-critical computing functions can be done in the cloud; some have to be performed in onboard computers within devices. The anti-collision measures within Tesla’s Autopilot are one example. A wave of current technology cycles that depend on low latency (the delay before a data transfer begins following an instruction) and high reliability, such as artificial intelligence, autonomous vehicles and augmented reality, is pushing more raw computing power to the “edge of the network”, into endpoint devices like cars or iPhones. In effect, devices may come to contain micro data centres. The move to “edge computing” could have two big consequences: software companies could need to exert more control over the hardware they run on, and demand for high-end processors could rise.
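The edge-versus-cloud trade-off can be sketched as a simple placement rule: if a task's latency budget is smaller than the round-trip time to the cloud, it must run on or near the device. The latency figures and task names below are illustrative assumptions, not measurements.

```python
# Illustrative round-trip latencies in milliseconds (assumed, not measured).
CLOUD_ROUND_TRIP_MS = 80
EDGE_ROUND_TRIP_MS = 5

def place_task(name, latency_budget_ms):
    """Run in the cloud when the latency budget allows; otherwise fall back
    to edge or on-chip compute, as with time-critical anti-collision logic."""
    if latency_budget_ms >= CLOUD_ROUND_TRIP_MS:
        return (name, "cloud")
    if latency_budget_ms >= EDGE_ROUND_TRIP_MS:
        return (name, "edge")
    return (name, "on-chip")  # hard real-time: no network hop at all

tasks = [
    ("fleet analytics", 5000),      # overnight batch job: cloud is fine
    ("voice assistant reply", 200),
    ("collision avoidance", 10),    # must react faster than a cloud round trip
]
for task in tasks:
    print(place_task(*task))
```

Reliability pushes the same way: a car cannot depend on network coverage for braking, whatever the latency numbers say.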
Growth of data centres
The compounding demands that the Internet of Things, cloud computing, internet TV, gaming and artificial intelligence place on digital lives are creating a bottleneck in the data centre market, which the internet giants, telecom operators and large corporations are rushing to fill by building data centres across the world at a fast clip. These data centres take in, store, process and disseminate the explosive, indeed exponential, growth in internet data that forms the raw material and brains of their operations.
Internet and cloud infrastructure
IoT is becoming the next battleground in cloud computing. Various IoT-specific cloud services have been launched to enable fast and efficient data storage and processing in the cloud, mainly on infrastructure-as-a-service (IaaS), but also on platform-as-a-service (PaaS) solutions. Vendors are increasingly looking to verticalise these to attract industry-specific workloads. In the retail sector, for example, internet infrastructure solutions are emerging that provide dedicated connections to service providers’ data centres to ensure data protection compliance, and to provide solutions for workloads such as analysing customer behaviour, optimizing supply chains and monitoring the temperature of fridges in stores.
Security presents one of the most critical obstacles to IoT deployment. However, in providing security solutions, suppliers have had trouble going beyond their traditional domains. For example, operators’ IoT security offerings have mostly focused on device authentication and network reliability, yet breaches can occur at the device, network, app, storage and data levels. Some industry work is in progress to help vendors and operators come together on this.
For many years, the data centres that power the IoT have been moving away from hard disk drives (HDD) as their primary storage devices to solid state drives (SSD), because SSDs are silent, faster and more reliable than HDDs. More than half the storage in data centres globally is probably SSD now.
Artificial intelligence is becoming increasingly important within the IoT. AI comes in many forms, such as voice recognition, facial recognition, recommendation engines and gesture control, but the most important is machine learning, which refers to machines predicting outcomes by analysing large data sets to learn from successes and failures. There are two aspects to machine learning:
- training an artificial neural network with massive amounts of sample data, and
- making inferences about new data samples based on the trained network.
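The two steps listed above, training on labelled samples and then inferring on new ones, can be sketched with a single artificial neuron fitted by gradient descent. This toy stands in for the large neural networks trained on GPUs; the fault-detection framing, data and learning rate are illustrative assumptions.

```python
import math

def train(samples, labels, epochs=2000, lr=0.5):
    """Training: fit one logistic neuron (weights + bias) by gradient descent
    on labelled sample data."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))      # predicted probability
            err = p - y                          # gradient of the loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def infer(model, x):
    """Inference: apply the trained network to a new, unseen sample."""
    w, b = model
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) > 0.5 else 0

# Toy task: label a (vibration, temperature) reading as faulty (1) or not (0).
X = [[0.1, 0.2], [0.2, 0.1], [0.8, 0.9], [0.9, 0.8]]
y = [0, 0, 1, 1]
model = train(X, y)
print(infer(model, [0.85, 0.9]))  # → 1 (resembles the faulty cluster)
print(infer(model, [0.15, 0.1]))  # → 0
```

The split mirrors the hardware split described below: the expensive, throughput-hungry `train` step runs in data centres, while the cheap `infer` step can run close to the device.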
Most training occurs in large data centres, typically using graphics processing units (GPUs), which handle the high throughput of training better than any other type of chip, because their massively parallel architectures consist of thousands of smaller, more efficient cores designed to handle many tasks simultaneously.
Field programmable gate arrays (FPGAs), or programmable chips, tend to be better at inference because they can reconfigure the hardware as the algorithms evolve and also provide very low latencies.
What is the history of the Internet of Things?
Connected traffic lights emerged in 1985, though some academics see 2009 as the year when IoT was born since it was the year in which the number of connected devices exceeded the number of people in the world. Today, there are billions of connected devices. Three-quarters of these are either mobile phones or PCs. Tomorrow, many of them will be physical “things” such as cars, fridges, clothes, watches and heart monitors.
The story of the Internet of Things so far…
- 1995: Siemens funded the development of “M1”, a GSM-standard data module for machine-to-machine (M2M) industry applications.
- 1999: Procter & Gamble’s Kevin Ashton first coined the term, “Internet of Things”, referring to the link between radio-frequency identification (RFID) technology in P&G’s supply chain and the Internet.
- 2000: LG announces the world’s first internet-connected fridge.
- 2003: BigBelly Solar launched one of the world’s first connected “things”, a smart, solar-powered trashcan which could send notifications over the Internet when it was full.
- 2005: The United Nations publishes its first report on the IoT via its agency, the International Telecommunication Union.
- 2008: The IPSO alliance was founded to promote the use of IP in connected devices.
- 2009: The number of connected devices exceeds the number of people on earth.
- 2011: Nest Labs launched the Nest Learning Thermostat, which intelligently controls the temperature of a home, based on sensor algorithms, ML and cloud computing technologies.
- 2013: Qualcomm founded the AllSeen Alliance; a few months later Intel set up the Open Interconnect Consortium. Both standards bodies were set up to create rival communications protocols for IoT.
- 2014: Google acquires Nest Labs and makes Google Glass publicly available; Apple announces the Apple Watch and Apple HomeKit.
- 2016: GE announces its Predix IoT platform.
- 2017: Narrowband IoT (NB-IoT) and LoRa long-range, low-power wireless platforms begin to gain traction.
- 2018: The rollout of 5G begins, alongside national LPWAN initiatives.
- 2020: LPWAN in widespread use, especially in urban environments.
- 2025: Near ubiquitous connectivity for IoT devices lowers the cost of operation to below 1c/day.
This article was produced in association with GlobalData Thematic research.