Big data has emerged in response to the exponential growth in data, which is itself the result of a combination of technology trends. These include (but are not limited to) the ubiquity of mobile devices, the widespread use of social media, and the rise of the Internet of Things (IoT).
Leading big data technology trends
Listed below are the leading big data technology trends, as identified by GlobalData.
Edge computing
Edge computing moves more data processing to the edge of the network, nearer to the data source, keeping processing and analytics close to the points of collection. Its growth is therefore closely associated with the IoT: the proliferation of enterprise IoT initiatives and consumer IoT offerings (such as automated home devices) will drive demand for edge computing solutions. The deployment of 5G cellular technologies will be a major stimulus for both the IoT and edge computing.
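As an illustration of the idea, the sketch below shows an edge node reducing a batch of raw sensor readings to a compact summary before anything is sent upstream. The function name, fields, and alert threshold are hypothetical, not taken from any specific product.

```python
# Hypothetical sketch: an edge node aggregates raw sensor readings locally
# and forwards only a compact summary upstream, cutting backhaul traffic.
from statistics import mean

def summarise_readings(readings, threshold=75.0):
    """Reduce a batch of raw readings to a small summary payload."""
    alerts = [r for r in readings if r > threshold]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "alerts": len(alerts),
    }

# One minute of raw temperature samples stays at the edge;
# only this summary payload would travel to the central cloud.
batch = [71.2, 73.5, 76.1, 74.8, 72.0, 78.3]
payload = summarise_readings(batch)
```

Six raw samples collapse into a four-field summary; at IoT scale, that reduction is the core economic argument for processing at the edge.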
Quantum computing
The race to reach quantum supremacy – the point at which a quantum computer can carry out calculations faster than a classical computer ever could – is well underway, with Google, IBM, and Microsoft leading the pack. IBM unveiled the first quantum computer designed for commercial use, the Q System One, in March 2019. AI, and particularly machine learning, will benefit: quantum computers will complete extremely complex calculations involving large data sets in a fraction of the time. For effort-intensive AI chores such as classification, regression, and clustering, quantum computing opens up a new realm of performance and scale.
GPUs and FPGAs
Central processing units (CPUs) have powered data centres for decades, but new workloads stemming from technologies such as AI and the IoT are pushing CPU architectures to their limits. Graphics processing units (GPUs), once used primarily for gaming, can process many threads in parallel, making them ideal for training large predictive data models. As the key criterion for data centres shifts from calculation speed to search speed, GPUs are moving into data centres. However, while GPUs are ideally suited to training neural networks, field programmable gate arrays (FPGAs) show signs of being better at executing them.
Data centre interconnect (DCI)
As more data centres come online around the world, the need to transfer data between them at ever higher speeds also grows. As a result, the DCI market faces huge demand for ever-faster optical links and transceivers. These transfer speeds are especially important given that 70% of all data centre traffic is east-west traffic (traffic between servers inside the data centre), so link speed has a marked effect on the overall speed of the data centre. The primary use case of DCI remains connecting data centres, but it has recently expanded to include capacity boosting.
Silicon photonics
Silicon photonics is an emerging technology that combines laser and silicon technology on the same chip, allowing data to be transferred between computer chips using light. This supports faster interconnects between data centres. Photonic chip technology is still at an early stage of development. Cisco (which bolstered its capabilities in silicon photonics with the acquisitions of Luxtera and Acacia Communications in 2019), Intel, and Inphi are prominent vendors in this market.
Serverless computing
Companies that have outsourced their servers to cloud infrastructure as a service (IaaS) providers such as Amazon Web Services (AWS) typically pay in advance for the server capacity they require. However, IaaS is fast being displaced by serverless computing, in which the cloud provider dynamically manages the allocation of code execution resources. Pricing is based on the actual resources consumed by an application, rather than on pre-purchased units of capacity. Serverless architectures let developers outsource the hardware and focus on writing value-adding code. AWS Lambda is one of the leading serverless computing offerings; its charges are based on the compute time consumed.
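The model can be pictured with a minimal handler in the AWS Lambda Python style: the provider invokes the function on each request and meters only the time it runs, so no capacity is paid for between invocations. The event shape and field names below are illustrative, not a definitive Lambda integration.

```python
# Minimal sketch of a serverless function in the AWS Lambda Python style.
# The handler receives an event and returns a response; the provider
# allocates compute only while the handler runs and bills for that time.
import json

def handler(event, context=None):
    """Sum the numbers in the request body and return an HTTP-style response."""
    body = json.loads(event.get("body", "{}"))
    total = sum(body.get("values", []))
    return {"statusCode": 200, "body": json.dumps({"total": total})}

# Locally, the same handler can be exercised with a synthetic event:
response = handler({"body": json.dumps({"values": [1, 2, 3]})})
```

The key design point is that the function owns no server: scaling, patching, and idle capacity are the provider's problem, which is what makes the per-execution pricing model possible.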
High performance computing (HPC)
HPC is one of the fastest-growing segments of the computing hardware market. China dominates this segment, which in turn supports its AI, space, defence, industrial design, gaming, and genomics industries. The unveiling of the Sunway TaihuLight system in 2016 was a genuine breakthrough: not only did it outperform the world’s previous fastest supercomputer, but it did so using Chinese-made processors and interconnect technologies. Meanwhile, Huawei, Lenovo, and Inspur are competing aggressively against IBM and HPE in the enterprise and cloud transaction processing space.
3D sensing
A handy by-product of optical interconnect technology is 3D sensing, which will see phenomenal growth on the back of themes such as augmented reality (AR), AI, and autonomous vehicles. In some AR systems, advanced 3D-sensing cameras use vertical-cavity surface-emitting laser (VCSEL) systems for tracking objects and sensing depth. These laser-based sensors, together with optical filters, also offer more accurate autofocus for cameras. Leading suppliers of VCSEL laser systems could therefore get a significant boost over the next year or so. Many of these suppliers, such as Finisar and Lumentum, come from the optical networking equipment sector; other players include Infineon, AMS, STMicroelectronics, Largan, and LG Innotek.
Software defined networking (SDN)
SDN is an architecture for data networks that allows software, rather than hardware, to control the network path along which data packets flow. In effect, SDN moves the intelligence currently held in a network equipment box into a software layer, enabling the network to be centrally controlled and programmed. It is hugely disruptive because it fundamentally changes who controls the data centre. More than two-thirds of data centres will have adopted SDN fully or partially by 2021, according to Cisco’s 2018 Global Cloud Index report.
Containers
On the open source software front, the most important development is the arrival of operating system container software. Containers package an application together with its dependencies and can run on most Linux or Windows servers, making applications easy to move between different IT infrastructures. The key benefits of container technology include significant cost savings, reduced time to deployment, better scalability, and the flexibility to port to other infrastructures. Leaders in container management software include Docker’s Swarm, Google’s open-sourced Kubernetes, Red Hat’s OpenShift, and Amazon’s Blox.
Real-time analytics
The combination of streaming data and analytics has the potential to generate significant value for companies. A 2018 study by Harvard Business Review Analytic Services, in collaboration with SAS, Intel, and Accenture, found that 70% of enterprises had increased spending on real-time customer analytics solutions. Amazon, Microsoft, and Google are all pushing their real-time analytics offerings through the cloud to industries such as telecoms, transportation, and healthcare.
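What distinguishes streaming analytics from batch analytics is that metrics are updated incrementally as each event arrives, rather than recomputed over a stored data set. The vendor-neutral toy example below (not any particular product’s API) shows a sliding-window average maintained one event at a time.

```python
# Illustrative sketch of the streaming pattern: a metric is updated per
# event over a bounded window, so memory use stays constant however long
# the stream runs.
from collections import deque

class SlidingAverage:
    """Running average over the most recent `size` events."""
    def __init__(self, size=3):
        self.window = deque(maxlen=size)  # old events fall off automatically

    def update(self, value):
        """Ingest one event and return the current windowed average."""
        self.window.append(value)
        return sum(self.window) / len(self.window)

avg = SlidingAverage(size=3)
latest = [avg.update(v) for v in [10, 20, 30, 40]]  # 10.0, 15.0, 20.0, 30.0
```

Production streaming platforms apply the same idea at scale, partitioning the stream and keeping windowed state per key so results are available in near real time.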
5G
The full-scale mainstream adoption of 5G has the potential to increase data consumption globally. 5G is expected to deliver faster speeds and connect around one million devices per square kilometre. GlobalData estimates that, by 2024, more than a quarter of all data traffic will be carried over 5G, up from less than 1% in 2019. 5G will also benefit applications that rely on real-time data analytics, such as autonomous vehicles.
This is an edited extract from the Big data – Thematic Research report produced by GlobalData Thematic Research.