Big data, and its associated computing and storage needs, has had a significant impact on the data centre industry. The data centre, or bit barn, is an organisation’s brain. It takes in, stores, analyses, and disseminates all the data the organisation needs. As the amount of data generated has exploded in recent years, so data centres have been forced to evolve and grow.
History of big data
Before 1980, companies used mainframe computers to store and analyse data. Two technologies, both emerging in the early 1980s, were critical to the first data centres as we think of them today. The first was the advent of personal computers (PCs), which proliferated as Microsoft's Windows operating system became the global standard. The second was Sun Microsystems' development of the Network File System (NFS) protocol, which enabled PC users to access files over a network. Thereafter, microcomputers began to fill former mainframe rooms as servers, and those rooms became known as data centres.
A number of critical milestones since then have shaped the industry's evolution, including the emergence of virtualisation software and the shift towards cloud computing. These and other major milestones in the journey of the big data theme are set out in the timeline below.
The big data story
How did this theme get here and where is it going?
1943 – The UK developed Colossus, the first programmable electronic data-processing machine, to decipher Nazi codes.
1945 – ENIAC, the first electronic general-purpose computer, was completed.
1954 – TRADIC, the first fully transistorised computer, was built using transistors and diodes in place of vacuum tubes.
1964 – The IBM System/360 family of mainframe computer systems was launched.
1971 – Intel’s 4004 became the first general-purpose programmable processor.
1973 – Xerox unveiled the Alto, the first desktop system to include a graphical user interface and internal memory storage.
1977 – Datapoint's ARCnet, the first commercial LAN, was installed at Chase Manhattan Bank, connecting up to 255 computers.
1981 – The PC era began with the launch of the IBM Personal Computer.
1983 – IBM released its first commercially available relational database, DB2.
1989 – Implementation of the Python programming language began.
1998 – Carlo Strozzi developed NoSQL, an open-source relational database that did not use SQL as its query language (distinct from today's non-relational NoSQL databases).
1999 – VMware began selling VMware Workstation, allowing users to set up virtual machines.
2002 – Amazon Web Services (AWS) launched as a free service.
2006 – AWS started offering web-based computing infrastructure services, now known as cloud computing.
2007 – Apple launched the first iPhone, creating the mobile internet as we know it today.
2010 – The first solutions for 100 Gigabit Ethernet were introduced.
2011 – Facebook launched the Open Compute Project to share specifications for energy-efficient data centres.
2013 – Docker introduced its open-source OS-level container software.
2015 – Google and Microsoft led massive build-outs of data centres.
2017 – Huawei and Tencent joined Alibaba in major data centre build-outs in China.
2018 – Leading data centre operators started the migration to 400G data speeds.
2018 – Silicon photonics technology started to positively impact data centre networking architectures.
2020 – Edge computing will revise the role of the cloud in key sectors of the economy.
2021 – Data centre speeds expected to exceed 1,000G.
2025 – Data centres will be increasingly on-device.
This is an edited extract from the Big data – Thematic Research report produced by GlobalData Thematic Research.