
A Brief History of Data Centers

In a world of countless digital interactions, it’s easy to take seamless connectivity between people, places, and things for granted. Yet it’s the data center that makes our modern world possible, enabling all the connectivity, processing, and storage we depend on every day.

Of course, data centers haven’t always been the sleek, efficient facilities we know and love. With that in mind, let’s take a look at where data centers began, how they’ve evolved, and where they’re headed next.

The dawn of data

In the 1950s and 60s, data centers were a far cry from their modern descendants. In fact, they weren’t called data centers at all; computing revolved around the mainframe.

The CDC 6600, from Control Data Corporation, is often remembered as the first supercomputer, boasting a then-mighty 10MHz clock speed. Astronomically expensive and custom-built for specific business uses, these ‘Big Iron’ computers were scarce, fickle, and labor-intensive; keeping them operational for even days at a time was something of an achievement.

With no network connectivity, these early mainframes were islands of computing power in a pen-and-paper world. Here’s how Pitt Turner, Executive Director of the Uptime Institute, recalls the mainframe operation of a large regional bank: “In the evening, all these trucks would arrive…carrying paper. Through the night that paper would be processed, the data would be crunched, new printouts would be created, and then they would send the documents back out to the branch banks so they could open in the morning.”

Throughout the 1970s and 80s, Moore’s Law thundered on: computing power climbed ever higher and desktop computers became a common sight. Yet mainframe evolution during this period was driven less by processing power and efficiency than by reliability. Data integrity steadily improved, but computing power remained costly to manage, leading many organizations to outsource their requirements rather than maintain in-house ‘machine rooms’.

Are you being served?

In the 1990s, everything changed. A perfect storm washed away traditional mainframes as the world witnessed the microprocessor boom, the birth of the Internet, and the rise of client-server computing.

Suddenly, IT became nimble: delays and bureaucracy gave way to the ability to provision and install business applications rapidly on relatively inexpensive hardware. Old mainframe rooms filled up with microprocessor-based machines acting as servers, laying the foundation for the first in-house data centers. Gradually, this infrastructure became standardized in both design and operation, and the modular racks we know today were born.

Things were changing outside the enterprise as well. As a permanent presence on the Internet became essential, network connectivity and colocation services were suddenly business-critical. Internet providers and hosting companies began building large external facilities to deliver these services, igniting a feeding frenzy of data center adoption.

Boom and bust

As the Internet matured in the early 2000s, data centers took center stage. IT investment skyrocketed and new facilities shot up around the world as everyone looked to cash in on the dotcom boom.

When the bubble finally burst, the data center industry was devastated: 17 of the 27 pan-European players went out of business. However, the downturn also kick-started a quieter revolution: virtualization.

In these lean times, hardware utilization, power, cooling, and cost-efficiency were the order of the day. By consolidating many underused physical servers onto far fewer well-utilized hosts, virtualization allowed organizations to shrink their data center footprint and lower both CapEx and OpEx. Its impact was dramatic, ultimately reducing data center power, space, and cooling requirements by around 80%.
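
To see how consolidation arithmetic can reach a figure like 80%, here is a minimal back-of-the-envelope sketch in Python. The server counts and utilization figures are illustrative assumptions, not data from this article:

    import math

    # Hypothetical pre-virtualization estate: many standalone servers,
    # each mostly idle (assumed figures, for illustration only).
    physical_servers = 100
    avg_utilization = 0.15      # assumed pre-consolidation utilization
    target_utilization = 0.75   # assumed safe target per virtualized host

    # Total useful work, in "fully utilized server" equivalents.
    useful_work = physical_servers * avg_utilization

    # Virtualized hosts needed to carry that work at the target utilization.
    hosts_needed = math.ceil(useful_work / target_utilization)

    reduction = 1 - hosts_needed / physical_servers
    print(f"{physical_servers} servers -> {hosts_needed} hosts "
          f"({reduction:.0%} fewer machines to power, house, and cool)")

With these assumed figures, 100 servers collapse onto 20 hosts, an 80% reduction, which is broadly how real-world consolidation produced savings of the scale described above.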

With the 2008 financial crisis, the drive to reduce IT spending, outsource requirements, and harness economies of scale tightened its grip. The colocation market saw runaway success that continues to this day.

Looking to the future

Thriving trends like cloud computing, the Internet of Things, and the emerging fields of cyber-physical systems and artificial intelligence will continue to put the data center at the heart of the digital economy.

To meet stringent performance, reliability, and security demands, organizations are increasingly abandoning on-premises data center strategies in favor of colocation. Today’s colocation facilities harness all the connectivity, sustainability, efficiency, resilience, and expertise that have been so hard-won over the last half century. It’s no surprise, then, that business is booming: according to Research and Markets, the colocation industry is accelerating towards a total value of US$55.31 billion by the end of 2021.

Of course, further change is inevitable. No one knows what the future holds, but state-of-the-art colocation facilities offer organizations the best chance to be ready for it.
