Data centers are where the world’s data is processed, organized and stored.

Many data centers are located in buildings similar to warehouses. They are functional buildings that require little physical access - though some are more aesthetically pleasing than others.

Because they tend to be large, data centers are often built in vast open spaces where land is cheap, such as the US state of Nevada. But, as we will explore further below - and in other articles - factors like proximity to other operators and network providers, access to cheap energy, and government subsidies can also play a part.

Any size you like

Strictly speaking, a data center can be as small as a cupboard in an office unit, but typically they are larger, containing thousands of servers (computers) connected to one another - and often, but not always, to the Internet - via networks.

One of the world’s largest data center campuses, The Citadel, located in the Tahoe Reno Industrial Center, the world's largest business park, in the US state of Nevada, is owned by a company called Switch.

At full build-out, The Citadel will span almost eight million square feet (just over 740,000 square meters) - though the scale and value of a data center is more often measured by its power capacity, as electrical consumption determines how much IT equipment a site can accommodate. In The Citadel's case, that's 650MW - at least once it is fully developed. Should that happen, the site will be the world's largest data center - a title that currently belongs to Facebook/Meta.
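To see why capacity in megawatts is the industry's preferred yardstick, a rough back-of-the-envelope sketch helps. The per-rack draw and the fraction of site power that reaches IT equipment below are illustrative assumptions, not figures from this article:

```python
# Back-of-the-envelope sketch: how much IT equipment a given power
# capacity implies. Per-rack draw and IT-load fraction are assumptions.

SITE_CAPACITY_MW = 650    # The Citadel at full build-out
AVG_KW_PER_RACK = 10      # assumed average draw per rack
IT_LOAD_FRACTION = 0.7    # assume ~70% of site power reaches IT gear
                          # (the rest goes to cooling, lighting, losses)

it_load_kw = SITE_CAPACITY_MW * 1000 * IT_LOAD_FRACTION
racks = int(it_load_kw / AVG_KW_PER_RACK)
print(f"~{racks:,} racks at {AVG_KW_PER_RACK} kW each")
```

Under these assumptions, 650MW translates to tens of thousands of racks - which is why operators quote megawatts rather than floor space.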

The backbone of the Internet

Governments and banks use data centers to store and exchange data securely; research organizations use them to crunch, analyze and interpret huge amounts of information; gaming companies rely on data centers to render graphics and allow users to play online around the globe.

The military relies on data centers to power weapons systems; current advances in artificial intelligence are possible because the infrastructure is there to process the masses of data required to map out and resolve complex problems.

Whenever you make a web query (type the name of a website into a browser, use a search engine, click a link, scan a QR code), you are interacting, either directly or indirectly, with a data center.

The cryptocurrency sector relies on the same digital infrastructure to carry out mining. Though the facilities involved are often called server farms, or crypto farms, they are essentially data centers equipped with high-performance computers.

Provided one has access to the networks they are connected to, information stored in data centers can be accessed from anywhere.

How close you are to any given data center, and which networks it is connected to, affects the time it takes for information to travel between different nodes of the network - a delay known as latency.
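Distance sets a hard floor on latency, because signals in optical fiber travel at roughly two-thirds the speed of light in a vacuum. A minimal sketch of that lower bound (real-world latency is higher, due to routing, queuing, and indirect cable paths):

```python
# Best-case round-trip time imposed by physics: light in fiber moves at
# roughly two-thirds of c. Actual network latency is always higher.

SPEED_OF_LIGHT_KM_S = 299_792
FIBER_FACTOR = 2 / 3

def min_rtt_ms(distance_km: float) -> float:
    """Theoretical minimum round trip, in ms, for a one-way distance."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

# e.g. London to New York, roughly 5,570 km apart:
print(f"{min_rtt_ms(5570):.1f} ms minimum round trip")
```

This is why latency-sensitive applications favor data centers close to their users: no amount of engineering can beat the speed of light in glass.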

Inside a data center


The server room

Arguably the most important room in the building, the server room is where all of the computers are kept - in large, vertical racks, connected to the Internet via Ethernet connections.

A standard rack enclosure is 19 inches wide and normally 42 rack units tall - a rack unit, shortened to U, describes the height of a piece of equipment, with most servers occupying 1U or 2U. As well as servers, racks contain storage systems, networking equipment such as switches and routers, cable management accessories, power distribution units (PDUs) and airflow management systems.
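Rack space is budgeted in those U increments. A small sketch of how a 42U rack might be filled - the equipment mix and sizes here are illustrative assumptions, not a standard layout:

```python
# Sketch of rack-unit budgeting in a standard 42U rack.
# The equipment mix below is an illustrative assumption.

RACK_HEIGHT_U = 42

equipment = {
    "2U servers": (17, 2),          # (count, U each)
    "1U servers": (2, 1),
    "top-of-rack switch": (1, 1),
    "PDU": (1, 1),
    "cable management": (1, 2),
}

used_u = sum(count * size for count, size in equipment.values())
print(f"Used: {used_u}U of {RACK_HEIGHT_U}U, "
      f"{RACK_HEIGHT_U - used_u}U free")
```

In practice, power and cooling limits often fill up before the physical U budget does.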


Cooling

Just like home computers, servers generate a lot of heat, which can become problematic in rooms full of them. Data center operators therefore try to keep server rooms as cool as possible: using computer room air conditioning (CRAC) units; laying out racks in hot and cold aisle configurations, with cold air intakes facing one way and hot air exhausts the other; and installing in-rack cooling units, which rely on air circulation or liquid immersion technologies.

Maintaining particular humidity levels and cleanliness within the racks is important too, as an accumulation of dust and dirt can affect equipment reliability, potentially causing short circuits.


Power

Most data centers are powered by the grid, with energy fed through power distribution units (PDUs) to keep the supply to the equipment stable. Any loss of power can result in downtime - otherwise known as an outage - which in turn risks hardware damage, data loss and disruption to whatever operation the equipment is performing.

For the same reason, data centers are equipped with backup power systems such as generators and uninterruptible power supplies (UPS).
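The UPS typically only needs to carry the load for the minutes it takes generators to start. A minimal sketch of that bridging calculation - the battery size, load, and efficiency figures are assumptions for illustration:

```python
# Illustrative sketch: how long a UPS battery bank can carry an IT load
# while backup generators spin up. All figures are assumptions.

def ups_runtime_minutes(battery_kwh: float, load_kw: float,
                        efficiency: float = 0.9) -> float:
    """Minutes of runtime for a battery bank feeding a constant load."""
    return battery_kwh * efficiency / load_kw * 60

# A hypothetical 500 kWh battery bank carrying a 1 MW load:
print(f"{ups_runtime_minutes(500, 1000):.0f} minutes of bridge time")
```

Generators usually reach full output within seconds to a couple of minutes, so a bridge of this length leaves a comfortable margin.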

As the cost of power - financial and environmental - continues to rise, companies are increasingly seeking more energy-efficient and less carbon-intensive ways of powering their data centers, such as solar or wind farms, or finding other efficiencies, like using outside air to cool IT equipment or investing in less power-hungry hardware.

Backup and recovery

Businesses typically use multiple data centers, with a backup location, known as a 'disaster recovery' site, storing a copy of their data on separate servers to avoid losing information in the case of, say, a power cut or a fire at the primary site.

These sites are at a different location, far enough away to avoid being struck by the same disaster at the same time, but close enough to be physically connected to the primary site.


Security

Access to data centers is often limited by tight security measures, as they contain expensive equipment and sometimes sensitive or proprietary data.

Physical security might mean surveillance cameras, guards, and passcode or even biometric access only. Virtual security, to protect against cyberattacks or other attempts to hack into or jeopardize the data contained within a facility, is software-based.

A distributed denial-of-service (DDoS) attack is a common assault on data center systems, whereby an attacker floods them with traffic in order to overwhelm them and bring a targeted website or service offline.

The seriousness of an attack depends on whose systems are targeted: in 2016, a major DDoS attack on DNS provider Dyn's servers caused major disruption to the widely used Amazon Web Services (AWS), Spotify, and Twitter.

Such attacks can be state-sponsored, too, underlining the need for robust mitigation systems for the sake of national security. Before Russia invaded Ukraine in February 2022, the websites of the country's largest commercial bank and its defense ministry were both hit by DDoS attacks - though neither was officially attributed to any particular group or nation.

Fire suppression

To prevent serious data loss, data centers are equipped with fire suppression systems within and outside of server rooms. Conventional foam- or water-based fire extinguishers conduct electricity, so operators favor gas suppression systems, which cause minimal damage to IT equipment.


Different types of data centers


On-premise

This is when companies decide to store, manage and protect their own IT equipment.

Counterintuitively, this doesn’t necessarily mean the data center is located, say, at a company’s headquarters or in its offices, but that the company owns the equipment.


Colocation

Some companies choose to place their IT systems in facilities shared with other businesses, outsourcing the day-to-day running of the site, like maintenance and security, to a third party.

The hardware is their own, but power, cooling, and connectivity to network service providers are usually provided by the colocation operator.


Wholesale colocation

A form of colocation where the provider leases large blocks, or the entirety, of a facility to a single customer. There is no industry standard for when a colocation facility becomes 'wholesale.'


Cloud

Cloud data centers differ from colocation sites in that customer data is stored on systems belonging to the cloud provider, and the information is managed virtually rather than each server being allocated to a particular client.

For businesses wishing to store, process and back up their data with minimal input, often only paying for what they use, cloud computing is often the preferred option.


Hyperscale

Hyperscale data centers are owned or leased by companies with massive data processing requirements, either because they oversee huge chunks of the Internet themselves, or because they manage technology systems for third parties. Think Facebook/Meta, Google, Apple, Amazon, IBM, and Microsoft, to name a few.

In hyperscale facilities, the entire building is typically given over to monolithic applications under one company's control - say Google's search, Meta's social media, Amazon's AWS cloud. This makes for major economies of scale, and allows hyperscale operators to push the technology boundaries and experiment with new approaches.


Edge

As connectivity has become a staple of modern living, from smartphone use to the instant data transmission required to power driverless cars or operate city-wide facial recognition systems, some companies are seeking to bring processing power closer to where it is needed - to 'the network edge,' as it is known - reducing the time it takes for data to travel to a facility and back, which, again, is known as latency.


The history of data centers

The precursors of data centers were the first large computers, starting with developments in the UK and the U.S. Army Ballistic Research Laboratory’s Electronic Numerical Integrator And Computer (ENIAC), built in the 1940s to calculate artillery firing tables.

Information processing, storage and encryption were seen as important in the development of defense systems during the Cold War, and were instrumental in the eventual creation of the commercial mainframe.

In the 1950s, transistors replaced vacuum tubes in electronics, culminating in the creation of the first integrated circuit, and computers shrank in size while growing rapidly in number. At first, mainframes held and processed data locally, without the ability to share it across a network as they do today. The equipment was also bulky and expensive, so for those companies that required their data be stored digitally - with many preferring to keep paper records - running their own systems was not viable, and mainframes were shared.

But as Moore’s Law accurately predicted, the number of transistors that could be fitted on an integrated circuit doubled roughly every two years, driving down the cost of processing power. The advent of microprocessors in the 1980s and 1990s made computing (relatively) cheap, and more companies began purchasing their own hardware.
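The compounding effect of that doubling is easy to underestimate. A quick sketch - the starting transistor count and timescale are illustrative assumptions, not historical data points:

```python
# Moore's observation: transistor counts on a chip roughly double every
# two years. Starting count and span below are illustrative only.

def transistors(start_count: int, years: int,
                doubling_years: float = 2) -> int:
    """Projected transistor count after exponential doubling."""
    return int(start_count * 2 ** (years / doubling_years))

# A hypothetical chip with 29,000 transistors, projected 20 years out:
print(f"{transistors(29_000, 20):,} transistors")
```

Ten doublings in twenty years means a thousandfold increase - the kind of growth that turned room-sized shared mainframes into commodity hardware.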

Solutions like colocation, cloud, and hybrid (a mix of on-premise and cloud computing) came later, as the necessity to scale systems grew, and companies specializing in cloud provision increased in both proficiency and capacity.

Networks enabled computers to be connected across great distances - and paradoxically, this led to the development of clusters located close to each other.

Computers began to cluster, and data centers emerged. In the 1960s, the first interconnected packet-switching network, Arpanet, was created at the Advanced Research Projects Agency (ARPA), a US government agency based in Arlington, Virginia. From Arpanet, the Internet evolved, and one of the first Internet exchanges, MAE-East, was created in the early 1990s, also in Northern Virginia.

As service providers emerged, they benefited from being located close to exchanges. AOL and others set up in Ashburn, Loudoun County, which became known as "Data Center Alley" as others piled in. Equinix, one of the first large colocation providers, built large facilities there, and others moved in, encouraged by low taxes and cheap land.

Legal and geopolitical factors

Beyond the commercial forces that decide where businesses operate in the data center space, where data is stored and processed has legal - and sometimes political - implications, which play a major role for the industry. In the EU, for example, it is illegal to store data belonging to the Union’s citizens outside of its borders if doing so would breach the EU's General Data Protection Regulation.

Elsewhere, as in the US, the law states that the government can demand that companies hand over their data (or indeed, their customers’ data) should it be deemed relevant to an investigation.

Who owns and operates data centers in any given country can be a source of contention, too: for instance, the People’s Republic of China only allows foreign companies to build and operate data centers there if they partner with a Chinese company.

Because of a change in data residency laws stipulating that iCloud services must be delivered domestically, Apple, which out of security concerns had long resisted setting up camp in China, recently invested $1bn in building a data center campus there in partnership with local service provider Guizhou Cloud.

How much is the data center industry worth?

The data center industry is an important part of the world economy, because it generates direct and indirect employment, and is the basis for the technology services industry. The data center real estate market has grown steadily for more than two decades, and was bolstered rather than hindered by the Covid-19 pandemic, when most companies ramped up their digital commerce capabilities and relied heavily on online conferencing systems.

According to technological research and consultancy firm Gartner, global spending on data center systems will hit $226 billion in 2022, while demand for data center capacity will reach an all-time high.

Data centers are seen as a benefit to local (and sometimes national) economies, not because they pay a lot of tax - in fact, some US states seek to attract operators with generous tax cuts - or because they directly employ a lot of people (although they can represent fairly lucrative contracts during construction), but because they attract investment to the area.

The data center real estate investment market is huge and, during the pandemic, as the value of once-lucrative office space fell, it grew disproportionately compared with other types of real estate.

The buying and selling of data centers can reach astronomical sums: last year, listed real estate investment trust (REIT) American Tower bought up CoreSite’s portfolio of 25 US data centers for $10.1 billion.

Where are the data centers?

The US has more data centers than any other country by a long way, concentrated in areas such as New York, northern Virginia, and Atlanta on the East Coast; San Francisco and Los Angeles on the West Coast; and Chicago in the Midwest. Real estate firm CBRE reckons there is around 3,350MW of capacity in use in primary markets (with more in lesser hubs), and another 730MW under construction.

In Europe, the Middle East and Africa (EMEA), data centers tend to be concentrated where the Internet exchanges are, the biggest markets being Frankfurt, London, Amsterdam, and Paris (known as the FLAPs), with total supply predicted to reach 2,200MW in 2022. Within this region, Africa is the smallest part, but is expanding rapidly.

In the Asia-Pacific (APAC) region, China, Australia and Japan lead the way in the number of data centers, and the market is thriving: according to commercial real estate company CBRE, direct investment in data centers in the region totaled $4.8 billion in 2021, a 100 percent increase on the previous year. CBRE reckons there is around 2,000MW online in APAC, with some markets set to nearly double in the next couple of years.

The data center market in Latin America (LATAM) is considerably smaller, but growing. Analysts are less likely to break out specific figures for the region, but estimates put the current size at around 200MW, growing at seven percent per year.
