Multi-story buildings are nothing new. People have lived in tower blocks since the days of the Roman Empire, but we have been reluctant to house our servers there - until now.

More than 2000 years ago, apartment buildings or “insulae” in the Roman Empire reached seven stories high, and may have gone as high as ten - although they were unstable, and rich residents chose the lower floors rather than the penthouse suites.

Steel construction allowed buildings to break the ten-story limit in the 1880s, and cities round the world have not looked back: building upwards gets more people and businesses into the productive hubs of humanity.

Despite this, most data center facilities have resolutely remained at street level: the majority are single-story buildings.

There are several obvious reasons for this. These facilities rely on heavy duty electrical and air-conditioning equipment; lifting that up to a higher floor is an effort to be avoided, if possible. Similarly, it is easier to lay on supplies of water, diesel fuel and electricity at ground level.

– Chris Perrins/DCD

People live higher up because they need or want to be closer together, and because land is expensive in the city. Many data centers have the option to live out of town, thanks to the fast communications networks they use. Like a teleworking executive, they can choose to live in comparative luxury.

But that is changing. Some customers require low latency for applications used by city workers, so their servers have to be in town. There’s a substantial history of multi-story data centers in locations like London, New York and Paris. And on a small urban island like Singapore, there will soon literally be no option but to build upwards.

And even out-of-town network hubs are starting to get overcrowded. Loudoun County in Northern Virginia became a huge nexus for data centers because massive network bandwidth ran through a region known for sprawling, sparsely populated farmland.

New York

Skyscrapers have been a part of New York’s skyline for more than 100 years, and if you are putting a facility in Manhattan, you won’t have the luxury of a ground floor location. You are also unlikely to have a new build.

Intergate Manhattan, at 375 Pearl Street, has been repurposed multiple times. The 32-story building, which casts its shadow on Brooklyn Bridge, was originally owned by New York Telephone, followed by Bell Atlantic and then Verizon. Sabey Data Centers bought it in 2011, but it’s still frequently referred to as the “Verizon building,” because Verizon still occupies three floors in the facility and pays for naming rights to keep its signage at the top of the tower.

Each floor is 14 to 23 feet high - unusually tall compared to most high-rises. Had it been a residential block, Intergate would have been around fifty stories tall.

When Sabey took over the facility, it planned to develop 1.1 million square feet of data center space, but according to Sabey vice president Dan Melzer, “the market just wasn’t ready for that.”

Instead, the company fitted out 450,000 square feet of technical space, and the remainder was repurposed as offices. This was only possible in the upper half of the building, however: the lower half, built with telephone equipment in mind, has no natural light.

While some tenants get the full service, including power, cooling and managed racks, one customer in particular - which, incidentally, operates the New York subway wireless system - only leases power and water from the company and manages its own facilities.

Underneath the data center floors are the chillers and generators needed to run them – including an 18MW substation to support one of the company’s so-called turnkey customers.

The building has a massive freight lift to carry heavy equipment up and down the tower, and its data center features extend underground, beneath the reception and loading dock floor, to a tank holding enough fuel to run the servers for 72 hours in an emergency. “We have the capacity for 180,000 gallons of fuel, though we don’t need that much right now,” Melzer said; there is also room for 270,000 gallons of water. By law, “since 9/11, you know,” all fuel in the city’s buildings must be stored on the lowest floor.
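As a rough sanity check on that 72-hour claim - a sketch only, assuming a generic diesel genset burn rate of roughly 0.07 gallons per kWh and a hypothetical 30MW critical load, neither of which is a Sabey figure - the arithmetic works out like this:

```python
# Rough runtime estimate for an on-site diesel tank (illustrative only;
# the ~0.07 gal/kWh burn rate is a generic genset rule of thumb, and the
# 30 MW load is a made-up example, not a figure from Sabey).

tank_gallons = 180_000       # maximum fuel capacity cited for the building
load_mw = 30.0               # hypothetical critical load on the generators
burn_gal_per_mwh = 70.0      # ~0.07 gal/kWh at full load (assumed)

runtime_hours = tank_gallons / (load_mw * burn_gal_per_mwh)
print(f"Estimated runtime: {runtime_hours:.0f} hours")  # ~86 hours at 30 MW
```

At a somewhat higher load, the same tank comes out close to the 72 hours quoted.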

The company shares its fuel resources with its trusted neighbor – the New York Police Department (NYPD), whose headquarters are adjacent to the tower. On the very top of the building are the company’s cooling towers and cooling plant, and an antenna for wireless carriers.

42nd Street, New York – Getty Images

Less than a mile away, 60 Hudson Street is one of downtown Manhattan’s architectural marvels, again repurposed multiple times since its days as the headquarters of telegraph company Western Union. The company relied on it being near the AT&T building – a similar-looking monolith of a tower - for its communication lines. Built in what was a residential neighborhood in the 1920s, the building’s wide, square plinth of a structure was engineered to support heavy equipment, and its floors are linked by pneumatic communication tubes. “The building sort of recycled itself for today’s age,” explained Dan Cohen, senior sales engineer for current resident Digital Realty. The floors hold heavy racks with ease, and “a lot of those pneumatic tubes are now being used for fiber.”

Colocation within 60 Hudson dates back to Telx, a company which started out offering connections for telephone “calling cards” in the 1990s. Telx began leasing space in the building in 1997, to offer a neutral interconnection facility, and expanded into other suites, taking over Datagryd in 2013.

Inside 60 Hudson, Telx “basically invented the concept of a meet-me-area,” Cohen said.

“Instead of having various types of type 2 circuits, we thought: ‘Why don’t we create a meeting room where carriers can meet customers?’” And the rest is history.

Digital Realty, one of the world’s most successful colocation providers, bought Telx in 2015, giving it control of the fifth, ninth, eleventh and twenty-third floors; the walls are owned by a real estate firm. The colocation provider’s scope for expansion isn’t clear. “I would hope,” Cohen said, but “they don’t really tell us much.”

60 Hudson houses 13,000 cross-connects – out of an estimated 25,000 to 30,000 in New York. Most nearby submarine cable traffic passes through the building and, across multiple North American carrier hotels, Digital Realty processes 70 percent of all Internet traffic.

When DCD visited the building, the ninth floor alone held four Digital meet-me-rooms; it is visibly jam-packed.

Though for the most part everything has moved onto fiber, this is a place of relics: “DS1 circuits, DS3, T3, T1 circuits, all still active.”

Digital Realty repurposed the fifth floor in 2014 - after passing on it several years prior, out of concern that there wouldn’t be sufficient demand to fill it - “then it became more expensive later on and we kicked ourselves… But it was still worth it.”

Though it appears empty, the floor is largely leased already, as customers tend to pay for power capacity rather than space, and they are commonly overzealous in planning for future needs. “We can’t sell power that we don’t have, right?”

Diverse connectivity and fiber serve the floor, along with Digital Realty’s in-house fiber connecting latency-conscious carriers with their customers.

Aerial view of London – Getty Images

London

Imagine a data center as a gigantic computer. Most are big flat boxes, gobbling up land the way old-fashioned desktop PCs swallowed desk space. Telehouse North 2 is different: with six data floors, it’s like a space-saving “tower system” that tucks neatly on the floor.

The building sits by a Docklands roundabout, at the gateway to the City of London, so there is an incentive to make best use of that land - which led Telehouse to build its tower. The building has adiabatic cooling units from Excool installed on each of the six data floors.

The tricky part was to make sure that the cooling systems removed heat from each floor, but did not interfere with each other. “You have to segregate the cool air coming in and ensure you get the hot air away,” building manager Paul Sharp told DCD on a tour.

Telehouse enlisted the help of Cundalls and used computational fluid dynamics (CFD) to design the building, which draws in cool air on one side and expels warm air on another side, using the prevailing wind direction to remove the heat.

The air is fed in through a single six-story space which runs up the whole side of the building. There are grilled floors at each level, but stepping onto them is a vertiginous experience.

In common with many other contemporary data centers, North 2 uses a hard floor rather than a raised one, with contained hot aisles at around 38ºC, while a slow flow of air at 25ºC is drawn through the room.
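That temperature split does real work: the wider the gap between supply and return air, the less air has to be moved per kilowatt of IT load. A minimal sensible-heat sketch, using the North 2 temperatures with standard air properties and a hypothetical 1MW hall (the load is an assumption, not a Telehouse number):

```python
# How the supply/return delta-T sets the required airflow.
# Standard air properties; the 1 MW hall load is a made-up example.

rho = 1.2          # air density, kg/m^3
cp = 1.005         # specific heat of air, kJ/(kg*K)
supply_c, return_c = 25.0, 38.0   # temperatures from the North 2 design
hall_load_kw = 1000.0             # hypothetical 1 MW data hall

delta_t = return_c - supply_c                      # 13 K rise across the racks
airflow_m3s = hall_load_kw / (rho * cp * delta_t)  # required volumetric flow
print(f"Airflow needed: {airflow_m3s:.0f} m^3/s")  # ~64 m^3/s per MW
```

A narrower delta-T would demand proportionally more air - and more fan energy - for the same heat load, which is why the containment matters.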

The efficiency of the design won Telehouse a DCD award in 2017.

Singapore skyline at night – Getty Images

Singapore

Facebook is planning an 11-story data center in Singapore, at a cost of $1bn. That’s no surprise - it’s a major economic hub, on a tiny island with eye-watering real estate prices. The only surprise is that it’s taken this long to reach this stage.

Singapore has a tropical climate to contend with, and Facebook is using some pretty impressive new technologies to achieve its towering ambition.

The first deployment of the new StatePoint Liquid Cooling system, developed with Nortek, is expected to allow an annual Power Usage Effectiveness (PUE) of 1.19.
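PUE is simply total facility energy divided by the energy delivered to IT equipment, so 1.19 implies only 19 percent overhead for cooling and power distribution. A minimal sketch (the 100MW IT load is a made-up illustration, not Facebook’s figure):

```python
# Back-of-envelope PUE check (illustrative figures, not Facebook's data).
# PUE = total facility energy / IT equipment energy, so a PUE of 1.19 means
# 0.19 W of cooling and distribution overhead for every watt of IT load.

it_load_mw = 100.0                    # hypothetical IT load
pue = 1.19                            # annual PUE claimed for StatePoint cooling
total_facility_mw = it_load_mw * pue  # total draw including overhead
overhead_mw = total_facility_mw - it_load_mw

print(f"Total facility load: {total_facility_mw:.0f} MW")      # 119 MW
print(f"Cooling/distribution overhead: {overhead_mw:.0f} MW")  # 19 MW
```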

The new facility will have 170,000 sq m (1.8m sq ft) of space and use 150MW of power, making it Singapore’s biggest single data center, as well as its tallest. The building’s façade will be made of a perforated lightweight material which allows air flow to the mechanical equipment inside.

And Singapore won’t stop there: in 2017, the government called for 20-story facilities.

Huawei and Keppel Data Centers have joined the Info-communications Media Development Authority of Singapore (IMDA) to carry out a feasibility study on a 'green' high-rise data center building.

Details are very scarce, but Ed Ansett, co-founder and chairman of i3 Solutions Group, says a high-rise data center can result in a better PUE: “It certainly can do. The high-rise facility can achieve an exceptionally low PUE provided it adopts a different approach to cooling.”

Victoria Harbour, Hong Kong – Getty Images

Hong Kong

It’s no surprise to find data centers building upwards in Hong Kong, a city where a shortage of real estate famously caused Google to halt work on a planned data center in 2012. Both AWS and Google are building there now, despite the chronic shortage of land in the city.

Local players understood the nature of the real estate market well before then. Back in 2001, SUNeVision opened the MEGA-i building, a 30-story purpose-built data center, which is designed to house over 4,000 racks with total gross floor space of more than 350,000 square feet.

It is described as “one of the largest Tier 3+ Internet service centre buildings in the world” - although it doesn’t feature on the Uptime list of certified Tier III facilities.

The facility sells carrier-neutral retail colocation, marketed under the iAdvantage brand, with customers able to choose from a variety of power and deployment options, from open farm and private suites to customized areas. Tenants include global telecoms carriers, cloud providers, multinationals and local enterprises.

The facility is currently being upgraded to keep its technology up to date, while meeting ever-increasing regional and global demand.

QTS' three-story data center in Ashburn – QTS

Northern Virginia

There’s plenty of space in Northern Virginia, and when the data center operators first came here, they staked their claims and built out, somewhat like the farmers that first settled the area in the 18th century.

But even these wide open spaces have limits, and as prices go up - reputedly, to more than $1 million per acre - it’s time to make better use of the land.

QTS has opened the first three-story data center in Northern Virginia. The shell has been built, and will ultimately have four 2.5MW halls on each floor, but it is being filled on a segment-by-segment basis. QTS has sold one ground-level hall to a wholesale customer, and plans to fill the halls above it, before moving on to the other quadrants.

The halls are filled in vertical stacks, because cooling relies on convection across the three stories, so the halls in a stack have to be built at the same time - but not the whole building: “It’s not capital-efficient to build it all at once,” Tag Greason, chief hyperscale officer at QTS, told DCD.

"Northern Virginia invested early to build an ecosystem of power, tax incentives, and domestic and trans-Atlantic connectivity that continues to attract the who’s who of data center operators," added Christopher McLean, director of mission critical solutions for construction firm M.C. Dean.

"And it continues to do so despite the rising cost of land. Land that has tripled in value in just three years! Even at prices closing to $2 million per acre, more data center space was absorbed in Northern Virginia in 2018 than the sum of the next five major US markets in the same period. In order to capitalize on those amenities, coupled with the scarcity of developable land, the market is driving vertical growth to scale with power and cooling density. What was 10-years ago a sea of sprawling, single story, window-less boxes, is now 2- to 4-story glass facades, and ivy or other green covered appurtenances."

Amsterdam at night – Getty Images

Amsterdam

A $190 million eight-story Equinix data center towers over Amsterdam’s Science Park. AM4 opened in 2017 and is being billed as an “edge” facility, thanks to its urban location. It is 70m (230 ft) tall, and will have room for 1,550 cabinets in its first phase and 4,200 when fully built out, with a usable floor space of 124,000 square feet (11,500 sq m).

Building such a tall facility in Amsterdam’s soft, waterlogged soil required very deep piles to be driven, right next to university land.

The space constraints also required Equinix to dig a large moat around the facility and the adjacent Equinix data centers.

This article featured in the October/November issue of DCD>Magazine.