Mega projects have a bad image. Today we tend to think that small is better, and want things to be less global and more local, but it looks as if giant data centers are here to stay.

Mega projects are usually defined as ones costing more than US$1bn; they attract public attention because of their impact on communities and the environment. The European Cooperation in Science and Technology categorizes mega projects as having “extreme complexity and a long record of poor delivery.” 

Illustration: Sam Falconer @ debut art

A backwards step?

So there we are – already talking failure. Right now, data centers are getting bigger – and quickly. We are producing, gathering and analyzing ever bigger data sets, and keeping them for longer, creating a need for vast storage. But, in many ways, our bigger data centers feel like a move backwards, towards the era of huge mainframes housed in air-conditioned buildings.

Large enterprises face economic pressures that are forcing them to consider what many would once have thought unthinkable: outsourcing their IT. Massive cloud companies such as Amazon Web Services (AWS), Facebook and Google have emerged with mega data centers that can, and will, do that job.

According to Jeff Paschke at 451 Research: “Four of the largest-scale internet firms – Amazon, Apple, Google and Microsoft – continue to invest heavily in building out mega data centers globally, with capital spending at the four companies over the past two years totaling more than $56bn, according to publicly filed earnings releases. In 2014, capex spending at the four companies increased 31 percent over 2013 spending, part of a continuing trend of increased spending on data centers and their infrastructure.”


Large data centers exploit the economies of sheer size. The most efficient providers build data center space at about $5-9m per megawatt, compared with roughly double that range for most enterprise data centers at the moment. “Large internet providers are further able to improve their efficiency through expansion because they often build additional capacity at existing campuses, where basic infrastructure already exists,” says Paschke. “We [451 Research] expect that a higher level of capex spending will likely continue over the next few years for these companies as they continue to build out data center capacity.”
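To put those figures in perspective with a simple illustrative calculation: at $5-9m per megawatt, a 20MW build comes to $100-180m, while the same capacity at roughly double that rate, typical for an enterprise facility, would cost $200-360m.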

If outsourcing continues, in a decade the traditional enterprise data center will not exist. What will remain are enterprise IT control centers, managing processing that is done in mega data centers.

All the enterprise data will be stored in these mega data centers, alongside data from cloud service providers and data aggregators. Multiple telecoms operators will provide long-distance links to users, partners, suppliers and other mega data centers.

These facilities will be software-defined, with compute, storage and network functionality provided by layers of control software running on commodity hardware. Storage will be located close to the servers in a “Server SAN” model.
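The Server SAN idea is easy to see in miniature. The sketch below, in hypothetical Python rather than any vendor’s API, pools the direct-attached disks of commodity servers into one logical store and places each volume on several nodes, so data sits next to the compute that uses it.

from dataclasses import dataclass, field

@dataclass
class Node:
    """A commodity server contributing its local disks to the pool."""
    name: str
    free_gb: int
    volumes: list = field(default_factory=list)

class ServerSAN:
    """Toy pool: place each volume on `replicas` nodes with the most free space."""
    def __init__(self, nodes, replicas=2):
        self.nodes = nodes
        self.replicas = replicas

    def create_volume(self, vol_name, size_gb):
        # Consider only nodes with room, fullest-free first.
        candidates = sorted(
            (n for n in self.nodes if n.free_gb >= size_gb),
            key=lambda n: n.free_gb, reverse=True)
        if len(candidates) < self.replicas:
            raise RuntimeError("not enough capacity for the replica count")
        placed = candidates[:self.replicas]
        for node in placed:
            node.free_gb -= size_gb
            node.volumes.append(vol_name)
        return [n.name for n in placed]

pool = ServerSAN([Node("rack1-srv1", 2000), Node("rack1-srv2", 1500),
                  Node("rack2-srv1", 1800)])
print(pool.create_volume("db-data", 500))  # ['rack1-srv1', 'rack2-srv1']

Real Server SAN products add failure detection, rebalancing and consistency; the point here is only that storage placement becomes a software decision made across commodity boxes.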

The new architecture cuts the cost of IT management and gives high-speed access to vast amounts of industrial internet data.


Better results, less downtime

Effective enterprise IT leaders will realize they can get far better results – probably minus the downtime and appalling gaffes in choosing software and hardware – using a cloud provider. This will let them exploit the data avalanche from the internet and Internet of Things, with real-time analytics. 

They will also get a seat on the board because they will be seen as strategic and not technical, since they will have given up hugging the server in the basement.

In the public sector, initiatives such as the UK Government’s G-Cloud will be revived worldwide to bring sense to government IT procurement. The independent software vendor (ISV), quietly grazing like a modern-day brontosaurus on lucrative government contracts handed out by witless civil servants, will die out.

Colocation will still be needed. Legacy applications will run on their original hardware, shifted into the mega data center under a colocation model that allows data to move to and from other clouds and Infrastructure as a Service (IaaS) or Platform as a Service (PaaS) applications.

Most business executives see IT as inefficient, badly designed, slickly marketed, high cost, very slow to change, and run by people who speak a language designed to exclude them. Line-of-business executives want to use external IT services to achieve business goals and expel the last High Priests of IT. 

The software-defined data center will be the key ingredient. Its components are open hardware stacks (eg, Open Compute Project) and open software stacks (eg, OpenStack). A combination of OCP and OpenStack, plus a data center integration and orchestration layer to optimize power and cooling, provides a powerful base for next-generation cloud infrastructure.
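To make “software-defined” concrete, the sketch below uses the openstacksdk Python library to provision a server purely through API calls. It assumes a cloud named “mega-dc” is configured in clouds.yaml; the image, flavor and network names are placeholders, not real resources.

import openstack

# Connect to a cloud defined in clouds.yaml (the name is an assumption).
conn = openstack.connect(cloud="mega-dc")

# Look up the building blocks by name; these names are illustrative.
image = conn.compute.find_image("ubuntu-14.04")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("demo-net")

# Create the server, then block until it is ACTIVE.
server = conn.compute.create_server(
    name="sddc-demo",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.name, server.addresses)

The same pattern applies to volumes and networks, which is what makes the whole stack programmable rather than racked-and-cabled.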

With this low-cost automated infrastructure, mega data centers can provide the best platforms for IaaS and PaaS offerings to enterprises, cloud service providers and large government organizations. The next generation of cloud operating systems will control server, storage and network resources, as well as facilities infrastructure; that means power, cooling, lighting and routing of connectivity will all be controlled by the operating system.
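What it means for the operating system to control the facility can be sketched in a few lines. The control loop below is hypothetical Python with a simulated sensor; in a real deployment the two facility calls would talk to a building-management or DCIM API.

import random

TARGET_INLET_C = 25.0  # target temperature for server inlet air
STEP_C = 0.5           # adjust gently to avoid oscillation

def read_inlet_temp_c(setpoint_c):
    """Simulated sensor: inlet air runs a few degrees above the setpoint."""
    return setpoint_c + 3.0 + random.uniform(-0.5, 0.5)

def set_cooling_setpoint_c(value):
    # Stand-in for a call to the cooling plant's control API.
    print("cooling setpoint -> %.1f C" % value)

setpoint = 24.0
for _ in range(10):  # ten control ticks
    inlet = read_inlet_temp_c(setpoint)
    if inlet > TARGET_INLET_C:
        setpoint -= STEP_C   # too warm: cool harder
    elif inlet < TARGET_INLET_C - 1.0:
        setpoint += STEP_C   # comfortably cool: save energy
    set_cooling_setpoint_c(setpoint)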

The traditional data center will change radically over the next decade, from one built around stacks of hardware to one that is software-led. Organizations should keep what remains in the enterprise IT control centers to a minimum.

Want to meet the Mega Projects? Here are three…


Sweden: Facebook extends in Luleå 

Facebook hasn’t said how much it is spending on its giant data center site in Luleå, Sweden, but construction costs on the first building were estimated at five billion kronor ($760m), and the second one will be similar. 

Near the Arctic Circle, the site can use external air cooling all year round, and has renewable energy from multiple sources. 

A set of three buildings, each of 28,000 sq m (300,000 sq ft), was planned. One came online in 2012, and work began on the second in 2014, using Facebook’s rapid deployment data center (RDDC) design, which has been shared with the Open Compute Project for other operators to use.

In Luleå 1, compute equipment is on the bottom floor; the middle floor is a hot-air plenum, used for transporting air throughout the building and mixing incoming cold air with outgoing exhaust; and the top story contains a full cooling system. See the May/June 2014 issue for more details.

USA: Google adds $1bn to Iowa site

A $1bn investment in Council Bluffs will take Google’s total spend on the site to $2.5bn, and could make it Iowa’s biggest economic development project. 

The site has been under development since 2007, and is supported by $16.8m in tax incentives from the Iowa government. The search giant is expected to ask for a further $19.8m in sales and use-tax refunds. If the new funding is approved, the company would have to create as many as 70 new jobs in the area in return.

Iowa makes an attractive location for data centers thanks to cheap land, a low natural disaster risk and access to renewable wind energy.

Microsoft is currently building its second data center in Iowa, at a cost of $1.2bn. Codenamed ‘Alluvion’, the facility in West Des Moines will power its Azure public cloud service. And Facebook has recently opened the first phase of its modular, air-cooled data center in nearby Altoona – its first to be powered by 100 percent renewable energy.


Korea: Microsoft’s big plans 

Microsoft is rapidly building data centers for its Azure cloud service around the world, with a view to offering customers business software as a service along with cloud infrastructure. 

Asia is going to be an important region for this activity: the cloud is growing fast there, and providers are establishing local data centers to meet demand for services in which data is held in-country for compliance reasons.

For the past year, Microsoft has been negotiating with government officials in South Korea, aiming to build a data center in the country’s largest port city, which could well mean an investment of several billion dollars. 

Reports say that any eventual data center would most likely be built on a 1.8 million sq ft site near a data center run by LG.

In the meantime, Microsoft covers the region with services delivered from data centers owned by Equinix in Hong Kong and Singapore.

This article first appeared in the May issue of our magazine.