In early 2019, a small group of researchers launched an ambitious project that they hoped would change how data centers are built and regulated.

The idea? Build the world's most efficient data center. In just a few years, they would hit that milestone, developing a system with a power usage effectiveness (PUE) of just 1.0148.

It didn’t begin that way.

“We always wanted to have a showroom to highlight how our power is very clean here, and free of disturbances,” the director of the Boden Business Agency, Nils Lindh, explained.

“Our view was that you don’t need any backup power or UPS type function here.”

The Swedish municipality, already home to a number of data centers, envisioned a small deployment on municipal land, simply for the purpose of showing off its stable power, primarily provided by a hydroelectric dam.

To develop the project, the BBA turned to UK firm EcoCooling and Hungarian developer H1 Systems, both of whom had previously worked in Boden.

“And then as we conceptualized this idea and started talking to finance people, one of them pointed out this Horizon 2020 program,” H1's then-general director László Kozma explained. Horizon 2020 was a huge research program, worth nearly €80 billion, that ran from 2014 to 2020. Nestled amongst its many tenders, Kozma found the EU was looking to build a data center with a PUE of below 1.1.

“I knew that a Hungarian company has only a two to three percent probability of being selected,” Kozma recalled. “But this might be that two percent - and we already had a good start, with an international consortium: a British cooling manufacturer, a Swedish municipality agency, and a Hungarian small / medium enterprise.”

BodenType DC One – H1 Systems

It was time to expand the plan beyond a simple showroom to “something a lot more serious,” he said. The group brought in the Research Institutes of Sweden (RISE), based in the nearby city of Luleå - which was already home to a huge Facebook data center.

“And after that, we went back to this project advisor who said that there was one thing still missing - the big European name,” Kozma said. “There’s this unofficial list of 25 research institutions whom you have to take into your consortium to raise the probability of your winning.”

It was well known that the larger economies got most of the Horizon 2020 money - science magazine Nature found that 40 percent of the program's cash went to Germany, France, and the UK.

The group turned to the Fraunhofer Institute as the final member of the team, and Lindh concedes that political machinations were at play: “Germany being the largest contributor to the European Union, we thought it would be good to have a German research institute involved,” he said.

It worked. In October 2017, the group was awarded a €6 million contract titled 'Bringing to market more energy efficient and integrated data centers.' "That was the title they gave us, and it's what it would have been if we wrote it," Kozma said. "Our idea and the European Commission's idea just fitted 100 percent."

Now, the group had just 36 months to pull it off.

As the BBA began work on permitting, H1 drafted data center designs, and EcoCooling conceptualized cooling methods, the Fraunhofer Institute had one year to develop a system for synthetic workloads.

“Our responsibility in the project was to design a benchmark to emulate real world applications,” Fraunhofer’s head of Modeling and Networking Reinhard Herzog said. “And based on that benchmark, we tried to evaluate if the cooling policies work under the noisy behavior of real world applications, not just the stable artificial synthetic workloads that we used as tools.”

Based on their work building smart city tools for Hamburg, the Fraunhofer Institute created a set of workloads that “resembled a smart city application with a lot of sensor data flowing in and some stream processing, and then evaluation dashboard application workloads,” Herzog said. “And the other scenario we modeled was for predictive maintenance applications.”

Both were scaled up to the data center level, and designed so that the researchers could run the same workloads again and again as they tested out different cooling configurations.
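
What such a repeatable workload might look like in practice: the sketch below, a loose Python illustration, pairs a fixed-seed generator standing in for the sensor feed with a simple per-window aggregation standing in for the stream-processing and dashboard stages, so the exact same load can be replayed under each cooling configuration. The names and numbers are illustrative assumptions, not Fraunhofer's actual benchmark.

```python
# Loose sketch of a replayable "smart city" workload: a deterministic
# sensor-data generator plus a simple streaming aggregation, so identical
# load can be re-run for every cooling configuration. Illustrative only.
import random
import statistics
import time

def generate_sensor_readings(n_sensors=1000, n_ticks=60, seed=42):
    """Yield (tick, sensor_id, value) tuples deterministically."""
    rng = random.Random(seed)
    for tick in range(n_ticks):
        for sensor_id in range(n_sensors):
            yield tick, sensor_id, rng.gauss(20.0, 5.0)

def stream_process(readings):
    """Aggregate readings per tick, emulating dashboard-style queries."""
    window, current_tick, results = [], 0, []
    for tick, _, value in readings:
        if tick != current_tick:
            results.append((current_tick, statistics.mean(window)))
            window, current_tick = [], tick
        window.append(value)
    results.append((current_tick, statistics.mean(window)))
    return results

if __name__ == "__main__":
    start = time.perf_counter()
    summary = stream_process(generate_sensor_readings())
    print(f"processed {len(summary)} windows in {time.perf_counter() - start:.2f}s")
```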

“So, after all this preparation phase, it was six or seven months of building,” H1’s Kozma recalls. “The building was inaugurated in the first months of 2019. I remember it was fucking cold.”

DCD visited the facility at the time, with our very own Max Smolaks making similar observations about the unusually frigid temperatures.

"What we are doing with this project is we are creating a very efficient, and therefore low cost, operating system, we are creating a very low cost building system, which is going to enable the little guys," Alan Beresford, EcoCooling MD, told us at the time. “By little, I mean truly small operators, compared to the world of multi-gigawatt operators: less than 100kW.”

Indeed, the Boden Type Data Center One was quite small - a 500kW deployment consisting of four modular pods. One was filled with Open Compute Project CPU servers gifted to RISE by Facebook, one filled with GPUs for rendering, and another bursting with crypto mining ASICs, with the fourth left as a control room.

In each of these pods, the team tried out its own approach to fighting heat: holistic cooling.

BodenType DC One – H1 Systems

“We were able to take control of the fans in the servers and slow them down,” Professor Jon Summers, RISE scientific leader in data centers, said. “And we synchronize the IT fans with the cooler fans.”

Controlling the whole data center as a single system, the cooling was architected around keeping chips at a constant temperature, no matter the workload level. “There's a controller on the server that would constantly change the fan speed so that the CPU temperature was 60 degrees,” Summers said.

“And as the fan’s speed changed it would send that information to an algorithm which would then tell the cooler what speeds it needs to operate at to match the fan speeds of all these servers so that you get a neutral pressure.”

“It becomes a very well-balanced system, but you need the communication between the various layers.”
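
As a rough illustration of that layered control idea - each server trimming its own fan to hold the chip at a setpoint, and a coordinator driving the cooler fans to match the servers' aggregate airflow - here is a minimal Python sketch. The class names, gains, and proportional-control details are assumptions made for illustration, not the project's actual controller.

```python
# Minimal sketch of holistic cooling control, under assumed names and gains:
# each server holds its CPU at a setpoint by nudging its own fan speed,
# and the room cooler follows the servers' aggregate fan speed so supply
# and exhaust airflow stay balanced (neutral pressure).

CPU_SETPOINT_C = 60.0   # target chip temperature, as in the quote above
KP = 0.02               # assumed proportional gain for the server fan loop

class ServerFanController:
    def __init__(self):
        self.fan_speed = 0.5  # fraction of maximum speed

    def update(self, cpu_temp_c):
        """Nudge fan speed up or down to hold the CPU at the setpoint."""
        error = cpu_temp_c - CPU_SETPOINT_C
        self.fan_speed = min(1.0, max(0.1, self.fan_speed + KP * error))
        return self.fan_speed

def cooler_speed_for(server_speeds):
    """Run the cooler at the average server fan fraction, assuming its fans
    are sized to match the pod's total server airflow at full speed."""
    return sum(server_speeds) / len(server_speeds)

# One control tick: servers report temperatures, the cooler follows them.
servers = [ServerFanController() for _ in range(24)]
cpu_temps = [58.0 + i % 5 for i in range(24)]  # stand-in telemetry
speeds = [s.update(t) for s, t in zip(servers, cpu_temps)]
print(f"cooler fan speed: {cooler_speed_for(speeds):.2f} of maximum")
```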

This proved remarkably effective at eking out efficiency gains, as the whole data center’s cooling system worked in unison, rather than different aisles and servers fighting each other.

“We achieved a PUE of 1.0148,” Summers said. “Yes, insane.”

The data center building was also designed for efficiency, dropping the plenum for a chicken coop design that allows for a natural chimney effect. “Did we know in advance we’d reach that PUE?” Kozma said. “No, at the beginning of the project, most of the people in our team thought we could reach 1.07-1.08.”

With holistic cooling, no UPS, a different building design, and several other features all in play, it’s hard to say just how big a part each innovation played in achieving the record PUE. “To answer that, I should have built a kind of a normal building next to this and measured them against each other,” Kozma said - but they only had a budget for the one system.

The location also provided advantages. “The call text from the EU was to go for the lowest PUE possible,” Summers said. “Putting an air-cooled data center in the north of Sweden, you have tons of fresh cold air,” although he added there were some challenges in dealing with the air when it was well below freezing.

“Obviously we took advantage of our geographical location, but we also took advantage of the fact that we had control. We went for the lowest inlet temperature we could possibly get away with, 15°C (59°F), which is easily achievable 7,500 hours of the year.”

H1 built a simulation to test whether the BTDC would be feasible in other locations, using historical climate data for six European cities. The data center could remain within ASHRAE conditions in five of the cities, but in Athens it would slightly step outside the boundaries for “two or three percent of the hours in a year,” Kozma said. “Of course, the climate is changing, and we used historical data,” he cautioned.
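
A back-of-the-envelope version of that check might look like the sketch below: take a year of hourly outdoor temperatures for a city and count the hours in which outside air alone is too warm to serve as supply air (cold hours are less of a problem, since exhaust air can be recirculated to temper the intake). The temperature limit here is an assumed placeholder, not the ASHRAE envelope or H1's actual model.

```python
# Rough feasibility check with an assumed supply-air limit: what fraction
# of a year's hours is the outdoor air too warm for free cooling alone?
def fraction_of_hours_too_warm(hourly_temps_c, max_supply_c=27.0):
    over = sum(1 for t in hourly_temps_c if t > max_supply_c)
    return over / len(hourly_temps_c)
```

With 8,760 hours in a year, Kozma's “two or three percent” works out to roughly 175-260 hours outside the envelope.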

There's also the issue that removing the UPS - a move responsible for a couple of points of the PUE gain - is just not feasible in many locales.

Still, “the experiment worked,” Summers said. "Slowing everything down allowed us to achieve a much better PUE."

One issue with the result is PUE itself. "I'm very critical of PUE," Summers said. "It's not a metric that you would use to describe the energy efficiency of a data center in its entirety."

PUE is the ratio of the total energy used by a data center to the energy delivered to its computing equipment.

Therefore, it does not penalize you for using inefficient IT hardware: a 200MW IT deployment capable of a single petaflops of compute could have a lower PUE than a 2MW deployment capable of 10 petaflops.
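
A toy calculation makes the blind spot concrete. Only the IT capacities and petaflops figures below come from the example above; the total-power numbers, and therefore the PUEs, are invented for illustration.

```python
# Two hypothetical facilities: PUE = total facility power / IT power.
facility_a = {"it_mw": 200.0, "total_mw": 204.0, "petaflops": 1.0}
facility_b = {"it_mw": 2.0,   "total_mw": 2.2,   "petaflops": 10.0}

for name, f in [("A", facility_a), ("B", facility_b)]:
    pue = f["total_mw"] / f["it_mw"]
    pflops_per_mw = f["petaflops"] / f["total_mw"]
    print(f"Facility {name}: PUE {pue:.2f}, {pflops_per_mw:.3f} PFLOPS per MW")

# Facility A "wins" on PUE (1.02 vs 1.10) while delivering roughly 900
# times less compute per megawatt - exactly the blind spot described above.
```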

"The problem is that we didn't have another metric that we use to represent that," Summers said. "Although the Commission was interested in us exploring other metrics, or maybe coming up with a metric ourselves, there is no simpler metric than PUE, unfortunately."

The issue of PUE continues to exercise the data center sector in Europe. The EU has pledged to reach continental carbon neutrality by 2050, and the data center sector has promised to help by hitting that goal by 2030, under the Climate Neutral Data Centre Pact. However, to convince the EU of its bona fides, the Pact has promised to create a new metric that will improve on PUE.

With all of PUE’s flaws, it’s still one of the few ways we have of measuring efficiency. At an annual PUE of 1.0148, BTDC outperformed every other facility in the world - including the previous frontrunner, NREL's Energy Systems Integration Facility, which reached 1.032 in 2017.

Most of the commercial world is well short of this mark, but hyperscalers like Google and Facebook boast PUEs of 1.10 or less (in cooler countries), thanks to huge investments in energy efficiency, and some economies of scale.

It’s possible that hyperscalers already use some form of holistic cooling.

"We found out that they know what they're doing with cooling at Facebook, but they haven't told the world about it," Summers said. "I think when the European Commission discovered that they spent all this money to find out what Facebook are doing," he trailed off. "That's a cynical way of looking at it, but that wasn't the whole idea of the project, anyway."

Instead, BTDC hoped to prove just how efficient data centers could be if they put efficiency at the forefront of design, and to create a more open approach to holistic cooling, by putting the work in the public domain.

The project may also pressure server manufacturers to open up fan controls, and could help European regulators, which - despite the work of the Pact - still look ready to crack down on this energy-hungry industry.

One hurdle to applying the work is that holistic cooling is not feasible for colocation data centers, where servers are owned by tenants, who will not hand over control to the colo owner. Colos simply cannot control every fan of every server in every rack.

Still, the project is finding life in enterprise data centers used by single clients.

EcoCooling "uses holistic cooling control in all its deployments now," Summers said. "I think that their customers are seeing the value in that immediately."

For H1, there has also been some demand. "There was a Bulgarian company who wanted what was built in Sweden," Kozma said. "For a small Hungarian company, it didn't make sense to go to Bulgaria to build, so we helped them to design it and after that a local company will build it. Another will be built in Norway with the same idea."

Fraunhofer, too, plans to commercialize the work. "The tool itself is open source," Herzog said. "But we're using it to make studies on the scalability of our applications, and on behalf of cities when they are trying to design what kind of application they need to rent from data centers."

As for Boden Type One, it's still there. Instead of knocking it down, asset owners H1 and EcoCooling sold the project. It's now set to be expanded and used as one of Europe's largest visual effects rendering farms.