There are variations emerging within the distributed data center model, where racks of servers heat homes and offices. French company Qarnot reckons the best approach is supercomputing servers operating at the room level.
Distributed data center heat projects are a pretty radical idea: in exchange for you housing IT kit, the data center firm gives you its waste heat for free. But what sort of IT is involved, and what sort of heating system?
A couple of weeks ago I covered Germany’s Cloud&Heat, but before that, I mentioned France’s Qarnot Computing. This week Qarnot got in touch with more information about how things are going in France.
This idea is in its early stages, but both companies have real systems out there. Cloud&Heat has 20 heaters in a 56-apartment building in Dresden, but it turns out Qarnot has somewhat more.
Heaters in real homes
Qarnot has 350 of its Q.Rad heaters - small wall-mounted radiators - installed. Three hundred are in Paris apartment blocks, heating 100 households. Another 30 are in a school building, while 20 are at Qarnot employees' homes.
All told they produce more than 150kW of heat, and Qarnot has a turnover of €1 million, says founder Paul Benoit, with most of its paying customers in the finance sector.
Qarnot takes a distinct approach to both the heating and the computing side of the equation. Its radiators are small, delivering heat directly into individual rooms, rather than the larger cabinets which Cloud&Heat connects to building heating systems in Germany.
This is partly because of French regulations, Benoit explains: “CPUs heat water to 60C, but in France hot water systems have to work at 65C. We could do this, but would have to provide extra heating.”
But it also lends itself to an “Internet of things” approach. The individual Q.Rads are networked and switch on or off when required.
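That room-level switching can be sketched roughly as follows. This is an illustrative model only - the class and field names are hypothetical, not Qarnot's actual software - assuming each heater computes (and so heats) whenever its room falls below a setpoint:

```python
# Hypothetical sketch of room-level heater scheduling. The QRad class
# and its fields are illustrative assumptions, not Qarnot's real API.

class QRad:
    """One networked heater: it computes (and heats) only when switched on."""
    def __init__(self, room, setpoint_c):
        self.room = room
        self.setpoint_c = setpoint_c
        self.computing = False

    def update(self, room_temp_c):
        # Compute, and therefore heat, whenever the room is below setpoint.
        self.computing = room_temp_c < self.setpoint_c
        return self.computing

fleet = [QRad("living room", 20.0), QRad("bedroom", 18.0)]
readings = {"living room": 17.5, "bedroom": 19.0}

active = [h.room for h in fleet if h.update(readings[h.room])]
print(active)  # only the living room is below its setpoint
```

A central scheduler would then route compute jobs only to the heaters reporting `computing == True`.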
Qarnot also works with different computing tasks. While Cloud&Heat offers standard OpenStack-based cloud compute and storage, Qarnot provides specialised high-performance jobs. There are two payloads on offer at the moment: Q.Blender for graphics rendering, and XtremWeb for high energy physics.
The jobs are packaged as Docker-style containers, rather than virtual machines.
Compute-intensive tasks make heat
These compute-intensive tasks are more suited to the distributed heating model than general purpose computing, says Qarnot founder Paul Benoit (above). They can be run continuously at whatever speed is available, and can be downloaded to the system in batches.
These same properties have already enabled various “crowd computing” or volunteer projects, like SETI@home, which analyses radio signals looking for signs of extraterrestrial life, and Folding@home, which carries out protein folding tasks to analyse biochemical structures for cancer research.
In the Qarnot system, jobs are sent out to an encrypted cache in buildings where Q.Rads are installed. When people need heat, their Q.Rads get jobs from the cache.
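The dispatch pattern described above - stage jobs in a building-level cache, and let a heater pull one only when the room calls for heat - can be sketched in a few lines. The job names and function are hypothetical placeholders, not Qarnot's implementation, and the real cache is encrypted:

```python
from collections import deque

# Illustrative sketch of pull-based job dispatch: jobs sit in a
# per-building cache, and a heater takes one only when heat is wanted.
# Job names and the function signature are assumptions for illustration.

building_cache = deque(["render-frame-001", "render-frame-002", "physics-evt-37"])

def next_job(heat_demanded, cache):
    """Return a job to run (and thus heat with), or None if no heat is wanted."""
    if heat_demanded and cache:
        return cache.popleft()
    return None

print(next_job(True, building_cache))   # render-frame-001
print(next_job(False, building_cache))  # None - no heat wanted, no job pulled
```

The key property is that heat demand, not a central queue, drives when work is fetched.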
The Q.Rads have standard Intel processors on simplified motherboards: “We get rid of elements other than processing, memory and network,” says Benoit. “There is no storage in the heaters. Everything is done in memory.”
By contrast, general purpose computing and storage needs more components, and can be quite passive, producing little heat unless data is actively being pushed to and pulled from storage.
What about matching the demand for computing to the demand for heat? Qarnot can turn down the speed of its processors in summer so they don't produce too much heat (though this reduces the rate of computation). It also ensures it has a surplus of supercomputing tasks, so customers can have heat whenever they want - by offering any spare capacity to universities.
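The summer throttling could be modelled as a duty cycle that tracks heat demand but never drops below the 25 percent summer figure quoted below. The linear mapping and the function name are assumptions for illustration, not a description of Qarnot's controller:

```python
def clock_fraction(heat_demand):
    """Map heat demand (0..1) to a CPU duty cycle.

    Floors at 0.25 (the summer utilisation figure mentioned in the
    article) and caps at 1.0; the linear mapping in between is an
    illustrative assumption, not Qarnot's actual control law.
    """
    return min(1.0, max(0.25, heat_demand))

print(clock_fraction(0.0))   # summer floor: 0.25
print(clock_fraction(0.8))   # winter: track demand directly, 0.8
```

Under this model the farm keeps computing year-round, just more slowly when nobody wants the heat.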
“In winter we give capacity for student projects in 3D rendering for free,” says Benoit.
The Q.Rads in the school help here: since the building is empty in summer, they can keep computing - and producing heat - without overheating anyone.
Overall, the processors in the distributed Qarnot farm run on average at 50 percent capacity. In summer they run at 25 percent, which Benoit says is pretty normal for a data center.
I’m hoping to get more details of what goes on at Cloud&Heat - and indeed from any other distributed computing heat systems I haven’t heard of yet.
A version of this story will appear on Green Data Center News.