When commencing their studies at a university, many students spend some time in temporary accommodation on campus before moving into more permanent housing.

It turns out this can also happen to supercomputers: for 12 months, the University of Exeter in the UK housed its ‘Isca’ high performance computing (HPC) machine in a custom-fabricated container from Stulz while a new data hall was being built.

The hardware has since been successfully transferred to its new home, where it is contributing to research in engineering, CFD, mathematics, space and life sciences.

Semester to semester

University of Exeter campus – UoE

Isca was designed and configured by British HPC specialist OCF with assistance from Lenovo, using Lenovo’s NeXtScale server nodes, Mellanox EDR InfiniBand networking, and GS7K parallel file system appliances from DDN Storage.

“As well as having the standard nodes, we also have various pieces of specialist kit which includes Nvidia GPU nodes, Intel Xeon Phi nodes and OpenStack cloud nodes as well,” said David Barker, technical architect at the University of Exeter.

“We wanted to ensure that the new system caters for as wide a variety of research projects as possible, so the system reflects the diversity of the applications and requirements our users have.”

Example of a Stulz / TSI modular data center – Stulz

Isca was then housed in a Rapid Deployment Data Centre (RDDC) from Stulz Technology Integration Limited (formerly TSI UK).

The RDDC from Stulz is a containerized facility that is built to order and certified for shipping anywhere in the world. Unlike some other modular solutions, it does not use an actual shipping container, but a purpose-built enclosure in the familiar form factor. OCF and Stulz worked together to fit out the RDDC, and in June 2016 the entire system was delivered in its container to the university.

“This was phase one of the new supercomputer, located on campus in the specialized container, where the machine ran for the first twelve months,” Barker said. “We tested and used the system while it was housed in the temporary location to give us an understanding of what we used a lot of; this informed phase two of the project which was to expand the system with the help of OCF and move it to its final location in the new data centre on campus.”

The project was funded by the University using a large grant from the Medical Research Council. The HPC resource is now in use by more than 200 researchers across 30+ active research projects.