The US National Science Foundation has renewed the Chameleon project, a large-scale, reconfigurable experimental environment for cloud research co-located at the University of Chicago and the University of Texas at Austin.
Originally announced in 2014, the research initiative has been extended for another three years with a $10m grant, providing users with a large-scale cloud infrastructure of approximately 600 nodes, with bare-metal reconfiguration privileges.
The next phase
“In phase one we built a testbed, but in phase two we’re going to transform this testbed into a scientific instrument,” Kate Keahey, Argonne National Laboratory computer scientist and Chameleon project PI, said.
“We’re going to extend the capabilities that allow users to keep a record of their experiments in Chameleon and provide new services that allow them to build more repeatable experiments.”
Phase one experiments included research into cybersecurity, OS design and power management, with scientists simulating cyberattacks on cloud computing systems and developing machine learning algorithms to determine the most energy-efficient task assignment schemes for large data centers.
“Chameleon allows us to reach new communities of researchers that our current systems don’t serve,” Dan Stanzione, executive director at TACC and a co-investigator on Chameleon, said.
“While other TACC production systems support science that makes use of large scale computing, we’ve never had a way for researchers to experiment on the computing systems themselves. Chameleon provides a platform for computer scientists and other researchers to explore techniques and tools to make cloud computing systems and future computing platforms more effective.”
Phase two will see new hardware added, including additional racks at UChicago and TACC, as well as GPUs and Corsa network switches.