Technologists love to point out the many flaws of humans. They make mistakes, they need breaks, they expect a salary.
Robots and automation, they argue, can replace or augment many of the jobs a human worker currently does, either freeing up the employee to do other work or freeing up the employer to have fewer workers.
But humans still have one major advantage: Adaptability. Our ability to handle multifaceted tasks, deal with sudden situational changes, and learn and improve is unparalleled, despite billions of dollars in robotics and AI research.
For the careful and conservative data center audience, wholeheartedly embracing new technologies is always tricky, and the idea of letting a mechanized system loose in a facility may cause some to pause.
“Many data center and robotics professionals are predicting that the next couple of years will be big leaps when it comes to placing more robotics in the data center environment,” Bill Kleyman - now Switch EVP of digital solutions - wrote in 2013.
This feature appeared in the 2021 Automation Supplement.
At the time, things looked promising, with robotic arms already commonplace in tape libraries. But the big leap Kleyman predicted still seems a way off.
Back in 2013, one of the biggest examples of a robot in a data center was IBM's pilot of a hacked iRobot Create (a programmable version of the Roomba autonomous vacuum cleaner) that traveled around a data center tracking temperature and other data. The project was quietly shelved, most likely in favor of gathering telemetry from fixed assets.
Two years before, the Korea Advanced Institute of Science and Technology made a similar, but more ambitious, attempt. SCOUT was a mobile robot-based data center management system that followed NFC tags around a server room to inspect servers.
Also based on an iRobot, the small system patrolled the KAIST iCubeCloud Data Center in Korea, using vision-based monitoring to look for issues. A follow-up study promised to attach a robotic arm that could work on servers, but it was never published, and the authors moved to other projects.
“The server rack is more than 50 years old. There is no other piece of technology in data centers that has survived for so long,” Zsolt Szabo told DCD back in 2016. At the time, the CEO of web hosting company PayPerHost was pitching a robotic arm for the data center.
But despite those false starts, robots have quietly found jobs in and around the data center.
German Internet exchange company DE-CIX has rolled out a family of automated “patch robots," including Patchy McPatchbot, Sir Patchalot, and Margaret Patcher. These are based on X-Y gantries, and can locate a socket in an optical distribution frame, similar to a traditional patch panel, and then plug a fiber optic cable into it.
"We are the first Internet Exchange worldwide to use a so-called patch robot, which is pre-programmed to independently migrate customers' connections,” Harald Summa, CEO of DE-CIX, said when the company pulled off a data center migration with the robots' help.
“This move bore a resemblance to open-heart surgery, as we had to migrate customers during live operations."
Robots are being used in some hyperscale data centers for highly specific tasks. In 2018, Google revealed it was rolling out hard drive-destroying robots - essentially stationary industrial arms that help pick up the drives and put them in the shredder, but that do not carefully remove them from servers first.
Alibaba claims a more advanced system is in operation at five of its data centers. "The second-generation Tianxun robot is AI-powered and can work without human intervention to automatically replace any faulty hard disks," Wendy Zhao, senior director & principal engineer of Alibaba Cloud Intelligence, told DCD.
"The whole replacement process, including automatic inspection, faulty disk locating, disk replacing, and charging, can be completed quickly and smoothly. The disk can be replaced in four minutes."
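The stages Zhao lists form a fixed pipeline. A purely illustrative Python sketch of that kind of staged workflow (the stage names come from the quote; the code, slot naming, and success model are invented for illustration and reflect nothing about Alibaba's actual system):

```python
from enum import Enum, auto


class Stage(Enum):
    """Pipeline stages named in the Tianxun description."""
    INSPECT = auto()   # scan racks for disk health alerts
    LOCATE = auto()    # resolve the faulty disk's physical slot
    REPLACE = auto()   # swap the disk with the robot arm
    CHARGE = auto()    # return to the dock and recharge


def run_replacement(slot: str) -> list[Stage]:
    """Walk the stages strictly in order; a real robot would block at
    each stage until its sensors and actuators reported success."""
    completed = []
    for stage in Stage:  # Enum iteration preserves definition order
        # e.g. "REPLACE rack-12/slot-3" would be acted on here
        completed.append(stage)
    return completed


done = run_replacement("rack-12/slot-3")
print([s.name for s in done])
```

The point of modeling it as an ordered pipeline is that each stage gates the next: the robot cannot replace a disk it has not located, and it recharges only after the swap completes.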
Google hopes to deploy more advanced systems at its data centers, but is wary of the complexity its facilities pose. "As far as robotics, our hyperscale data centers are more like warehouses and most of the processes require a robot to navigate to a specific location to perform a task," Google's VP of data centers Joe Kava said.
“However, even as advanced as robotics have become, many of the tasks in data centers are much more complicated than in other industries that have employed large-scale robotic implementations."
Rival Facebook is also experimenting with robots. In 2020 it was revealed that the company, which declined to comment, has a Site Engineering Robotics Team that since 2019 has been designing "robotics solutions to automate and scale Facebook's data center infrastructure operations." Among the known projects are robots that move around the data center monitoring conditions - perhaps like IBM's old Roomba hack.
Kleyman’s Switch is also betting on a future where robotics plays a much larger role in the data center. The company is developing its own robot, the Switch Sentry: essentially a 360-degree camera and heat sensors on wheels that can act as a security guard. It travels autonomously, but humans take over remotely when an incident occurs.
The company said it hopes to turn the robot into a business line of its own, offering it to other companies, although it’s not clear whether any have taken it up yet. But its creation, as well as the limited moves of hyperscalers and other data center companies, is the result of three major factors: the Edge, the pandemic, and dramatic advances in artificial intelligence.
"So presently in the United States, there are about 8,000 private security firms and about 18,000 law enforcement agencies," Kleyman explained.
"This represents a really fragmented market and kind of a threat to distributed and Edge infrastructure. As these stretched security forces have become an issue, we've looked at automation."
Instead of relying on such groups, "physical robots can augment security capabilities and reduce the risk for human beings," he said, as Switch moves into Edge deployments with a partnership with FedEx. "They can even be like a remote hands operator where you can drive up to a cage for a customer, take a look at something, and then get a person out there to work on it."
In fact, the Switch Sentry robot has a strong family resemblance to the FedEx Autonomous Delivery Robot, using a chassis and curb-climbing wheels which are very similar to those built into the FedEx system by Segway designer Dean Kamen. Switch and FedEx are both sponsors of a Kamen-created robotics competition.
Many data center operators already use remote eyes to examine a situation before deploying human hands, especially as Covid has reduced people's ability to travel and disincentivized unnecessary visits.
Mark Hamilton was in charge of setting up Nvidia's supercomputer at the Kao Data Centre in Harlow, UK, but he's never seen it. "We bought one of these little telepresence robots - it actually has an Nvidia Jetson GPU with Arm cores inside it - and it sort of rolls around on two wheels and has a tablet on it," he told DCD.
"This system is halfway around the world," Hamilton said. The company ultimately installed self-opening doors so the robot could drive into the hot aisle unimpeded.
"We'll absolutely use them post-Covid - it just makes it easier for the specialists to not have to be co-located. The computer room is not a great place to be - it's noisy, it can be cold, it can be warm. You want to be there as much as necessary, but as little as possible.” It could also be useful for colos whose customers don’t want them to enter their cage.
Changing the doors for its Cambridge-1 supercomputer was an incredibly small step, but it could arguably be the first one towards designing data centers more for robots than for humans.
"What happens when the data center doesn't need to be designed for humans?" asks the Uptime Institute's VP of research Rhonda Ascierto.
"When you take that requirement away, and you only have, say, really small robots carrying out tasks, the shape of data centers could change. You could have very tall cylindrical data centers that fit in between buildings. You might not need oxygen. You could operate them at high temperatures. That really excites me."
A lot of that potential requires drastic improvements in how robots operate and understand the world. "Some of these technologies are in development right now – things like robot navigation, computer vision, motion planning and device tooling for what the robot will employ to do that operation,” Kava said. “These have advanced exponentially over the last few years,” he added, admitting that they still had far to go.
To speed up the development and testing of robotics in the real world, there are those who hope to simulate it in the virtual one. "Just imagine that you work on your robot all day, and then you build the software at night, and then when it's built, you run 100,000 tests in simulation, come in next day and get the statistics out," Unity's SVP of AI Danny Lange said.
His company is best known for developing its eponymous video game engine, but now believes that the simulation platform can be repurposed for industries like robotics.
"We have gotten into robotics by popular demand,” he said. “Over the years, many robotics developers have tried to use Unity, but there have been some shortcomings. What we have done is to basically start from the beginning and address all those shortcomings, which have primarily been physics.
“As a gaming engine, Unity has ‘Disney physics’ so you can bounce around and it's entertaining. What we added is a physics engine from Nvidia and APIs for articulation, and basically, common needs of roboticists to model common robotic models."
Now the company hopes to not only help simulate the digital twin of a robot, but the effect of time and different scenarios on the robot. “If you want to test a household robot, you can now generate millions of different furniture layouts that the robot navigates,” he said. “A lot of people say 3D models are important. It's actually 4D, because there's a sequence to it. The robot is not just sitting there, it is moving from one state to another.”
Still, simulation can only get you so far, especially when robots have to interact with unpredictable humans, as we can see in the lengthy simulated and real-world tests of self-driving cars that are still not ready for prime time.
Accidents will happen
In the data center, companies can be less open about robotics accidents than those running public self-driving trials. But accidents happen. Speaking to Protocol, former Google data center contractor Shannon Wait said the company used an automated machine to help lift heavy battery racks - but it was stopped after a few weeks when it pinned a co-worker to a wall. Google did not respond to requests for comment.
“There is a reality gap,” Lange said. “There's always a reality gap. There's something with the robot that may be slightly different from what you simulated. We try to make that smaller, to minimize that gap by using a lot of noise and randomization, but it’s there.”
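What Lange describes, injecting noise and randomizing the simulated world so the robot never overfits to one idealized environment, is commonly known as domain randomization. A minimal, hypothetical Python sketch of the idea (the simulator, parameters, and success model here are all invented for illustration; this is not Unity's API):

```python
import random
import statistics


def simulate_trial(rng: random.Random, friction: float, sensor_noise: float) -> bool:
    """Stand-in for one simulated navigation run. A real simulator
    would step physics; here the chance of success simply drops as
    the randomized conditions get harder."""
    difficulty = abs(friction - 0.5) + sensor_noise  # lands in [0.0, 0.6]
    return rng.random() > difficulty


def randomized_campaign(n_trials: int, seed: int = 42) -> float:
    """Run many trials, each in a differently randomized world, then
    report the aggregate statistics rather than a single result."""
    rng = random.Random(seed)
    passes = [
        simulate_trial(
            rng,
            friction=rng.uniform(0.2, 0.8),      # randomized surface friction
            sensor_noise=rng.uniform(0.0, 0.3),  # randomized sensor noise
        )
        for _ in range(n_trials)
    ]
    return statistics.mean(passes)  # overall pass rate across worlds


pass_rate = randomized_campaign(100_000)
print(f"pass rate across randomized worlds: {pass_rate:.2f}")
```

The design intent is the one Lange gestures at: because each of the 100,000 trials sees slightly different physics and sensor noise, a controller that scores well across the whole campaign is less likely to fall into the reality gap when a real floor or a real lidar differs from the simulated one.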
It’s something that data center operators will have to bear in mind as more robotic solutions become available. They potentially offer a way to patrol and manage remote Edge locations, visit facilities virtually in a pandemic, and lift heavy equipment that humans struggle with.
Further in the future, they could allow for radically redesigned data centers built without people in mind.
But they are also a risk in their own right, and will need to be carefully deployed to not crush humans, damage servers, or become security risks instead of security guards.
“The future is really, really exciting,” Kleyman said this year. “Let go of your fear and some of those uncertainties and know that the adoption of autonomous and intelligent systems are going to compensate for a lot of the challenges that we might be facing in the near future.”
But, he added, “look for these kinds of solutions that are human-centric and aim to augment human capabilities in supporting our critical infrastructure.”