Edge computing will change the way data centers are provisioned, along with much of our daily lives, according to keynote presenters at DCD>Zettastructure in London today.
New applications will require on-demand data center capacity that can only be built with new approaches, according to Suresh Kumar, Microsoft’s vice president of cloud infrastructure. The collision of data processing with networks at the edge will turn everyone into telecoms players, said Cole Crawford, founder and CEO of Vapor IO. And the edge will have to scale beyond human abilities if it is to deliver on its promise of fixing and optimizing technology such as connected cars in real time, added Jim Fletcher, former CTO of IBM Watson.
Just-in-time capacity
“We need to take the cloud where our customers are,” said Kumar. This requires partners, he said, as Microsoft has been building out cloud regions very quickly, with its infrastructure equally split between its own data centers and leased facilities.
“It is very important for us to have standard building blocks to deliver the infrastructure we need,” he said. This allows independent failure zones spanning both leased space and Microsoft’s own facilities.
“Building a data center takes a lot longer than the time it takes to stuff that data center with servers,” he said. That’s why, in the Olympus project, Microsoft is collaborating with partners on data center server designs, to deliver functions and capacity just-in-time.
“Standardization by itself is not much use,” he said. “We have to have a mechanism to do deep planning together.” Instead of working project-by-project, Microsoft is sharing its long term plans so partners can deliver servers as needed to provide just-in-time capacity.
He also mentioned Microsoft’s involvement in building international fiber, and its gas powered data center prototype built with McKinstry and Cummins - a technology “that probably represents the future of what data centers are going to be.”
Scaling too fast for humans
“We need to solve for the next generation of users,” said Crawford, explaining that edge applications will create masses of data which must be handled very quickly - for instance because cars move at high speed.
“Connected cars need a four or five millisecond round trip time,” he said. “You have a sub-5ms decision time, so you need to have GPUs very close to that car, so data can get to that car and back.”
Virtual reality will likewise need local processing, because the human eye can distinguish 5.4Gbps of data, and fast processing to satisfy our sense of balance: “If you have anything more than a 5ms round trip [for VR data], you will feel ill.”
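Those round-trip budgets translate directly into a maximum distance between the device and the edge site. A back-of-the-envelope sketch (the fibre propagation speed and the example processing time are assumptions, not figures from the keynotes):

```python
# Rough latency-budget arithmetic: how far away can an edge site be
# if the total round trip must stay under a given budget?
# Assumption: signals in fibre travel at roughly two-thirds the speed
# of light, i.e. about 200 km per millisecond. Switching and queuing
# delays are ignored here.

C_FIBRE_KM_PER_MS = 200.0  # ~200,000 km/s in optical fibre

def max_edge_distance_km(round_trip_ms: float, processing_ms: float = 0.0) -> float:
    """Maximum one-way distance to the edge site for a given RTT budget.

    Whatever time is spent on processing (e.g. GPU inference) comes out
    of the propagation budget; the remainder covers the there-and-back trip.
    """
    propagation_budget_ms = round_trip_ms - processing_ms
    return (propagation_budget_ms / 2) * C_FIBRE_KM_PER_MS

print(max_edge_distance_km(5.0))       # 500.0 km with zero processing time
print(max_edge_distance_km(5.0, 3.0))  # 200.0 km if 3 ms goes to computation
```

Even under these generous assumptions, a 5ms budget rules out a distant central cloud region once any meaningful processing time is subtracted, which is the argument for putting GPUs close to the car.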
All of this will make edge resources very dependent on networks - and in particular on the licensed spectrum of mobile devices, through which most people now consume the Internet.
Networks with thousands of locations will have to be automated, said Crawford, “because humans do not scale well for the things we will be building in the next 20 years.”
Short shelf-life data
“Gartner says 90 percent of all data is useless when it is collected, but I reckon it’s more like 99 percent,” said Fletcher, who has recently moved from IBM to advisory firm Momenta Partners, warning that edge applications will require a completely different mindset: “Data has a shelf life, like milk. Put milk in the fridge and after a week it’s spoilt. With data it can become useless in a few seconds.”
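The shelf-life idea can be made concrete as a freshness filter at the edge: readings older than their useful window are discarded before they are ever shipped upstream. A minimal sketch, with invented names and an assumed five-second window:

```python
import time

# Illustrative only: drop sensor readings whose "shelf life" has expired
# before they reach an analytics pipeline. The 5-second window is an
# assumption for the example, not a figure from the article.

SHELF_LIFE_S = 5.0

def fresh_readings(readings, now=None):
    """Keep only readings collected within the shelf-life window.

    Each reading is a (timestamp, value) tuple; `now` defaults to the
    current wall-clock time.
    """
    now = time.time() if now is None else now
    return [(ts, v) for ts, v in readings if now - ts <= SHELF_LIFE_S]

readings = [(100.0, "ok"), (103.0, "ok"), (94.0, "stale")]
print(fresh_readings(readings, now=104.0))  # the reading from t=94.0 is dropped
```

Filtering at the point of collection is one way an edge node can discard the 90-99 percent of data that is useless the moment it is gathered, rather than paying to move it.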
Fletcher said: “The only way IoT will succeed is with edge,” predicting that by 2020 it will be a $20 billion market, handling 600 zettabytes per year - and perhaps adding $14.2 trillion to the world’s economies.
The IoT is more than just the application of IT to things, he said: it will move from operating things to instantaneous, automatic optimization: “The instrumentation will allow us to take data and transform it to actions before any problems occur.”
He echoed Crawford’s prediction that this will be more than human beings can manage unaided: “There are going to be too many sensors for human reactions.”