Edge applications imply that data is collected and acted upon by local users or devices, and that not all of this data has to travel to the "core." In fact, it is preferable if most of it doesn’t.
To illustrate, Iceotope’s Peter Hopton gave the following example: “Imagine you walk into a room and there’s a sensor recording masses of data, such as ‘is there someone in the room this second.’
“Eventually, you turn that data into a bunch of statistics: ‘today on average so many people were in the room, these were the busy times of the room.’ That’s processed data, but the raw data is just bulk crap. You want to pull the gems out of that in order to make it usable.”
Sifting through the dirt
In the case of autonomous cars, for instance, you might have “seven videos in HD and a set of sensor radar,” but “all you want to know is where are the potholes.”
“So you take that data, you put it into a computer at the Edge of the network, finding potholes, diversions and traffic conditions, and reporting them back to the core. Next day other vehicles can be intelligent, learn, and change their rule set.”
Similarly, HPE installed AI systems for a customer in the food and beverage industry, and placed servers at the Edge for the collection and observation of data, sending it back to the core for the learning part of the process.
You can’t shift all the data to the core, whether centralized colocation or cloud data centers, Hopton explained, because of the limits of physics.
“If you transmitted all that raw data from all seven cameras and that radar and sonar from every autonomous car, you would just clog the bandwidth.”
He added: “There are limits for the transmission of data that are creeping up on us everywhere. Every 18 months we get twice as much data from the chips using the same amount of energy because of Moore’s Law, but we’ve still got the ceiling on transmitting that.”
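The bandwidth argument can be made concrete with a rough back-of-envelope calculation. Every figure below is an illustrative assumption (stream bitrates, driving time, report sizes are not from the article); the point is only the rough scale of the reduction an Edge node achieves by sending pothole reports instead of raw footage.

```python
# Back-of-envelope: raw autonomous-car sensor data vs. Edge-processed output.
# All figures are illustrative assumptions, not measurements.

HD_CAMERA_MBPS = 8       # assumed bitrate of one HD video stream, Mbit/s
NUM_CAMERAS = 7          # the "seven videos in HD" from the example above
RADAR_SONAR_MBPS = 4     # assumed combined radar/sonar bitrate, Mbit/s
DRIVE_HOURS = 2          # assumed daily driving time per car

# Raw data generated per day, converted Mbit -> GB.
raw_mbit = (HD_CAMERA_MBPS * NUM_CAMERAS + RADAR_SONAR_MBPS) * 3600 * DRIVE_HOURS
raw_gb = raw_mbit / 8 / 1000

# An Edge node reduces this to compact events: assume 200 pothole/diversion
# reports of roughly 1 KB each per day.
processed_gb = 200 * 1e3 / 1e9

print(f"raw: {raw_gb:.0f} GB/day, processed: {processed_gb:.6f} GB/day")
print(f"reduction factor: {raw_gb / processed_gb:.0f}x")
```

Under these assumed numbers a single car generates tens of gigabytes of raw sensor data per day, while the reports sent back to the core fit in a fraction of a megabyte, a reduction of several orders of magnitude. Multiply the raw figure by a fleet of vehicles and the "clog the bandwidth" problem is clear.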
For Steven Carlini, the director of innovation at Schneider Electric’s IT division, putting compute at the Edge is “an effort from customers that want to reduce latency, the primary driver of Edge, in the public eye at least.”
A lot of the cloud data centers were built outside urban or very populated areas, firstly because infrastructure operators didn’t want to have to deal with an excess of network traffic, and secondly because it cost them less to do so.
As the shift to the cloud occurred, however, it became clear that latency was “more than they would like or was tolerable by their users.”
“The obvious examples are things like Microsoft’s Office 365 that was introduced and was only serviced out of three data centers globally. The latency when it first came out was really bad, so Microsoft started moving into a lot of city-based colocation facilities. And you saw the same thing with Google.”
As well as addressing issues of bandwidth and latency, the Edge helps lower the cost of transmitting and storing large amounts of data.
This, Carlini said, is an effort on service providers’ part to reduce their own networking expenses.
Another, perhaps less obvious argument for Edge, explained Hopton, is that as 5G brings about a massive increase in mobile data bandwidth, an unintended consequence will be that mobile phone batteries run out of charge sooner, because they will be transmitting a lot more data over the same distance.
The obvious way to address this, he said, is to make the data “transmit over less distance.”
“Otherwise, everyone’s going to be charging their phones every two hours.”
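Hopton's distance argument follows from basic radio physics. Under an idealized free-space model (the Friis transmission equation), the transmit power needed to deliver the same signal strength to the receiver scales with the square of the distance, so moving the endpoint closer cuts the radio's energy cost sharply. The distances below are assumptions for illustration, not figures from the article.

```python
# Free-space path loss sketch: transmit power required to hold received
# power constant scales as distance squared (idealized Friis model;
# real cellular environments attenuate even faster than this).

def relative_tx_power(distance_m: float, reference_m: float = 1000.0) -> float:
    """Transmit power needed at distance_m, relative to reference_m."""
    return (distance_m / reference_m) ** 2

# Moving from an assumed 1 km macro cell to an Edge node 100 m away:
# each transmission needs roughly 1% of the original radio power.
print(relative_tx_power(100.0))
```

Even allowing for the model's idealizations, the square law explains why shortening the hop from phone to compute, rather than shrinking the data alone, is the lever Hopton points to for battery life.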
However distributed it is, the Edge won’t replace the cloud. All of the results and logs will be archived using public infrastructure.
“It’s not a winner takes all,” Loren Long, co-founder and chief strategy officer at DartPoints, told DCD. “Neither is the IT world in general.”
“The outlook that the Edge is going to compete with or take over the core,” he said, “is a very binary outlook,” which he joked is probably only used “to sell tickets to conferences.”
Long compared the situation to city planning, and the human body. Building residential streets, he said, doesn’t “lessen the need for highways,” nor does having capillaries at our fingertips reduce the need for large arteries.
“So as the Edge grows, the core is going to continue to grow.”
This doesn’t mean that applications at the core, or the cloud computing model, will remain the same, however. “But it’s not going to kill anything; everything is going to continue to grow.”
Long talks about a stratification of processing, “where a lot more data analytics and processing and storage happens at the core, but onsite processing happens at the Edge.”
“There’s no competition,” he said. “It’s all complementary.”
While the cloud might continue to grow, HPE’s Colin I’Anson thinks the percentage of data processed at the Edge will increase. Research firm Gartner agrees: according to a recent study, half of all data will be created or processed outside of a traditional or cloud data center by 2022.
At the moment, Edge is tantalizingly poised between hype and implementation. Some say the telecom Edge is ready to go, while others point to problems (see article).
We’ll leave the last word to the most optimistic provider, Schneider’s Carlini: “What we’re seeing is a lot of companies mobilizing to do that, and we’re actually negotiating and collaborating with a lot of companies. We’re actually looking at and rolling out proof of concept 5G test site applications.”
The opportunity is there, but it’s still taking shape. While that happens, everyone’s pitch is changing to meet the current best ideas about how Edge will eventually play out.
This article appeared in the August/September issue of DCD magazine.