Edge computing is still the top subject in my inbox, as hardware vendors, data center providers, and everyone else scramble to pile onto this perceived opportunity. But is edge really such a great deal, and are the problems with the concept really being solved?
Edge computing means the provision of resources close to where they are consumed, so that users get their streaming media quickly, and IoT devices can be monitored and controlled without a huge lag, because the server rack is physically close by.
Swimming against the tide
That’s simple enough, but there are issues here. In a lot of ways, edge computing goes against the flow of recent data center developments.
For the last several years, all the major forces in data centers have pushed towards centralization. There’s been colocation, virtualization and the cloud, all pulling IT into a virtual world, where it runs on large aggregations of physical resources that are built in places where they can be run most efficiently.
The server and storage pools in these facilities can be run with less redundant capacity, so as much as 80 percent of the hardware can disappear, compared with distributed, inefficient single-owner sites. And the overall facility can be run with less wasted energy and lower emissions, because it can be placed in a cool environment, and preferably one with plenty of renewable power available.
Build an edge facility, and you lose most of these benefits - even if you build one with a clever design which makes the most of the small space, or which uses its urban location to find a market for its waste heat - as happens in Sweden.
Built on a small scale, without the opportunity to pick the location, it seems inevitable that edge data centers will be more expensive to provide.
Given this difficulty, I suspect that in many cases the need for edge data centers is being overstated. You may want low latency for your application, but do you need it? And do you need it so badly that you are prepared to pay significantly more for the resources to provide it?
Can you pay for edge?
Financial services can always afford the tech to meet a perceived imperative for speed, and streaming media services like Netflix have a real requirement to deliver data quickly and without jitter. For other services - in particular consumer services on a freemium model - speed versus cost may be the crucial trade-off. Realistic market projections will take this into account, I hope.
Even if edge resources are a real need for a particular customer, it may not be the sort of market vendors are hoping for. Some companies are selling uniform small modular systems, in the expectation that a broad range of customers can pick them up and use them directly.
But others aren’t so sure. Vertiv, the new spin-off that used to be Emerson’s Network Power division, says it is going after the edge market. But it doesn’t have a specific product for it. Mike O’Keeffe, Vertiv VP for services, explained that the edge is not homogeneous, and the only way to do it right is to start off with partners who sell to vertical markets: “Working with partners is a way to learn. We’ll take that learning down to smaller customers.”
A version of this article appeared on Green Data Center News