In the not-so-distant future, if not tomorrow, you will awaken feeling refreshed and alert from a good night’s sleep in a smart home with automatically adjusted temperature and lighting. Enjoy breakfast in a smart kitchen with a smart refrigerator that never forgets to order more milk or OJ. Head out for your morning run wearing an IoT-enabled athletic shirt that provides real-time biometric readings. And then drive your smart car to the smart city where you will undoubtedly do smart work in a smart building.

Cisco estimates that by 2020 the IoT will consist of 50 billion devices connected to the Internet. As smart systems and applications become ever more ubiquitous in our daily lives and businesses, enterprise leaders and IT decision-makers will need to leverage new methodologies and infrastructure to analyze the coming tsunami of decentralized data.

Healthcare, transportation, utilities, oil and gas, retail, and agriculture are just a few of the industries that will be transformed by smart systems and applications as technology evolves from a state of connected things to the Internet of Everywhere. But IoT deployments require information processing closer to the source of the data – the IoT devices themselves. Instead of incurring the cost and latency of sending this information to the public cloud or an on-premises centralized data center, businesses will need to incorporate edge computing into their infrastructures.

Why proximity matters

Broadly defined, edge computing localizes data acquisition and control functions, storage of high-bandwidth content, and applications in close proximity to the end user. Edge solutions sit at a logical endpoint of a network, whether the public Internet or a private network, creating a more distributed cloud computing architecture. This reduces the communications bandwidth needed between IoT sensors and the central data center or cloud application by performing analytics and knowledge generation at or near the source of the data.
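
To make the pattern concrete, the sketch below shows an edge gateway that acquires raw sensor readings locally and forwards only a per-window summary to the central cloud, so analytics happen at the source and only a fraction of the raw data crosses the WAN. It is a minimal illustration, not any particular product; the endpoint URL, window length, and simulated sensor read are hypothetical placeholders.

```python
import json
import random
import statistics
import time
import urllib.request

CLOUD_ENDPOINT = "https://cloud.example.com/telemetry"  # hypothetical central ingest URL
WINDOW_SECONDS = 60                                     # local aggregation window

def read_sensor() -> float:
    """Stand-in for a real local acquisition call (Modbus, MQTT, GPIO, etc.)."""
    return 20.0 + random.random() * 5.0

def run_edge_gateway() -> None:
    while True:
        window, start = [], time.monotonic()
        # Acquire raw readings locally instead of streaming every sample upstream.
        while time.monotonic() - start < WINDOW_SECONDS:
            window.append(read_sensor())
            time.sleep(1)
        # Analytics at the source: many samples shrink to a small summary.
        summary = {
            "count": len(window),
            "mean": statistics.fmean(window),
            "min": min(window),
            "max": max(window),
        }
        req = urllib.request.Request(
            CLOUD_ENDPOINT,
            data=json.dumps(summary).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)  # only the summary crosses the WAN link
```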

There are four key reasons why edge computing will become essential to enterprise business operations and companies’ IT infrastructure:

  • Increased speed, through reduced computing latency
  • Improved security, because the data remains closer to where it was created
  • Better scalability, because edge computing is fundamentally distributed computing, which improves resiliency, reduces network load, and scales out more easily
  • Reduced cost, by lowering the frequency and size of data transmissions back to the central cloud

Today, approximately 10 percent of enterprise-generated data is created and processed outside a traditional, centralized data center or cloud. But according to the research and advisory firm Gartner, that figure will reach 50 percent by 2022.

Occupying the link between connected devices and the cloud, edge computing comprises local devices, such as a network appliance or server that translates cloud storage APIs; localized data centers of one to ten racks that provide significant processing and storage capabilities, including prefabricated micro data centers; and regional data centers of more than ten racks that sit closer to the user and data source than centralized cloud data centers.

Single-rack micro data centers can leverage an existing building and its cooling and power, saving the CapEx of constructing a new, dedicated site. Multi-rack micro data centers are more capable and flexible due to their scale, but require more installation time and their own dedicated cooling. These prefabricated, single-enclosure systems suit a broad range of applications that require low latency and/or high bandwidth, as well as added security or availability.

Regional data centers have more processing and storage capability than localized, one-to-ten-rack data centers, but also need dedicated power and cooling sources. Latency depends on physical proximity to the users and data, as well as the number of network hops in between.

Optimizing the edge

Managing power usage in any data center environment, whether on-premises, a distributed data center architecture, or a regional or micro data center facility serving the edge, can be complex. The facility team for a particular building may be tasked with measuring and managing power at rack and Power Distribution Unit (PDU) levels, but often has limited visibility into server consumption. Moreover, there are multiple proprietary power measurement and control protocols supported by various solution providers, making it challenging to have a single solution for power management across all devices in the data center. A data center management solution that provides accurate, real-time power and thermal monitoring and management for individual servers, groups of servers, racks, and other IT equipment, such as PDUs, offers multiple benefits for IT administrators in edge computing environments.
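
As a rough illustration of what per-server power monitoring looks like underneath such a solution, the hedged sketch below polls the standard DMTF Redfish Power resource on each server's BMC and rolls the readings up to a rack total. The BMC addresses, credentials, and chassis ID are hypothetical, and a real deployment would rely on the management solution's own inventory and protocol support rather than a hand-rolled script.

```python
import requests  # third-party HTTP client (pip install requests)

# Hypothetical BMC addresses for one rack; a DCIM tool would supply the real inventory.
RACK_BMCS = ["10.0.1.11", "10.0.1.12", "10.0.1.13"]

def server_power_watts(bmc_ip: str, auth=("admin", "password")) -> float:
    """Read instantaneous draw from the DMTF Redfish Power resource.
    The chassis ID ('1' here) and credentials vary by vendor and site."""
    url = f"https://{bmc_ip}/redfish/v1/Chassis/1/Power"
    resp = requests.get(url, auth=auth, verify=False, timeout=5)
    resp.raise_for_status()
    return resp.json()["PowerControl"][0]["PowerConsumedWatts"]

def rack_power_watts(bmcs=RACK_BMCS) -> float:
    """Roll per-server readings up to a rack-level figure for the facility team."""
    return sum(server_power_watts(ip) for ip in bmcs)

if __name__ == "__main__":
    print(f"Rack draw: {rack_power_watts():.0f} W")
```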

Along with ease of use, simplicity of deployment, and interoperability across diverse server models and a variety of products from PDU and rack suppliers, a data center management solution can provide real-time power and thermal data for racks, rows, blades, and data center rooms, helping IT staff manage data center hotspots and perform power usage planning and forecasting.
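
For hotspot management specifically, a minimal sketch of the idea: group server inlet temperatures by rack and flag any rack whose average exceeds a chosen threshold (27°C here, the upper end of the commonly cited ASHRAE recommended inlet range). The rack names and readings are invented for illustration.

```python
from statistics import fmean

# Hypothetical server inlet temperatures (°C) grouped by rack, as a DCIM tool might report them.
INLET_TEMPS = {
    "rack-A01": [24.1, 25.0, 26.3, 25.8],
    "rack-A02": [27.9, 28.4, 29.1, 28.0],
    "rack-B01": [22.5, 23.0, 23.4, 22.8],
}

RECOMMENDED_MAX_C = 27.0  # upper end of the commonly cited ASHRAE recommended inlet range

def hotspots(readings=INLET_TEMPS, limit=RECOMMENDED_MAX_C):
    """Return racks whose average inlet temperature exceeds the limit."""
    return {rack: round(fmean(temps), 1)
            for rack, temps in readings.items()
            if fmean(temps) > limit}

print(hotspots())  # flags rack-A02 in this made-up data
```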

Detailed information about server power characteristics can also help IT set fixed rack-level power envelopes and safely increase server count per rack, which improves data center utilization. Aggregating that data for the rack, row, and room with server inlet temperature data creates a real-time view of the power the servers consume.
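
A back-of-envelope example, with hypothetical numbers, of why measured power data lets IT raise server count per rack: sizing to vendor nameplate ratings strands capacity that measured peaks, plus a safety margin, can reclaim.

```python
RACK_BUDGET_W = 10_000   # hypothetical fixed rack power envelope
NAMEPLATE_W = 500        # vendor-rated draw per server
MEASURED_PEAK_W = 340    # peak draw observed via real-time monitoring
HEADROOM = 0.10          # safety margin kept above the measured peak

servers_by_nameplate = RACK_BUDGET_W // NAMEPLATE_W
servers_by_measurement = int(RACK_BUDGET_W // (MEASURED_PEAK_W * (1 + HEADROOM)))

print(servers_by_nameplate)    # 20 servers when sized to nameplate ratings
print(servers_by_measurement)  # 26 servers when sized to measured peaks plus headroom
```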

Lastly, a data center management solution can help IT staff manage power across devices from multiple suppliers, eliminating the need for separate, equipment-specific tools and empowering data center managers to make data-driven decisions that keep operations running during power outages and improve business continuity. A smart system or Internet-connected device, whether serving a consumer or industrial application, is only effective when it remains resilient and up and running.

Jeff Klaus is general manager of Intel Data Center Management Solutions.