Despite the image of data centers as large, power-hungry facilities, the reality is that much of the anticipated growth in energy consumption will occur at the edge. While edge data centers are far less obtrusive than their cloud-scale counterparts, they host mission-critical applications and, as such, must be designed, built, and operated to similar, if not the same, standards of resilience, efficiency, and sustainability as hyperscale facilities.

According to Gartner, 75 percent of enterprise data is expected to be created and processed at the edge by 2025. IDC also predicts massive growth, expecting the worldwide edge computing market to reach a value of $250.6 billion by 2024, a compound annual growth rate (CAGR) of 12.5 percent from 2019.

There are several factors driving the proliferation of data and its consumption at the edge. Among them is the demand for low-latency applications, including digital streaming from film, TV, and music platforms. The rise in connected IoT devices, artificial intelligence (AI), and machine learning is driving a surge in digital transformation across almost every industry. Many organizations are designing new experiences, reimagining business processes, and creating new products and digital services, all of which rely on innovative and resilient technologies to underpin them.

This is leading to more data being created and shared across the network, ultimately causing delays in transmission and download, known as latency. To overcome such network congestion, data must be stored and processed close to where it is generated and consumed, a trend known as edge computing.

One of the challenges that emerges from the prolific growth at the edge is the energy demand fueling the transformation. The cost of energy production and the need to shift to more sustainable operations have long required designers of large data centers to embrace sustainability strategies. Now the same attention must be paid to the design of smaller facilities at the edge.

Energy demands at the edge

Today, various analyses suggest that data centers represent 1–2 percent of global electricity consumption, and that by 2030 as much as 3,000 TWh of energy will be used by IT, doubling its share of global electricity consumption. At the edge, deploying 100,000 data centers, each drawing 10 kW, would create 1,000 MW of power consumption for the IT load alone. Assuming a moderate power usage effectiveness (PUE) ratio of 1.5, these systems would also emit the equivalent of 800k tons of CO2 annually.

However, if each edge facility were standardized and designed for a PUE of 1.1, total CO2 emissions could be reduced to 580k tons annually. Clearly, the same due diligence must be applied to reducing power consumption at the edge as has long been the case for larger data centers. Consequently, there is also a clear benefit in producing pre-integrated systems in which standardization, modularity, performance, and sustainability form fundamental components.
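The arithmetic above can be sketched as a short calculation. Note that the carbon-intensity figure used here is an assumption reverse-engineered to be consistent with the article's 800k-ton figure (roughly 61 g of CO2 per kWh), not a published grid value; real grid intensities vary widely by region:

```python
# Back-of-the-envelope model of edge fleet energy use and CO2 emissions.
SITES = 100_000           # number of edge data centers
IT_KW_PER_SITE = 10       # IT load per site, in kW
HOURS_PER_YEAR = 8760

# Assumed carbon intensity (kg CO2 per kWh). Chosen so the result matches
# the article's ~800k-ton figure at PUE 1.5; this is an illustrative value.
CARBON_KG_PER_KWH = 0.061

def annual_emissions_tons(pue: float) -> float:
    """Total facility energy = IT energy * PUE; emissions scale linearly."""
    it_kwh = SITES * IT_KW_PER_SITE * HOURS_PER_YEAR  # annual IT energy, kWh
    total_kwh = it_kwh * pue                          # add cooling/power overhead
    return total_kwh * CARBON_KG_PER_KWH / 1000       # kg -> metric tons

print(f"PUE 1.5: {annual_emissions_tons(1.5):,.0f} t CO2/yr")  # ~800k tons
print(f"PUE 1.1: {annual_emissions_tons(1.1):,.0f} t CO2/yr")  # ~588k tons
```

Because emissions scale linearly with PUE in this simple model, cutting PUE from 1.5 to 1.1 removes just over a quarter of the total, in line with the article's figures.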

These building blocks offer users the ability to design, build, and operate edge data centers for greater sustainability, while energy-efficient technologies such as lithium-ion UPS systems and liquid cooling can help reduce burdens on the system, mitigate potential component failures, and allow for higher performance without negatively affecting PUE.

Open and vendor-agnostic, next-generation data center infrastructure management (DCIM) platforms are also essential, not just from a remote monitoring perspective, but to drive energy efficiency, security, and uptime. However, with edge demands accelerating, how can industry professionals get an understanding of the impact edge computing is having on the world’s energy consumption, and how focusing on efficiency and sustainability can influence that?

Forecasting edge energy consumption

Schneider Electric has recently developed a new TradeOff Tool, the Data Center & Edge Global Energy Forecast, which helps users model possible global energy consumption scenarios based on a set of pre-input assumptions. These include the design of the physical infrastructure systems and their associated PUE ratings, as well as the anticipated growth of data center and edge loads between now and 2040.

Based on these assumptions, the tool generates several forecast charts: total energy (TWh) consumed by both edge and centralized data centers, total IT energy (TWh), the total energy mix comparing the percentages consumed by the edge and central sectors, and the IT energy mix between the two.

The model's data is based on a capacity analysis created by IDC in 2019, from which Schneider Electric derived the likely ratio of centralized to edge IT load in 2021: 65 percent at the center and 35 percent at the edge. For 2040, the default ratios are 44 percent and 56 percent respectively.

Based on these assumptions, the growth rates for centralized and edge data centers are calculated at 6 percent and 11 percent annually. The tool allows users to adjust these values to reflect differing growth rates as conditions and/or assumptions change.
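The 2040 split follows directly from compounding the two growth rates on the 2021 baseline. A minimal sketch (the 19-year horizon from 2021 to 2040 is inferred from the dates given above):

```python
# Project the central-vs-edge IT load split by compounding annual growth.
central_2021, edge_2021 = 0.65, 0.35   # IDC-derived 2021 IT load split
central_cagr, edge_cagr = 0.06, 0.11   # default annual growth rates
years = 2040 - 2021                    # 19-year horizon

central_2040 = central_2021 * (1 + central_cagr) ** years
edge_2040 = edge_2021 * (1 + edge_cagr) ** years
total_2040 = central_2040 + edge_2040

print(f"2040 central share: {central_2040 / total_2040:.0%}")  # 44%
print(f"2040 edge share:    {edge_2040 / total_2040:.0%}")     # 56%
```

Reassuringly, compounding 6 percent and 11 percent for 19 years reproduces the tool's default 44/56 split, so the stated ratios and growth rates are mutually consistent.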

To derive the non-IT energy consumed by activities such as cooling and lighting, PUE values are estimated on the assumption that anticipated PUE ratings will improve as technology evolves and successive equipment generations become more efficient. For example, the tool's default values assume that a centralized data center's PUE will improve from 1.35 in 2021 to 1.25 in 2040, and that the average PUE of edge computing facilities will improve from 2.0 in 2021 to 1.5 in 2040.
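One simple way to model that improvement is a straight-line interpolation between the stated 2021 and 2040 defaults. The interpolation method itself is an assumption for illustration; the tool's internal improvement curve may differ:

```python
def pue_for_year(year: int, pue_start: float, pue_end: float,
                 start: int = 2021, end: int = 2040) -> float:
    """Linearly interpolate a PUE rating between two reference years."""
    frac = (year - start) / (end - start)
    return pue_start + (pue_end - pue_start) * frac

def total_energy_twh(it_energy_twh: float, pue: float) -> float:
    """Facility energy = IT energy * PUE; non-IT share is IT * (PUE - 1)."""
    return it_energy_twh * pue

# Edge defaults: PUE 2.0 in 2021 improving to 1.5 in 2040.
edge_pue_2030 = pue_for_year(2030, 2.0, 1.5)
print(f"Assumed edge PUE in 2030: {edge_pue_2030:.2f}")
```

Under this sketch, an edge site's 2030 PUE lands a little under the midpoint of 2.0 and 1.5, and the non-IT overhead for any year falls out of `total_energy_twh` once an IT energy figure is supplied.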

PUE ratios are also adjustable, meaning the user can run the tool under different possible scenarios to see the effect edge computing has on energy consumption and carbon emissions.

Final thoughts

With dependency on mission-critical infrastructure continuing to increase at a dramatic rate, energy efficiency and sustainability must become critical factors in the rollout of edge computing infrastructure. Greater accuracy, especially in terms of energy use, is essential; operators cannot afford to hit and hope, or to become more efficient as they go.

While energy management software remains critical, it is the design of these systems which offers end-users a truly practical means of ensuring sustainability at the edge. It requires greater standardization, modularity, resilience, performance, and efficiency to form the building blocks of edge environments.

Further, by considering energy efficient deployment methodologies and embracing a culture of continuous innovation, operators can choose a more sustainable approach to edge computing.