In the final installment of the Vertiv Edge Computing Innovation Day, DCD’s Alex Dickins sits down with three experts from Vertiv to discuss the realistic roadmap for distributed computing. Joining Alex is Greg Ratcliff, chief innovation officer; Greg Busch, advanced product development engineer for liquid cooling; and Anton Chuchkov, product manager for IT solutions.

The Edge up until now

The concept of Edge computing is evolving rapidly. Once a novel idea, it has become a critical necessity, instrumental in managing the high data volumes and real-time processing demands that define modern technology.

To kick off the conversation, Chuchkov highlights the diverse applications of Edge computing, which span from on-premises systems and robotics to data centers and even mobile platforms like self-driving cars.

He emphasizes the importance of lightweight, efficient solutions, particularly for mission-critical use cases such as autonomous vehicles.

Ratcliff elaborates on this by stressing that Edge computing must operate close to where it is needed, minimizing latency and optimizing energy efficiency. He notes:

“If I ask an Edge AI system to identify whether I'm welding a pipe that's round or flat, or whether something is a cat or a stop sign, the system needs to process this information almost instantly.”

“By keeping computing near the point of use, we significantly reduce response times and can utilize less powerful hardware compared to massive, centralized systems. It's similar to how society functions, with giant warehouses, distribution centers, and local stores all working together – Edge computing scales in the same way.”

Innovations in cooling and power

The panel transitions to discussing the critical role of cooling technologies in Edge computing as data generation accelerates. Busch explains the shift from air cooling to liquid cooling, emphasizing its necessity for managing the increasing compute power required by modern applications:

“It may take some time for liquid cooling technology to proliferate to the Edge, but it is inevitable. As we see with autonomous vehicles generating terabytes of data that need instant processing, the demand for liquid cooling in Edge applications will continue to grow,” he asserts.

Ratcliff takes this further, discussing the rising density and energy requirements of Edge infrastructure, stating that traditional air-cooling methods are no longer sufficient on their own.

He explains how this challenge drives the need to redesign Edge facilities to accommodate high-performance computing: “With the need to cool increasingly powerful processors, we’re looking at liquid cooling directly to the chip, as well as using air cooling for other peripherals.”

The group delves into the complexities of deploying next-generation cooling technologies in Edge environments, emphasizing that the increasing sophistication of these systems will necessitate significant power considerations in future infrastructure planning.

What are we building the Edge for?

Chuchkov highlights key use cases driving Edge computing, particularly in autonomous vehicles and defense.

He notes: “I’ve observed a growing trend toward deploying large language models (LLMs) on-premises, with some providers offering models that can run efficiently on a single computer or A100 board rather than requiring full racks. These setups often utilize closed-loop liquid cooling, similar to high-end gaming rigs repurposed for LLM workloads.”

He further explains that lightweight LLMs are particularly suited for tasks like software support development, where inference rather than training is critical.

While many enterprises continue to prefer cloud-based LLMs for security reasons, on-premises solutions are gaining traction for specific applications. The group also discusses the evolving focus from training to inference workloads and how this shift impacts infrastructure design and deployment strategies.

Reflecting on the future, Ratcliff shares: “What excites me about today’s pace of change is the clarity it brings for the future. Working in innovation and pilot projects, I see untapped growth opportunities, such as utilizing surplus power in urban areas at night for Edge training. Every day, the future becomes clearer, revealing new possibilities that once required massive data centers.”

Collaboration and integration

As the discussion progresses, Dickins seeks to uncover how the panelists – engineers and business professionals – collaborate to ensure reference designs remain cohesive and reflective of various business priorities, particularly in light of geographical considerations for Edge data centers.

Busch explains: “ARPA-E, the Advanced Research Projects Agency-Energy, a branch of the US Department of Energy (DOE), funds high-risk, high-reward technologies often overlooked by industry. It is a counterpart to the Defense Advanced Research Projects Agency (DARPA), which funds cutting-edge research and technology for defense and national security purposes.”

“One project funded by ARPA-E, COOLERCHIPS, aims to develop compact, deployable data centers that deliver over a megawatt of computing power while focusing on sustainability,” Busch continues. He adds that the project targets a power usage effectiveness (PUE) of 1.05 or less, ensuring less than five percent of energy is consumed by supporting systems.
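PUE is simply the ratio of total facility energy to the energy delivered to IT equipment, so a target of 1.05 caps cooling and power-distribution overhead at five percent of the IT load. A minimal sketch of the arithmetic (the figures below are illustrative, not taken from the COOLERCHIPS project):

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_load_kw

# Illustrative figures: a 1 MW IT load with 50 kW of overhead
# (cooling, power conversion, lighting, etc.)
it_load_kw = 1000.0
overhead_kw = 50.0

print(round(pue(it_load_kw + overhead_kw, it_load_kw), 2))  # 1.05
```

A traditional air-cooled facility might run at a PUE of 1.5 or higher, so hitting 1.05 leaves very little energy budget for anything other than the compute itself.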

To achieve this, the team is implementing a hybrid cooling strategy that combines direct-to-chip two-phase cooling for high-load components with single-phase immersion cooling for peripherals, enhancing overall efficiency and enabling AI deployment across various settings.

High-density infrastructure at the Edge

Chuchkov emphasizes the importance of retrofitting existing systems and ensuring redundancy in cooling and power solutions to improve reliability. Busch adds insights into the significant efficiency gains achieved by adopting advanced liquid cooling methods, which unlock new levels of performance and enable higher-density infrastructure at the Edge.

Ratcliff, Busch, and Chuchkov then explore innovative ways to repurpose excess heat from Edge computing, considering its potential for applications like district heating and food production. Ratcliff underscores the importance of integrating power, thermal systems, and external pumps to maximize performance, while Busch stresses collaboration with chipset manufacturers to ensure that cutting-edge cooling solutions are effectively integrated with existing systems.

“As we discuss modules and systems, we’re really talking about systems integration,” Ratcliff explains. “With the time-sensitive demands of AI workloads, the solutions we develop must adapt to these changing requirements. This emerging workload has a unique profile – training activities differ from inference, and reliability needs in these areas can vary as well.”

Future headwinds and opportunities for Edge computing

Looking ahead, Chuchkov sees a shift toward application-specific chipsets that can deliver more power-efficient workloads tailored to the Edge market. The panel discusses the challenge of finding the right balance between updating old systems and rolling out these new technologies, both of which are crucial for optimizing Edge infrastructure.

Ratcliff points out that AI has the potential to dramatically boost software efficiency, with many gains still waiting to be realized. He suggests that the next big breakthrough could come from making the software on AI systems even more efficient:

“I think the next big shift will be in how AI systems actually improve the software they’re running,” Ratcliff says.

“We’ve been here before – back when CPUs were maxed out, we saw solutions like virtualization and multi-core processing emerge. I think we’re reaching a similar point with AI. As we push these systems to their limits, we’ll find new ways to boost efficiency, especially in the software, without needing huge hardware changes.”

Vertiv’s experts wrap up by stressing the need for a systems-thinking approach to handle the changing landscape of Edge computing. Embracing this mindset will be key to overcoming challenges and seizing new opportunities in the field.

Watch the full broadcast from the Edge Computing Innovation Day here.