DCD>Project showcase - Does generative AI run on thin air?
We have entered a new era of AI: training and running generative AI require large data centers and edge nodes, with significant associated energy costs. Change is ahead, as computational demands drive new, efficient, and complex GPU/accelerator hardware with ever-increasing heat fluxes. Thermal management requirements will challenge the fluid medium of thin air and force data centers to bring liquid cooling close to the microelectronics. So how will all of this pan out?
Through project examples, this session will address the questions and challenges that arise from this new era: What future heat fluxes are expected? Will immersion cooling, direct-to-chip, or any other derivative of liquid cooling meet these needs? How large will the impact be on energy costs and the industry's drive for sustainability? And how will these developments act as a strong technical driver for change in data center design?