Streamlining network deployments for Enterprise AI
This session took place on September 23, 2024
This episode is available on-demand.
Generative models are driving AI advancements, demanding high-bandwidth, low-latency networks to handle ever-larger volumes of increasingly complex data. Scaling infrastructure and adopting new technologies are essential, but choosing the right tools is key to achieving performance, efficiency, and future adaptability. For example, while NVLink accelerates GPU-to-GPU communication within a node, it is optical networks that will truly scale AI across data centers. During this episode we explore:
- The battle of InfiniBand versus Ethernet: Which is right for different deployments?
- Practical strategies for implementing AI networks and the role of structured cabling
- The evolution of transceiver technology to meet AI requirements
- The role of IEEE and TIA standards in shaping the future of AI network deployments
Brought to you by Panduit