From Fans to Fluids - The Critical Need for Liquid Cooling in AI Data Centers
Open Compute Project via YouTube
Overview
Explore the critical transition from air to liquid cooling in modern AI data centers in this 15-minute conference talk. Learn why traditional air cooling has reached its limits as AI infrastructure pushes power densities beyond 700 W per GPU and rack power envelopes past 80 kW. Discover the data-driven tipping point that makes liquid cooling essential rather than optional, particularly once facility power density envelopes surpass 40 kW, where moving enough air becomes physically impractical. Understand how liquid cooling enables ultra-high compute density, becoming required at power levels above 1000 W per server and especially compelling above 2000 W. Examine the need for thermal modeling software that supports both liquid and air cooling within a single model. Gain insights into real-time monitoring of flow rates, pressure, and thermal data, along with API-based management strategies for optimizing energy efficiency in AI and HPC environments, all framed by the Open Compute Project's Technology Cooling Systems principles.
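The thresholds cited in the overview can be read as a simple decision rule for when liquid cooling stops being optional. The sketch below is purely illustrative: the function name and structure are assumptions, but the numeric cutoffs (40 kW per rack envelope, 1000 W per server) come from the figures quoted above.

```python
def needs_liquid_cooling(server_power_w: float, rack_power_kw: float) -> bool:
    """Hypothetical rule of thumb based on the talk's stated thresholds.

    Per the overview: air circulation becomes impractical once rack/facility
    power envelopes exceed ~40 kW, and liquid cooling becomes required above
    ~1000 W per server (and especially compelling above 2000 W).
    """
    return rack_power_kw > 40 or server_power_w > 1000


# A ~700 W GPU server in a modest 20 kW rack can still be air-cooled;
# a 1500 W server or an 80 kW AI rack crosses the liquid-cooling tipping point.
print(needs_liquid_cooling(700, 20))    # False
print(needs_liquid_cooling(1500, 30))   # True
print(needs_liquid_cooling(2400, 80))   # True
```

This is only a way to visualize the tipping point the talk describes, not a sizing methodology; real deployments would use the thermal modeling tools and live flow/pressure telemetry the talk covers.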
Syllabus
From Fans to Fluids - The Critical Need for Liquid Cooling in AI Data Centers
Taught by
Open Compute Project