From Fans to Fluids - The Critical Need for Liquid Cooling in AI Data Centers
Open Compute Project via YouTube
Overview
Explore the critical transition from air to liquid cooling in modern AI data centers in this 15-minute conference talk. Learn why traditional air cooling has reached its limits as AI infrastructure pushes power densities beyond 700W per GPU and rack power envelopes past 80kW. Discover the data-driven tipping point that makes liquid cooling essential rather than optional, particularly once facility power density envelopes surpass 40kW, where moving enough air becomes physically impractical. Understand how liquid cooling enables ultra-high compute density, becoming necessary above 1000W per server and especially compelling above 2000W. Examine the need for thermal modeling software that supports both liquid and air cooling within a single model. Gain insights into real-time monitoring of flow rates, pressure, and thermal data, along with API-based management strategies for optimizing energy efficiency in AI and HPC environments, all framed by the Open Compute Project's Technology Cooling Systems principles.
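The power thresholds in the overview can be read as a simple decision rule. The sketch below encodes them in Python; the function name and the rule itself are illustrative assumptions for this summary, not part of any OCP specification or the talk's material.

```python
# Illustrative sketch of the cooling-decision thresholds described above.
# The threshold values (1000W/2000W per server, 40kW/80kW per envelope)
# come from the talk summary; the function itself is a hypothetical example.

def recommended_cooling(server_power_w: float, envelope_power_kw: float) -> str:
    """Suggest a cooling approach from per-server power and the
    power envelope (rack/facility density) in kilowatts."""
    if server_power_w > 2000 or envelope_power_kw > 80:
        return "liquid"   # ultra-high density: liquid cooling is essential
    if server_power_w > 1000 or envelope_power_kw > 40:
        return "liquid"   # past ~40 kW, air circulation falls short
    return "air"          # traditional air cooling still viable

print(recommended_cooling(700, 30))    # air
print(recommended_cooling(1200, 50))   # liquid
```

In practice the talk frames this as a continuum rather than a hard switch, with thermal modeling software evaluating both regimes within the same model.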
Syllabus
From Fans to Fluids - The Critical Need for Liquid Cooling in AI Data Centers
Taught by
Open Compute Project