Exploring Energy Efficiency in Scientific and Industrial AI Workloads
Open Compute Project via YouTube
Overview
Explore the critical intersection of artificial intelligence and energy consumption in this 15-minute conference talk from the Open Compute Project. Delve into the energy efficiency challenges facing scientific research and industrial applications as AI workloads continue to scale and demand more computational resources. Learn about current approaches to measuring and optimizing energy consumption in AI systems, including hardware design considerations, software optimization techniques, and infrastructure strategies.

Discover real-world case studies from scientific computing environments and industrial deployments that demonstrate practical methods for reducing power consumption while maintaining performance. Examine the trade-offs between computational efficiency and energy usage, and understand how organizations are implementing sustainable AI practices.

Gain insights into emerging technologies and methodologies that promise to make AI workloads more environmentally responsible, including specialized hardware architectures, advanced cooling systems, and intelligent workload scheduling. Understand the broader implications of energy-efficient AI for both cost reduction and environmental sustainability in large-scale computing environments.
Syllabus
Exploring Energy Efficiency in Scientific and Industrial AI Workloads
Taught by
Open Compute Project