Overview
Explore a 14-minute conference presentation from OOPSLA 2025 that introduces a formal approach to activity analysis for automatic differentiation within the MLIR compiler infrastructure. Learn how researchers from McGill University, the University of Illinois at Urbana-Champaign, and Brium developed a sound and modular method for identifying inactive operations (those that do not contribute to the partial derivatives of interest), addressing a key optimization challenge in automatic differentiation tools.

Discover how the team formally defines activity analysis as an abstract interpretation, proves its soundness, and implements it to achieve significant performance improvements: a 1.24× geometric-mean speedup on CPU and 1.7× on GPU compared to baseline implementations without activity analysis. Understand the novel intraprocedural approximation based on function summaries, which maintains 100% accuracy relative to whole-program analysis while providing better modularity.

Gain insights into the formalization of MLIR's internal representation for automatic differentiation purposes, and explore the practical implications for domains ranging from neural network training to climate simulation, where computing derivatives is essential for performance-critical applications.
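To make the core idea concrete, here is a minimal sketch of activity analysis on a toy straight-line IR, using the classical formulation: a value is active only if it both depends on an active input (forward "varied" pass) and contributes to a derivative-relevant output (backward "useful" pass). This is an illustration under invented names and a simplified IR encoding, not the paper's MLIR implementation or its abstract-interpretation formalization.

```python
def activity_analysis(ops, active_inputs, active_outputs):
    """Classify values of a straight-line program as active or inactive.

    ops: list of (result, op_name, operands) tuples in topological order.
    Returns the set of active values: those that are both varied
    (depend on an active input) and useful (reach an active output).
    """
    # Forward pass: a result is "varied" if any operand is varied.
    varied = set(active_inputs)
    for result, _, operands in ops:
        if any(v in varied for v in operands):
            varied.add(result)

    # Backward pass: a value is "useful" if it feeds a useful result.
    useful = set(active_outputs)
    for result, _, operands in reversed(ops):
        if result in useful:
            useful.update(operands)

    # Active = varied AND useful; everything else needs no adjoint code.
    return varied & useful


# Toy program: differentiate t2 with respect to input x.
ir = [
    ("t1", "mul", ("x", "x")),   # depends on the active input x
    ("t2", "add", ("t1", "c")),  # the output whose derivative we want
    ("t3", "mul", ("c", "c")),   # constant-only: inactive, can be skipped
]
active = activity_analysis(ir, active_inputs={"x"}, active_outputs={"t2"})
```

Here `t1` and `t2` are active, while `t3` is inactive even though it executes, which is exactly the kind of operation an AD tool can avoid differentiating.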
Syllabus
[OOPSLA'25] Sound and Modular Activity Analysis for Automatic Differentiation in MLIR
Taught by
ACM SIGPLAN