Overview
Explore a 14-minute conference presentation from OOPSLA 2025 that introduces a formal approach to activity analysis for automatic differentiation within the MLIR compiler infrastructure. Learn how researchers from McGill University, University of Illinois at Urbana-Champaign, and Brium have developed a sound and modular method to identify inactive operations that don't contribute to partial derivatives of interest, addressing a critical optimization challenge in automatic differentiation tools.

Discover how the team formally defines activity analysis as an abstract interpretation, proves its soundness, and implements it to achieve significant performance improvements: a 1.24× geometric mean speedup on CPU and 1.7× on GPU compared to baseline implementations without activity analysis. Understand the novel intraprocedural approximation approach using function summaries, which maintains 100% accuracy compared to whole-program analysis while providing better modularity.

Gain insights into the formalization of MLIR's internal representation for automatic differentiation purposes, and explore the practical implications for domains ranging from neural network training to climate simulations, where computing derivatives is essential for performance-critical applications.
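To make the core idea concrete, here is a minimal sketch of activity analysis as a pair of dataflow passes over a toy SSA-style IR. This is an illustrative simplification, not MLIR's actual API or the paper's implementation: the `Op` record, the set-based passes, and the value names are all hypothetical. The intuition matches the talk's framing: an operation is active only if it both depends on an active input ("varied") and flows into an active output ("useful"); everything else is inactive and need not be differentiated.

```python
# Hypothetical sketch of activity analysis as forward/backward dataflow
# over a toy SSA-like IR (names and data structures are illustrative only).
from dataclasses import dataclass

@dataclass(frozen=True)
class Op:
    name: str
    results: tuple   # SSA values this op defines
    operands: tuple  # SSA values this op reads

def active_ops(ops, active_inputs, active_outputs):
    # Forward pass: a value is "varied" if it depends on an active input.
    varied = set(active_inputs)
    for op in ops:  # assumes ops are in topological order
        if any(v in varied for v in op.operands):
            varied.update(op.results)
    # Backward pass: a value is "useful" if an active output depends on it.
    useful = set(active_outputs)
    for op in reversed(ops):
        if any(v in useful for v in op.results):
            useful.update(op.operands)
    # An op is active iff it touches a value that is both varied and useful;
    # the rest are inactive and can be skipped during differentiation.
    active_vals = varied & useful
    return [op for op in ops
            if any(v in active_vals for v in op.operands + op.results)]

# Example: differentiate y with respect to x; c is a constant.
ops = [
    Op("mul",   ("t",), ("x", "x")),  # t = x * x   (varied and useful)
    Op("add",   ("y",), ("t", "c")),  # y = t + c   (varied and useful)
    Op("scale", ("p",), ("c",)),      # uses only c (inactive)
]
active_names = [op.name for op in active_ops(ops, {"x"}, {"y"})]
# → ["mul", "add"]; "scale" is identified as inactive
```

A whole-program version of these passes would traverse across function boundaries; the approach described in the talk instead approximates call effects with per-function summaries, keeping the analysis modular while, per their evaluation, losing no accuracy.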
Syllabus
[OOPSLA'25] Sound and Modular Activity Analysis for Automatic Differentiation in MLIR
Taught by
ACM SIGPLAN