Overview
Explore advanced performance optimization techniques for automatic differentiation (AD) in C++ in this conference talk from CppCon 2025. Discover how reverse-mode automatic differentiation powers critical applications, from training trillion-parameter large language models to the Bayesian inference engine of the Stan programming language. Learn about the performance strategies modern C++ AD libraries use to achieve significant speedups, including arena allocators, expression templates, SIMD-friendly data structures, and template metaprogramming. Examine real-world engineering solutions that show how milliseconds saved per gradient computation translate into hours of wall-time savings. See how contemporary C++ AD techniques have evolved to make automatic differentiation so efficient that hand-written derivatives are rarely necessary. The presentation charts performance improvements across different C++ AD libraries over time; it assumes familiarity with modern C++ but requires no prior knowledge of automatic differentiation.
Syllabus
Optimize Automatic Differentiation Performance in C++ - Steve Bronder - CppCon 2025
Taught by
CppCon