Overview
Explore a comprehensive evaluation of mixed-precision tuning tools for floating-point programs in this 15-minute conference presentation from OOPSLA 2025. Discover how researchers Anastasia Isychev and Debasmita Lohar challenge conventional wisdom about the trade-offs between sound static analysis and dynamic sampling approaches in numerical optimization. Learn about their quantitative comparison measuring performance gains, optimization potential, and soundness guarantees, which they term "the cost of soundness." Examine their findings from experiments on the FPBench benchmark suite, which show that sound tools enhanced with regime inference can match or outperform dynamic optimizers on small straight-line numerical programs while providing formal correctness guarantees. Understand the limitations of standalone sound tools when accuracy constraints are tight, and see that dynamic tools, despite their effectiveness across different error targets, can exceed the maximum allowed error by up to nine orders of magnitude. Gain insights into floating-point optimization strategies, mixed-precision tuning methodologies, and the practical trade-off between conservative sound methods and aggressive dynamic approaches when optimizing numerical code for resource-limited hardware.
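To make the core idea concrete, here is a minimal sketch (not from the talk; the kernel and all names are illustrative) of what a mixed-precision tuner searches over: demoting some operations of a small straight-line kernel from double to single precision, then estimating the resulting error by sampling. This sampling check is the dynamic-style approach; a sound tool would instead bound the worst-case error statically for all inputs.

```python
# Minimal sketch (assumed example, not the presenters' code): mixed-precision
# tuning lowers selected operations to a cheaper precision and checks how much
# accuracy is lost relative to the full-precision program.
import numpy as np

def kernel_f64(x: np.float64) -> np.float64:
    """Reference: every operation in double precision."""
    t = np.float64(x) * np.float64(x)
    return t / (np.float64(1.0) + t)

def kernel_mixed(x: np.float64) -> np.float64:
    """Candidate: the multiply is demoted to single precision."""
    t = np.float32(x) * np.float32(x)  # lowered to float32
    return np.float64(t) / (np.float64(1.0) + np.float64(t))

# Dynamic-style check: sample inputs and take the worst observed error.
# Unlike a sound static bound, this can miss the true maximum error,
# which is why dynamic tools may exceed the allowed error in practice.
xs = np.linspace(-10.0, 10.0, 10_001, dtype=np.float64)
worst = max(abs(kernel_f64(x) - kernel_mixed(x)) for x in xs)
print(f"worst sampled absolute error: {worst:.3e}")
```

A tuner repeats this loop over many precision assignments, keeping the cheapest one whose error stays under the user's target; the talk quantifies how much performance is given up when that error check must be a formal guarantee rather than a sampled estimate.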
Syllabus
[OOPSLA'25] Cost of Soundness in Mixed-Precision Tuning
Taught by
ACM SIGPLAN