Toward a Grand Unified Theory of Accelerations in Optimization and Machine Learning
Erwin Schrödinger International Institute for Mathematics and Physics (ESI) via YouTube
Overview
Explore the cutting-edge developments in optimization and machine learning acceleration techniques in this 28-minute conference talk by Ernest Ryu at the Erwin Schrödinger International Institute for Mathematics and Physics. Delve into the foundational concepts of momentum-based acceleration in first-order optimization methods, originally introduced by Nesterov, and their significant impact on large-scale optimization and machine learning. Examine the ongoing challenge of finding a fundamental understanding of acceleration and discover the recent emergence of new acceleration mechanisms distinct from Nesterov's approach. Analyze the similarities and differences among these novel acceleration phenomena, which offer promising avenues for addressing long-standing open problems in the field. Gain insights into the speaker's vision of developing a unified mathematical theory encompassing various acceleration mechanisms and understand the challenges that must be overcome to achieve this goal. This talk, part of the "One World Optimization Seminar in Vienna" workshop, provides a comprehensive overview of the current state and future directions in optimization acceleration research.
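To make the momentum-based acceleration mentioned above concrete, here is a minimal sketch (not from the talk itself) comparing plain gradient descent with Nesterov's accelerated gradient method on an ill-conditioned convex quadratic. The quadratic, step sizes, and iteration counts are illustrative choices, not anything the speaker specifies.

```python
import numpy as np

# Illustrative example (assumption, not from the talk): Nesterov's
# accelerated gradient method vs. plain gradient descent on the
# convex quadratic f(x) = 0.5 * x^T A x with an ill-conditioned A.

def grad_descent(grad, x0, L, iters):
    x = x0.copy()
    for _ in range(iters):
        x = x - grad(x) / L              # step size 1/L (L = Lipschitz constant)
    return x

def nesterov(grad, x0, L, iters):
    x, y = x0.copy(), x0.copy()
    for k in range(iters):
        x_next = y - grad(y) / L         # gradient step at the extrapolated point
        y = x_next + (k / (k + 3)) * (x_next - x)  # momentum extrapolation
        x = x_next
    return x

A = np.diag([1.0, 100.0])                # condition number 100, so L = 100
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x0 = np.array([1.0, 1.0])

gd = f(grad_descent(grad, x0, L=100.0, iters=200))
agd = f(nesterov(grad, x0, L=100.0, iters=200))
print(gd, agd)
```

The momentum term is what distinguishes Nesterov's method from plain gradient descent: it improves the worst-case rate on smooth convex problems from O(1/k) to O(1/k²), which is the acceleration phenomenon whose deeper explanation the talk is concerned with.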
Syllabus
Ernest Ryu - Toward a grand unified theory of accelerations in optimization and machine learning
Taught by
Erwin Schrödinger International Institute for Mathematics and Physics (ESI)