Exponentially Faster Gradient Method in the Presence of Ravines
International Centre for Theoretical Sciences via YouTube
Overview
Explore an advanced optimization technique in this conference talk, which presents a gradient method achieving exponential acceleration in the presence of ravine-like structures in optimization landscapes. Learn how this approach by Damek Davis addresses optimization in the presence of ravines: narrow, steep-sided valleys in the objective function that can significantly slow down traditional gradient-based methods. Discover the theoretical foundations of this exponentially faster gradient method, examine the mathematical principles that enable the improvement in convergence rates, and gain insight into the practical implications for machine learning and data science. This presentation is part of the Data Science: Probabilistic and Optimization Methods II program at the International Centre for Theoretical Sciences, which focuses on theoretical developments in optimization underpinning modern machine learning systems.
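To make the ravine phenomenon concrete, here is a minimal sketch (not Davis's method from the talk; the quadratic objective and step size are illustrative assumptions) showing why plain gradient descent stalls in a narrow valley: the stable step size is dictated by the steep direction, so progress along the shallow valley floor is slowed by the conditioning ratio.

```python
# Illustration only: plain gradient descent on an ill-conditioned
# quadratic, a simple model of a "ravine".
# f(x, y) = 0.5 * (x**2 + kappa * y**2), with kappa >> 1.
# The x-direction is the shallow valley floor; y is the steep wall.

def grad(x, y, kappa):
    # Gradient of f at (x, y).
    return x, kappa * y

def gradient_descent(x, y, kappa, steps):
    # Stability requires eta < 2 / kappa (the steep curvature),
    # so movement along the shallow x-direction is ~kappa times slower.
    eta = 1.0 / kappa
    for _ in range(steps):
        gx, gy = grad(x, y, kappa)
        x -= eta * gx
        y -= eta * gy
    return x, y

x, y = gradient_descent(1.0, 1.0, kappa=100.0, steps=200)
# y (steep direction) is eliminated almost immediately, while
# x decays only like (1 - 1/kappa)**steps and remains far from 0.
print(x, y)
```

With kappa = 100, after 200 steps the steep coordinate is essentially zero while the shallow one has only decayed to about exp(-2) of its starting value, which is the slowdown that ravine-aware methods aim to overcome.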
Syllabus
Exponentially Faster Gradient Method in the Presence of Ravines by Damek Davis
Taught by
International Centre for Theoretical Sciences