Slow Convergence of Stochastic Optimization Algorithms Without Derivatives Is Avoidable
Institut des Hautes Etudes Scientifiques (IHES) via YouTube
Overview
This lecture explores how the traditionally slow convergence rates of derivative-free optimization methods can be overcome. Discover how adaptive stochastic algorithms from the Evolution Strategy family can achieve asymptotic geometric convergence of the mean squared error, even on non-convex and non-quasi-convex functions. The speaker explains the key differences between these approaches and finite-difference stochastic approximation (FDSA) algorithms such as the Kiefer-Wolfowitz method, and shows how Markov chain stability analysis yields linear convergence guarantees. Learn about connections to the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), a highly effective stochastic algorithm for difficult derivative-free optimization problems. The 51-minute presentation by Anne Auger from Télécom Paris was hosted by the Institut des Hautes Etudes Scientifiques (IHES).
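To make the adaptive mechanism concrete, here is a minimal Python sketch of a (1+1) Evolution Strategy with the classic 1/5th success rule, one of the simplest members of this algorithm family. It is an illustrative toy, not the specific algorithm analyzed in the talk: the function name one_plus_one_es, the update constants, and the sphere test function are assumptions chosen for the demonstration. Because the step size sigma adapts to the current distance from the optimum, the error shrinks geometrically, unlike a fixed-gain finite-difference scheme.

import numpy as np

def one_plus_one_es(f, x0, sigma0=1.0, iterations=5000, seed=0):
    """(1+1)-ES with the 1/5th success rule (illustrative sketch).

    The step size sigma is increased after a successful mutation and
    decreased otherwise, so it tracks the distance to the optimum and
    the error decreases geometrically on functions like the sphere.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    sigma, fx = sigma0, f(x)
    for _ in range(iterations):
        y = x + sigma * rng.standard_normal(n)   # Gaussian mutation
        fy = f(y)
        if fy <= fx:                  # success: accept and enlarge sigma
            x, fx = y, fy
            sigma *= np.exp(0.8 / n)
        else:                         # failure: shrink sigma
            sigma *= np.exp(-0.2 / n)
    return x, fx, sigma

# Toy usage: sphere function in 10 dimensions
sphere = lambda x: float(np.dot(x, x))
x_best, f_best, sigma_final = one_plus_one_es(sphere, np.ones(10))
print(f_best)   # many orders of magnitude below the starting value f(x0) = 10

Roughly speaking, the analysis described in the abstract studies the Markov chain of the normalized state x_t / sigma_t; establishing its stability is what turns this adaptive behavior into a proof of linear (geometric) convergence.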
Syllabus
Anne Auger - Slow Convergence of Stochastic Optimization Algorithms Without Derivatives Is Avoidable
Taught by
Institut des Hautes Etudes Scientifiques (IHES)