Slow Convergence of Stochastic Optimization Algorithms Without Derivatives Is Avoidable

Institut des Hautes Etudes Scientifiques (IHES) via YouTube

Overview

This lecture explores how the traditionally slow convergence rates of derivative-free optimization methods can be overcome. Discover how adaptive stochastic algorithms from the Evolution Strategy family can achieve asymptotic geometric convergence of the mean square error, even on non-convex and non-quasi-convex functions. The speaker explains the key differences between these approaches and finite-difference stochastic approximation (FDSA) algorithms such as the Kiefer-Wolfowitz method, demonstrating how Markov chain stability analysis enables linear convergence guarantees. Learn about connections to the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), a highly effective stochastic algorithm for complex derivative-free optimization problems. The 51-minute presentation by Anne Auger from Télécom Paris was hosted by the Institut des Hautes Etudes Scientifiques (IHES).
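To make the idea of adaptive stochastic step sizes concrete, here is a minimal Python sketch of a (1+1)-Evolution Strategy with the classic 1/5th success rule, a much simpler relative of CMA-ES. This is an illustrative sketch, not the speaker's algorithm: the function name, parameters, and constants are assumptions chosen for clarity. On a well-conditioned function such as the sphere, this scheme exhibits the linear (geometric) convergence the lecture discusses.

```python
import numpy as np

def one_plus_one_es(f, x0, sigma0=1.0, max_evals=2000, seed=0):
    """(1+1)-Evolution Strategy with 1/5th-success-rule step-size adaptation.

    The step size sigma grows after successful mutations and shrinks after
    failures; this adaptation is what enables geometric convergence.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    sigma = sigma0
    fx = f(x)
    # Factors chosen so sigma is stationary at a ~1/5 success rate:
    # on success multiply by e^0.25, on failure by e^(-0.25/4).
    up, down = np.exp(0.25), np.exp(-0.0625)
    for _ in range(max_evals):
        y = x + sigma * rng.standard_normal(x.size)  # Gaussian mutation
        fy = f(y)
        if fy <= fx:          # success: keep the offspring, enlarge the step
            x, fx, sigma = y, fy, sigma * up
        else:                 # failure: shrink the step
            sigma *= down
        if sigma < 1e-12:     # step size has collapsed; stop
            break
    return x, fx

if __name__ == "__main__":
    # Sphere function f(x) = ||x||^2: the error typically decreases by a
    # roughly constant factor per evaluation, i.e. geometric convergence.
    sphere = lambda x: float(np.dot(x, x))
    x_best, f_best = one_plus_one_es(sphere, np.ones(10))
    print(f"best f-value: {f_best:.3e}")
```

CMA-ES extends this scheme by adapting a full covariance matrix of the mutation distribution rather than a single scalar step size, and the Markov chain stability analysis mentioned above is the tool that turns this empirical behavior into a provable linear convergence guarantee.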

Syllabus

Anne Auger - Slow Convergence of Stochastic Optimization Algorithms Without Derivatives Is Avoidable

Taught by

Institut des Hautes Etudes Scientifiques (IHES)
