Overview
Attend this seminar presentation exploring advanced stochastic optimization techniques through the lens of control variates. Discover how least-squares control variates can enhance stochastic gradient methods, presented by Professor Fabio Nobile of EPFL (École Polytechnique Fédérale de Lausanne). Learn about the mathematical foundations and practical applications of this approach to reducing the variance of stochastic gradient estimates, which is crucial for improving convergence rates in machine learning and optimization algorithms.

The presentation is part of the "Representing, calibrating & leveraging prediction uncertainty from statistics to machine learning" research programme at the Isaac Newton Institute for Mathematical Sciences, which focuses on the intersection of statistical methods and modern machine learning techniques. Gain insights into how control variates can be systematically constructed using least-squares principles to build more efficient stochastic optimization algorithms, with potential applications in deep learning, Bayesian inference, and computational statistics.
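To illustrate the general idea behind the seminar's topic, the sketch below shows a least-squares control variate on a toy Monte Carlo problem. This is not material from the talk: the target function `f`, the control `g`, and the Gaussian sampling distribution are all assumptions chosen for illustration. The control coefficient is the least-squares (regression) solution `beta* = Cov(f, g) / Var(g)`, which minimizes the variance of the corrected estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (assumed for illustration): estimate E[f(X)] for X ~ N(0, 1),
# with f(x) = exp(x). The control variate g(x) = x has known mean E[g] = 0.
def f(x):
    return np.exp(x)

def g(x):
    return x

g_mean = 0.0  # known expectation of the control variate

n = 10_000
x = rng.standard_normal(n)
fx, gx = f(x), g(x)

# Least-squares fit of the control coefficient:
# beta* = Cov(f, g) / Var(g), i.e. the slope of regressing f on g.
beta = np.cov(fx, gx)[0, 1] / np.var(gx)

# Plain Monte Carlo estimate vs. control-variate-corrected estimate.
# Both are (asymptotically) unbiased for E[f(X)] = exp(1/2).
plain_est = fx.mean()
cv_samples = fx - beta * (gx - g_mean)
cv_est = cv_samples.mean()

# Empirical variances of the two estimators of the mean.
var_plain = fx.var(ddof=1) / n
var_cv = cv_samples.var(ddof=1) / n
print(var_cv < var_plain)  # the corrected estimator has lower variance
```

In a stochastic gradient setting, the same least-squares construction is applied per gradient component: the noisy gradient plays the role of `f`, and a cheap surrogate with a computable mean plays the role of `g`.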
Syllabus
Date: 3rd Jul 2025 - 10:30 to 11:30
Taught by: Professor Fabio Nobile (EPFL)
Venue: INI Seminar Room 2