Overview
Attend this mathematical seminar where Professor Fabio Nobile from EPFL presents advanced techniques for improving stochastic gradient methods through least-squares control variates. Explore how control variates can be systematically incorporated into stochastic gradient algorithms to reduce variance and improve convergence rates in optimization problems. Learn about the theoretical foundations of least-squares control variates and their practical implementation in computational settings. Discover applications of these methods in machine learning and statistical computing, particularly in contexts where prediction uncertainty plays a crucial role. Examine the mathematical framework underlying variance reduction techniques and understand how they can enhance the efficiency of gradient-based optimization algorithms. Gain insights into the intersection of stochastic optimization, statistical methods, and uncertainty quantification from a leading expert in computational mathematics and numerical analysis.
Syllabus
Date: 3 July 2025, 10:30 to 11:30
Taught by: Professor Fabio Nobile (EPFL)
Venue: INI Seminar Room 2