
Gradient Optimization Methods - Implicit Bias and Benefits of Early Stopping

Simons Foundation via YouTube

Overview

Explore gradient optimization methods and their implicit bias properties in this 57-minute conference talk from the 2025 Mathematical and Scientific Foundations of Deep Learning Annual Meeting hosted by the Simons Foundation. Delve into the mathematical foundations of gradient-based optimization techniques commonly used in deep learning, examining how these methods exhibit implicit bias toward certain solutions. Learn about the theoretical benefits of early stopping in gradient optimization and understand how this regularization technique affects model performance and generalization. Gain insights into the intersection of optimization theory and deep learning practice, with particular focus on the mathematical principles that govern how gradient methods naturally bias learning toward specific types of solutions.
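As a companion to the overview, here is a minimal illustrative sketch (not taken from the talk) of early stopping as a regularizer: gradient descent on a one-dimensional least-squares fit is halted once the held-out validation loss stops improving, rather than run to full convergence. All function names, data, and hyperparameters below are hypothetical choices for illustration.

```python
# Illustrative sketch: early stopping for gradient descent on a
# 1-D least-squares fit (y ~ w * x). Hypothetical example, not code
# from the talk; data and hyperparameters are arbitrary choices.

def early_stopped_gd(train, val, lr=0.01, max_steps=1000, patience=5):
    """Fit w by gradient descent on the training set; stop once the
    validation loss fails to improve for `patience` consecutive steps."""
    def loss(data, w):
        # mean squared error of the fit y ~ w * x
        return sum((w * x - y) ** 2 for x, y in data) / len(data)

    w = 0.0
    best_w, best_val, stale = w, loss(val, w), 0
    for _ in range(max_steps):
        # gradient of the training MSE with respect to w
        grad = sum(2 * (w * x - y) * x for x, y in train) / len(train)
        w -= lr * grad
        v = loss(val, w)
        if v < best_val:
            best_w, best_val, stale = w, v, 0
        else:
            stale += 1
            if stale >= patience:
                break  # validation loss stalled: halt early
    return best_w

# Noisy data generated roughly from y = 2x; early stopping returns a
# weight near 2 without iterating to full convergence on the train set.
train = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 8.0)]
val = [(1.5, 3.0), (2.5, 5.1)]
w = early_stopped_gd(train, val)
```

The key point the sketch illustrates: stopping when held-out performance plateaus acts as a form of regularization, trading a small amount of training-set fit for better generalization.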

Syllabus

Peter Bartlett — Gradient Optimization Methods... (Sept. 26, 2025)

Taught by

Simons Foundation

