
SGD Exact Dynamics in High-Dimension - Insights for Algorithm Design

HUJI Machine Learning Club via YouTube

Overview

Explore the theoretical foundations of stochastic gradient descent (SGD) in high-dimensional settings through this machine learning lecture, which shows how SGD dynamics converge to low-dimensional ordinary differential equations. Discover a unified framework for analyzing SGD behavior across generalized linear models and multi-index problems trained on Gaussian data with general covariance, encompassing important models such as logistic regression, phase retrieval, and two-layer neural networks. Learn how this theoretical approach sheds light on the surprising practical effectiveness of the stochastic optimization methods central to modern machine learning.

Examine two key applications of the framework: first, understand how data anisotropy influences the behavior and performance of stochastic adaptive methods, including line search and AdaGrad-Norm; second, analyze differentially private SGD with gradient clipping and see how the framework yields improved risk-estimation error rates in challenging aggressive-clipping scenarios.

Gain insights from research that bridges machine learning, statistical physics, and high-dimensional probability to better understand algorithm-design principles for stochastic optimization in contemporary machine learning applications.
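To make the setting concrete, here is a minimal sketch (not taken from the lecture) of one-pass SGD on a generalized linear model — logistic regression on isotropic Gaussian data — with an AdaGrad-Norm stepsize and optional DP-SGD-style gradient norm clipping. All dimensions, stepsize constants, and the clipping threshold are illustrative assumptions, not values from the talk.

```python
# Illustrative sketch only: single-pass SGD on a generalized linear model
# (logistic regression with Gaussian inputs), using an AdaGrad-Norm
# stepsize and optional gradient norm clipping as in DP-SGD.
# All parameter values below are assumptions chosen for illustration.
import numpy as np

rng = np.random.default_rng(0)
d = 200                                     # ambient dimension
n = 5000                                    # number of streaming samples
w_star = rng.normal(size=d) / np.sqrt(d)    # ground-truth parameter

def sgd_adagrad_norm(clip=None, eta=1.0, b0=1.0):
    """One-pass SGD with stepsize eta / sqrt(b0^2 + sum ||g_t||^2)."""
    w = np.zeros(d)
    b2 = b0 ** 2
    for _ in range(n):
        x = rng.normal(size=d)                      # isotropic Gaussian data
        p = 1.0 / (1.0 + np.exp(-x @ w_star))
        y = rng.binomial(1, p)                      # logistic (GLM) label
        g = (1.0 / (1.0 + np.exp(-x @ w)) - y) * x  # per-sample gradient
        if clip is not None:                        # DP-SGD-style clipping
            norm = np.linalg.norm(g)
            if norm > clip:
                g = g * (clip / norm)
        b2 += g @ g                                 # AdaGrad-Norm accumulator
        w -= (eta / np.sqrt(b2)) * g
    return w

for c in (None, 0.5):
    w = sgd_adagrad_norm(clip=c)
    align = (w @ w_star) / (np.linalg.norm(w) * np.linalg.norm(w_star))
    print(f"clip={c}: alignment with w* = {align:.3f}")
```

In the high-dimensional analysis described above, summary statistics of such an iteration (for example, the alignment of `w` with `w_star`) concentrate around deterministic ODE trajectories as the dimension grows; this toy version only illustrates the algorithmic ingredients, not that limit.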

Syllabus

Thursday, December 18th, 2025, AM, room C221

Taught by

HUJI Machine Learning Club

