Overview
Explore the intriguing phenomenon of overparameterized models in machine learning through this 1-hour 18-minute lecture by Fanny Yang from ETH Zurich. Delve into the paradox of large neural networks that achieve near-zero error on noisy datasets while still generalizing well to unseen data, challenging traditional notions of overfitting. Examine recent statistical literature that offers theoretical insights into this phenomenon, focusing on linear models. Gain a new perspective on overfitting and generalization that aligns with empirical observations in modern machine learning practices. Part of the Modern Paradigms in Generalization Boot Camp at the Simons Institute, this talk provides crucial understanding for navigating the complexities of training and deploying large-scale machine learning models.
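The phenomenon described above can be illustrated with a small numeric sketch. This is an assumption-laden toy, not the lecture's own construction: it fits minimum-norm least squares with a growing number of features `d` on `n` fixed noisy training points. Once `d` passes the interpolation threshold `d = n`, the model fits the noise exactly (training error near zero), and yet the test error typically comes back down for large `d` — the "double descent" shape behind the benign-overfitting discussion.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical parameters): n training points, labels generated
# from a sparse linear signal in the first 5 of D ambient features plus noise.
n, D = 40, 2000
beta = np.zeros(D)
beta[:5] = 1.0          # true signal lives in 5 coordinates
sigma = 0.5             # label noise level

X = rng.normal(size=(n, D))
y = X @ beta + sigma * rng.normal(size=n)

X_test = rng.normal(size=(1000, D))
y_test = X_test @ beta  # clean targets, so test MSE measures estimation error


def min_norm_fit(d):
    """Minimum-l2-norm least-squares fit using only the first d features."""
    b = np.linalg.pinv(X[:, :d]) @ y   # pseudoinverse gives the min-norm solution
    train = np.mean((X[:, :d] @ b - y) ** 2)
    test = np.mean((X_test[:, :d] @ b - y_test) ** 2)
    return train, test


for d in (10, 39, 41, 200, 2000):
    train, test = min_norm_fit(d)
    print(f"d={d:5d}  train MSE={train:9.2e}  test MSE={test:9.2f}")
```

Running the sweep typically shows the classic picture: underparameterized fits (`d < n`) leave residual training error, the test error spikes near `d = n` where the design matrix is ill-conditioned, and heavily overparameterized interpolators (`d >> n`) drive training error to zero without the test error blowing up.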
Syllabus
Reconsidering Overfitting in the Age of Overparameterized Models
Taught by
Simons Institute