Overview
Explore the foundations of statistical learning theory in this comprehensive lecture from the Modern Paradigms in Generalization Boot Camp. Delve into classical 20th-century concepts, focusing on generalization through capacity control. Examine the Vapnik–Chervonenkis Fundamental Theorem of Learning, scale-sensitive capacity control and margin bounds, and Minimum Description Length principles. Investigate parallels with stochastic optimization and explore generalization from an optimization perspective, including online-to-batch conversion, stochastic approximation, and boosting. Analyze how classic theory relates to current interests such as interpolation learning, benign overfitting, and implicit bias. Gain valuable insights from Nati Srebro of the Toyota Technological Institute at Chicago in this 1-hour 17-minute presentation hosted by the Simons Institute.
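As a taste of the "interpolation learning" theme mentioned above, the sketch below fits a minimum-norm least-squares model in an overparameterized setting (more features than samples) and shows it drives the training error to essentially zero even on noisy labels. All data sizes and values here are invented for illustration and are not drawn from the lecture itself.

```python
import numpy as np

# Overparameterized regression: fewer samples (n) than features (d),
# so infinitely many weight vectors fit the training data exactly.
rng = np.random.default_rng(0)
n, d = 20, 100
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:5] = 1.0                                  # sparse "signal" directions
y = X @ w_true + 0.1 * rng.standard_normal(n)     # noisy labels

# np.linalg.lstsq returns the minimum-norm solution for an
# underdetermined system -- the interpolator with smallest L2 norm.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

train_err = np.mean((X @ w_hat - y) ** 2)
print(f"training error: {train_err:.2e}")   # essentially zero despite noise
```

The model interpolates the noisy labels exactly, yet (as the lecture's discussion of benign overfitting explores) such minimum-norm interpolators can still generalize well, in tension with classical capacity-control intuition.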
Syllabus
Overview of Statistical Learning Theory Part 1
Taught by
Simons Institute