Overview
Explore the concept of landscape connectivity in multilayer neural networks through this 45-minute lecture by Rong Ge from Duke University. Delve into the background of non-convex optimization, equivalent local minima, and symmetry. Examine the role of overparameterization in deep neural networks and investigate which types of local minima are connected. Learn about dropout stability, noise stability, and the process of direct interpolation. Discover how to connect a network with its dropout and analyze example paths for 3-layer networks. Review experimental results, draw conclusions, and consider open problems in the field of mode connectivity for deep learning.
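The idea of direct interpolation can be made concrete with a toy example (an illustrative sketch only; this loss function and these parameter values are assumptions for demonstration, not from the lecture). Two symmetry-equivalent minima are connected by a straight line in parameter space, and evaluating the loss along that line reveals whether a barrier separates them:

```python
import numpy as np

def loss(theta):
    # Toy non-convex loss with a ring of global minima: (|theta|^2 - 1)^2.
    # Chosen purely for illustration; any two-minimum landscape works.
    return (np.dot(theta, theta) - 1.0) ** 2

theta_a = np.array([1.0, 0.0])   # one global minimum
theta_b = np.array([-1.0, 0.0])  # an equivalent minimum (by symmetry)

# Direct interpolation: theta(t) = (1 - t) * theta_a + t * theta_b
ts = np.linspace(0.0, 1.0, 11)
path_losses = [loss((1 - t) * theta_a + t * theta_b) for t in ts]

# Both endpoints sit at loss 0, but the midpoint (0, 0) has loss 1,
# so the straight-line path crosses a barrier.
barrier = max(path_losses)
```

Here the direct path fails, which motivates the lecture's use of structured paths (e.g. connecting a network with its dropout version) rather than naive linear interpolation.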
Syllabus
Intro
Mode Connectivity [Freeman and Bruna 16, Garipov et al. 18, Draxler et al. 18]
Outline
Background: non-convex optimization
Equivalent local minima and symmetry
(Partial) short answer: overparametrization
Deep Neural Networks
What kinds of local minima are connected?
Dropout stability
High level steps
Direct Interpolation
Connecting a network with its dropout
An example path for a 3-layer network
Example path explained (3) - (4)
Noise stability
Experiments
Conclusions
Open Problems
How can we use mode connectivity?
Taught by
Simons Institute