Overview
Explore the mathematical foundations of deep learning through a geometric lens in this hour-long lecture that examines function approximation by neural networks trained via stochastic gradient descent. Delve into sharp analytical results for the deep linear network (DLN), a phenomenological model that reveals unexpected connections to minimal surfaces, geometric invariant theory, and random matrix theory. Discover how these mathematical frameworks provide new conceptual insights into the nature of "true" deep learning, bridging theoretical mathematics with practical machine learning applications. Learn about cutting-edge research that unifies diverse mathematical disciplines to better understand the training dynamics and geometric properties underlying deep neural network architectures.
Syllabus
2:30pm | Simonyi Hall 101 and Remote Access
Taught by
Institute for Advanced Study