
Are Global and Local Minima of Shallow Neural Networks Fundamentally Different?

Paul G. Allen School via YouTube

Overview

Explore the differences between global and local minima in shallow neural networks in this 39-minute workshop presentation by Rahul Parhi of UC San Diego. Delve into the theoretical foundations of neural network optimization landscapes and examine whether global and local minima exhibit fundamentally distinct characteristics in shallow architectures. Investigate the mathematical properties that distinguish these critical points and their implications for training algorithms and convergence behavior. Analyze the geometric structure of loss surfaces in shallow networks and understand how the nature of these minima affects the optimization process. Gain insight into current research findings that challenge or support conventional understanding of neural network optimization theory, with particular focus on the practical consequences for machine learning practitioners working with shallow architectures.

Syllabus

IFDS Workshop–Are Global and Local Minima of Shallow Neural Networks Fundamentally Different?

Taught by

Paul G. Allen School
