
Scaling Limits of Neural Networks

Harvard CMSA via YouTube

Overview

Watch a 57-minute conference talk from the 2024 Big Data Conference featuring Princeton University's Boris Hanin on the analytical study of neural networks through scaling limits. Dive into how taking structural network parameters to infinity leads to simplified models of learning. Learn about theoretical insights connecting model architecture, training data, and the optimizer's impact on the learning process. Understand the practical implications for hyperparameter transfer while exploring different approaches to studying neural network behavior at scale. Gain valuable knowledge about the mathematical foundations underlying modern deep learning systems and how their behavior changes as different parameters approach infinity.
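To make the idea of a scaling limit concrete, here is a minimal NumPy sketch (not from the talk itself) of the classic width limit: a random one-hidden-layer network whose output is scaled by 1/sqrt(width) has an output variance that stays stable as the width grows, which is the kind of simplification that appears when a structural parameter is taken to infinity. The network, input, and activation choices below are illustrative assumptions.

```python
import numpy as np

def net_output(width, x, rng):
    # Illustrative one-hidden-layer network with standard Gaussian weights.
    # The 1/sqrt(width) factor is the scaling that keeps the output O(1)
    # as width grows (the regime where an infinite-width limit exists).
    W1 = rng.standard_normal((width, x.shape[0]))  # hidden weights
    w2 = rng.standard_normal(width)                # output weights
    return (w2 @ np.tanh(W1 @ x)) / np.sqrt(width)

rng = np.random.default_rng(0)
x = np.array([1.0, 0.0, 0.0])  # fixed unit-norm input

# The empirical output variance over random weight draws barely changes
# with width, illustrating a well-defined large-width limit.
for width in (16, 256, 4096):
    vals = np.array([net_output(width, x, rng) for _ in range(2000)])
    print(width, round(float(np.var(vals)), 3))
```

Without the 1/sqrt(width) factor the output variance would instead grow linearly in the width, so no limit would exist; the choice of scaling is what produces the simplified limiting model.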

Syllabus

Boris Hanin | Scaling Limits of Neural Networks

Taught by

Harvard CMSA

