Overview
Watch a 57-minute talk from the Big Data Conference 2024 in which Princeton University's Boris Hanin surveys the analytical study of neural networks through scaling limits. The talk examines how taking structural network parameters (such as width and depth) to infinity yields simplified models of learning, and presents theoretical results connecting model architecture, training data, and the choice of optimizer to the learning process. It also discusses practical implications for hyperparameter transfer, compares approaches to studying neural network behavior at scale, and grounds the behavior of modern deep learning systems in the mathematics of these infinite limits.
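One well-known instance of the simplification the talk describes: with the standard 1/sqrt(width) weight scaling, the output of a randomly initialized MLP at a fixed input converges in distribution to a Gaussian as width grows, so its statistics stabilize. The sketch below (not from the talk; the function name, depth, and tanh nonlinearity are illustrative choices) checks this empirically by sampling outputs of random networks at several widths.

```python
import numpy as np

def wide_mlp_output(x, width, depth=3, rng=None):
    """Output of a randomly initialized MLP with 1/sqrt(fan-in) weight scaling.

    As `width` grows, the output at a fixed input x converges in
    distribution to a mean-zero Gaussian, so its mean and std stabilize.
    """
    rng = np.random.default_rng() if rng is None else rng
    h = x
    for _ in range(depth):
        # Hidden layer: i.i.d. N(0, 1) weights, scaled by 1/sqrt(fan-in).
        W = rng.standard_normal((width, h.shape[0]))
        h = np.tanh(W @ h / np.sqrt(h.shape[0]))
    # Scalar readout layer with the same scaling.
    w_out = rng.standard_normal(width)
    return w_out @ h / np.sqrt(width)

rng = np.random.default_rng(0)
x = np.ones(16)
for width in (32, 256, 2048):
    samples = np.array([wide_mlp_output(x, width, rng=rng) for _ in range(2000)])
    print(f"width={width:5d}  mean={samples.mean():+.3f}  std={samples.std():.3f}")
```

Running this, the sample mean stays near zero and the standard deviation settles toward a fixed limiting value as width increases, which is the kind of width-to-infinity simplification the talk studies analytically.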
Syllabus
Boris Hanin | Scaling Limits of Neural Networks
Taught by
Harvard CMSA