Scaling Laws in Neural Architectures - Lecture 20

MIT OpenCourseWare via YouTube

Overview

Explore the fundamental principles of scaling laws in neural architectures through this 38-minute lecture from MIT's Deep Learning course. Examine power laws and their mathematical foundations, understand the limitations and theoretical underpinnings of scaling relationships, and delve into the critical concept of batch size optimization. Learn how these scaling principles apply to modern deep learning systems and discover the theoretical frameworks that govern the relationship between model size, data, and computational resources in neural network training and performance.
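The power laws covered in the lecture relate training loss to model size, data, and compute. As a minimal sketch of the idea (not material from the lecture itself), the snippet below uses a hypothetical loss curve of the form L(N) = (N_c / N)^alpha, with illustrative constants, to show the characteristic property of a power law: scaling the model by a fixed factor multiplies the loss by a fixed ratio, regardless of the starting size.

```python
# Illustrative power-law scaling of loss with parameter count N.
# The form L(N) = (N_c / N) ** alpha is a common way such laws are
# written; the constants below are hypothetical, for demonstration only.
def loss(n_params: float, n_c: float = 8.8e13, alpha: float = 0.076) -> float:
    return (n_c / n_params) ** alpha

# Doubling model size reduces loss by the same factor at any scale:
ratio_small = loss(2e8) / loss(1e8)
ratio_large = loss(2e10) / loss(1e10)
# Both ratios equal 2 ** (-alpha), independent of the starting size.
```

The same scale-invariance argument applies when the law is written in terms of dataset size or training compute instead of parameter count.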

Syllabus

Lec 20. Scaling Laws

Taught by

MIT OpenCourseWare
