Overview
Explore the fundamental principles of scaling laws in neural architectures in this 38-minute lecture from MIT's Deep Learning course. Examine power laws and their mathematical foundations, understand the limitations and theoretical underpinnings of scaling relationships, and study the critical problem of batch size optimization. Learn how these scaling principles apply to modern deep learning systems, and how they govern the relationship between model size, data, and computational resources in neural network training and performance.
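As a rough illustration of the power-law relationships the lecture covers, test loss is often modeled as falling off as a power of model size. The sketch below is not from the lecture itself; the functional form, constants, and exponent are illustrative assumptions chosen only to show the shape of such a curve.

```python
import numpy as np

def scaling_loss(n_params, l_inf=1.7, a=400.0, alpha=0.34):
    """Hypothetical power-law loss curve: L(N) = L_inf + a * N**(-alpha).

    l_inf is the irreducible loss floor; a and alpha are illustrative
    constants, not values reported in the lecture.
    """
    return l_inf + a * np.asarray(n_params, dtype=float) ** (-alpha)

# Loss decreases smoothly as model size grows over several orders of magnitude,
# approaching (but never reaching) the irreducible floor l_inf.
sizes = np.array([1e6, 1e7, 1e8, 1e9])
print(scaling_loss(sizes))
```

On a log-log plot, the excess loss `L(N) - l_inf` of this curve is a straight line with slope `-alpha`, which is how power-law fits are typically visualized.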
Syllabus
Lec 20. Scaling Laws
Taught by
MIT OpenCourseWare