
Emergence and Scaling Laws for SGD Learning and Learning Compositional Functions with Transformers

Simons Institute via YouTube

Overview

This talk by Jason Lee of Princeton University explores two key topics in deep learning theory: emergence and scaling laws for stochastic gradient descent (SGD) learning, and how transformers learn compositional functions. Presented at the Simons Institute as part of its Deep Learning Theory program, the lecture delves into mathematical frameworks that explain how neural networks learn and scale, with particular focus on the transformer architectures that have revolutionized natural language processing and other domains.

Syllabus

Emergence and Scaling Laws for SGD Learning and Learning Compositional Functions with Transformers

Taught by

Simons Institute
