

Mean-Field Theory Insights into Neural Feature Dynamics - Lecture 2

International Centre for Theoretical Sciences via YouTube

Overview

Explore advanced theoretical insights into neural network behavior through mean-field theory in this lecture delivered by Cengiz Pehlevan at the International Centre for Theoretical Sciences. Delve into the mathematical foundations that govern neural feature dynamics and understand how infinite-width neural networks behave through the lens of statistical physics and probability theory. Learn how mean-field approaches provide powerful analytical tools for understanding the training dynamics and feature learning capabilities of deep neural networks. Examine the theoretical connections between neural network optimization and statistical mechanics, gaining insights into how networks evolve during training and how features emerge and interact. Discover how these theoretical frameworks can inform the design of more robust and adaptable machine learning systems. This presentation forms part of the comprehensive Data Science: Probabilistic and Optimization Methods II program, which brings together cutting-edge research in probability, optimization, and their applications to modern machine learning challenges.

Syllabus

Mean-Field Theory Insights into Neural Feature Dynamics, Infinite… (Lecture 2) by Cengiz Pehlevan

Taught by

International Centre for Theoretical Sciences

