
Mean-Field Theory Insights into Neural Feature Dynamics - Lecture 1

International Centre for Theoretical Sciences via YouTube

Overview

Explore mean-field theory applications to understanding neural feature dynamics in this comprehensive lecture delivered at the International Centre for Theoretical Sciences. Delve into the mathematical frameworks that govern how neural networks learn and evolve features through training, with particular emphasis on infinite-width limits and their theoretical implications. Examine the probabilistic foundations underlying neural network behavior and discover how mean-field approaches provide insights into the collective dynamics of neural populations. Learn about the connections between statistical physics principles and machine learning theory, focusing on how these mathematical tools can illuminate the learning process in deep networks. Investigate the theoretical underpinnings of feature formation and evolution in neural systems, understanding how individual neuron behaviors aggregate to produce emergent network-level phenomena. This lecture forms part of the Data Science: Probabilistic and Optimization Methods program, which brings together cutting-edge research in probability theory, optimization, and their applications to modern machine learning challenges.
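The infinite-width limit mentioned above can be illustrated with a minimal numpy sketch (not taken from the lecture itself): under the standard mean-field scaling, where a one-hidden-layer network's output is divided by the square root of its width, the output at random initialization has a width-independent mean and variance, which is what makes the limiting theory well defined. The network shape and sampling sizes here are illustrative choices, not details from the course.

```python
import numpy as np

def network_output(x, width, rng):
    # One-hidden-layer network at random initialization.
    # Weights are drawn i.i.d. N(0, 1); the readout is scaled by
    # 1/sqrt(width), the scaling under which the infinite-width
    # (mean-field / Gaussian-process) limit exists.
    W = rng.normal(size=(width, x.shape[0]))  # input-to-hidden weights
    a = rng.normal(size=width)                # readout weights
    h = np.tanh(W @ x)                        # hidden features
    return a @ h / np.sqrt(width)

rng = np.random.default_rng(0)
x = np.ones(5) / np.sqrt(5)  # fixed unit-norm input

# Output statistics over random initializations: the variance is
# essentially the same for a narrow and a very wide network.
for width in (10, 10000):
    samples = np.array([network_output(x, width, rng) for _ in range(2000)])
    print(width, round(samples.mean(), 2), round(samples.var(), 2))
```

For both widths the empirical mean is near 0 and the variance is near E[tanh(z)^2] for z ~ N(0, 1) (about 0.39), so increasing the width changes the fluctuations around the limit, not the limit itself.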

Syllabus

Mean-Field Theory Insights into Neural Feature Dynamics, Infinite.... (Lecture 1) by Cengiz Pehlevan

Taught by

International Centre for Theoretical Sciences

