Statistical Physics, Neural Networks, and Neuroscience: From Then to Now
New York University (NYU) via YouTube
Overview
Explore a seminar lecture that traces the transformative developments in machine learning during the early 1980s, highlighting the groundbreaking contributions of John Hopfield and Geoffrey Hinton from 1982 to 1986. Learn about fundamental concepts that shaped modern AI, including associative memories, recurrent neural networks, generative models, and layered neural networks trained by gradient descent. Discover how theoretical physicists, armed with statistical physics expertise, embraced these developments and helped establish the field of theoretical and computational neuroscience. Examine the historical significance of this early work and its lasting impact on contemporary neuroscience research through practical examples and applications.
Syllabus
ECE AI SEMINAR: Statistical physics, neural networks, and neuroscience: from then to now
Taught by
NYU Tandon School of Engineering