Class Central is learner-supported. When you buy through links on our site, we may earn an affiliate commission.

Statistical Physics, Neural Networks, and Neuroscience: From Then to Now

New York University (NYU) via YouTube

Overview

Explore a seminar lecture that traces the transformative developments in machine learning during the early 1980s, highlighting the groundbreaking contributions of John Hopfield and Geoffrey Hinton from 1982 to 1986. Learn about fundamental concepts that shaped modern AI, including associative memories, recurrent neural networks, generative models, and layered neural networks trained by gradient descent. Discover how theoretical physicists, armed with statistical physics expertise, embraced these developments and helped establish the field of theoretical and computational neuroscience. Examine the historical significance of this early work and its lasting impact on contemporary neuroscience research through practical examples and applications.
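To make the associative-memory idea concrete, here is a minimal sketch of a Hopfield-style network in the spirit of the 1982 model the lecture covers: binary patterns are stored with a Hebbian outer-product rule, and recall iterates a sign-threshold update until the state settles into a stored attractor. The function names and the small 8-unit example are illustrative assumptions, not material from the seminar itself.

```python
import numpy as np

def train(patterns):
    """Build the weight matrix from +/-1 patterns via the Hebbian outer-product rule."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / n

def recall(W, state, steps=10):
    """Synchronously update units until the state stops changing (a fixed point)."""
    for _ in range(steps):
        new_state = np.sign(W @ state)
        new_state[new_state == 0] = 1  # break ties toward +1
        if np.array_equal(new_state, state):
            break
        state = new_state
    return state

# Store one pattern, then recover it from a corrupted cue.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train(pattern[None, :])
cue = pattern.copy()
cue[:2] *= -1  # flip two bits to corrupt the memory cue
restored = recall(W, cue)
```

With a single stored pattern and two flipped bits, the update dynamics pull the corrupted cue back to the stored pattern, which is the "memory as attractor" picture that drew statistical physicists to the field.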

Syllabus

ECE AI SEMINAR: Statistical physics, neural networks, and neuroscience: from then to now

Taught by

NYU Tandon School of Engineering

