Statistical Physics, Neural Networks, and Neuroscience: From Then to Now
New York University (NYU) via YouTube
Overview
Explore a seminar lecture that traces the transformative developments in machine learning during the early 1980s, highlighting the groundbreaking contributions of John Hopfield and Geoffrey Hinton from 1982 to 1986. Learn about fundamental concepts that shaped modern AI, including associative memories, recurrent neural networks, generative models, and layered neural networks trained by gradient descent. Discover how theoretical physicists, armed with statistical physics expertise, embraced these developments and helped establish the field of theoretical and computational neuroscience. Examine the historical significance of this early work and its lasting impact on contemporary neuroscience research through practical examples and applications.
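To make the lecture's central idea concrete, here is a minimal sketch of a Hopfield associative memory, the 1982 model the overview refers to: patterns are stored with a Hebbian rule, and a corrupted cue is restored by the network's recurrent dynamics. The toy patterns and function names below are illustrative, not taken from the lecture.

```python
import numpy as np

# Two orthogonal +/-1 memory patterns (hypothetical toy data)
p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
patterns = np.stack([p1, p2])
N = patterns.shape[1]

# Hebbian storage rule: W_ij = (1/N) * sum_mu xi^mu_i xi^mu_j, no self-coupling
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def recall(state, sweeps=5):
    """Asynchronously update each neuron by the sign of its input field.

    With symmetric weights and zero diagonal, these updates never increase
    the network's energy, so the state settles into a stored attractor.
    """
    s = state.copy()
    for _ in range(sweeps):
        for i in range(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt two bits of the first memory; the dynamics restore it
probe = p1.copy()
probe[:2] *= -1
print(np.array_equal(recall(probe), p1))  # → True
```

This energy-descent view of recurrent dynamics is exactly the bridge to statistical physics that the seminar traces: stored memories are minima of an Ising-like energy function.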
Syllabus
ECE AI SEMINAR: Statistical physics, neural networks, and neuroscience: from then to now
Taught by
NYU Tandon School of Engineering