Overview
Explore Hopfield networks in this 27-minute video lecture on the physics of associative memory. The lecture presents a foundational model underlying key ideas in neuroscience and machine learning, including Boltzmann machines and dense associative memory. It begins with the protein folding paradox, then works through the energy definition, the Hopfield network architecture, and the inference and learning processes. It examines the model's limitations and perspectives before concluding with a brief discussion of related topics, offering insight into the intersection of physics, neuroscience, and machine learning through associative memory systems.
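The energy, inference, and learning steps the lecture covers can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the lecture: it assumes the classical binary Hopfield model with Hebbian (outer-product) learning, the standard energy function E = -½ sᵀWs, and asynchronous sign updates for inference; the function and variable names are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(patterns):
    """Hebbian learning: W = (1/N) * sum of outer products, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    """Hopfield energy E = -1/2 s^T W s; stored patterns sit in energy minima."""
    return -0.5 * s @ W @ s

def recall(W, s, steps=100):
    """Asynchronous inference: update one random neuron at a time.

    Each update can only lower (or keep) the energy, so the state
    descends into the nearest attractor."""
    s = s.copy()
    n = len(s)
    for _ in range(steps):
        i = rng.integers(n)
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

# Store one pattern, then recover it from a corrupted cue.
p = np.array([1, -1, 1, 1, -1, -1, 1, -1], dtype=float)
W = train(p[None, :])
cue = p.copy()
cue[:2] *= -1                      # flip two bits of the stored pattern
out = recall(W, cue)
print(energy(W, out) <= energy(W, cue))  # recall never raises the energy
```

Storing multiple patterns works the same way (pass a stacked array to `train`), up to the model's well-known capacity limit of roughly 0.14·N random patterns, which is one of the limitations the lecture discusses.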
Syllabus
Introduction
Protein folding paradox
Energy definition
Hopfield network architecture
Inference
Learning
Limitations & Perspective
Shortform
Outro
Taught by
Artem Kirsanov