Overview
Explore the Tolman-Eichenbaum Machine (TEM), a computational model that unifies memory and spatial navigation in the hippocampal formation. Examine the model's architecture, including its position and memory modules, and follow its operation step by step. Review the model's performance, the cellular representations it learns, and its ability to predict remapping laws. Learn how this framework relates to Transformer networks and how it informs our understanding of cognitive map building. This 24-minute video lecture connects computational neuroscience, artificial intelligence, and the study of memory and spatial navigation.
Syllabus
- Introduction
- Motivation: Agents, Rewards and Actions
- Prediction Problem
- Model Architecture
- Position Module
- Memory Module
- Running TEM Step-by-Step
- Model Performance
- Cellular Representations
- TEM Predicts Remapping Laws
- Recap and Acknowledgments
- TEM as a Transformer Network
- Brilliant
- Outro
Taught by
Artem Kirsanov