Computational Benefits and Limitations of Transformers and State-Space Models
Overview
Explore the computational advantages and constraints of Transformers and state-space models in this 51-minute lecture by Eran Malach of the Kempner Institute at Harvard University. Delve into the mechanisms enabling retrieval, copying, and length generalization in language models, and examine how architectural choices affect model performance on these fundamental tasks. Discover theoretical and empirical evidence that Transformers outperform LSTMs and state-space models such as Mamba at copying and retrieval. Learn how Transformers' ability to copy long sequences can be harnessed for length generalization across a range of algorithmic and arithmetic tasks, offering insight into the capabilities and limitations of different sequence modeling architectures.
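To make the copying comparison concrete, below is a minimal sketch (not taken from the lecture) of the synthetic copy task studied in this line of work, together with the simple information-theoretic bound that explains why a recurrent model with a fixed-size state cannot copy arbitrarily long inputs. The function names `make_copy_example` and `max_copy_length` are illustrative, as are the example dimensions.

```python
import math
import random


def make_copy_example(length: int, vocab: int, seed: int | None = None):
    """Build one copy-task instance: the model sees the input sequence
    followed by a separator token and must reproduce the input."""
    rng = random.Random(seed)
    src = [rng.randrange(vocab) for _ in range(length)]
    sep = vocab  # reserve one extra symbol as the separator token
    return src + [sep], src  # (prompt, expected continuation)


def max_copy_length(state_dim: int, bits_per_dim: int, vocab: int) -> int:
    """Capacity ceiling for a fixed-state recurrent model: a state
    carrying state_dim * bits_per_dim bits cannot losslessly store
    more bits than that, and each input token costs log2(vocab) bits,
    so inputs longer than this bound cannot all be copied exactly."""
    return int(state_dim * bits_per_dim / math.log2(vocab))


if __name__ == "__main__":
    prompt, target = make_copy_example(length=8, vocab=26, seed=0)
    print("prompt:", prompt)
    print("target:", target)
    # e.g. a 16-dim state at 16 bits per dimension, 26-token vocabulary
    print("max copyable length ~", max_copy_length(16, 16, 26))
```

A Transformer faces no such ceiling on this task: attention lets it look back at the original tokens directly rather than compress them into a fixed-size state, which is the intuition behind the separation results discussed in the lecture.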
Syllabus
Computational Benefits and Limitations of Transformers and State-Space Models
Taught by
Simons Institute