Overview
This lecture from the Simons Institute features Elad Hazan of Princeton University, who explores how to build neural architectures that move beyond traditional Transformers by applying principles from dynamical systems. The talk presents a novel approach to sequence modeling that uses online control of dynamical systems to achieve superior long-range memory, faster inference, and provable robustness. Part of "The Future of Language Models and Transformers" series, this one-hour lecture examines a spectral perspective on Transformers and investigates methods for reducing the dimensionality of language processing systems.
Syllabus
Reducing the Dimension of Language: A Spectral Perspective on Transformers
Taught by
Simons Institute