
Unlocking State-Tracking in Linear RNNs Through Negative Eigenvalues

AutoML Seminars via YouTube

Overview

Watch a technical seminar presentation exploring how extending the eigenvalue range of Linear Recurrent Neural Networks (LRNNs) to include negative values can significantly improve their state-tracking capabilities. Learn why current LRNNs like Mamba and RWKV struggle with state-tracking tasks, and discover how incorporating negative eigenvalues in their state-transition matrices enables them to solve fundamental problems like parity checking. Examine mathematical proofs showing that LRNNs with only positive eigenvalues cannot solve certain tasks, while those with extended eigenvalue ranges can learn any regular language. Explore empirical evidence demonstrating improved performance on state-tracking tasks and comparable stability in language modeling when using this enhanced approach. Gain insights into making modern LRNNs more expressive and versatile for applications in code evaluation, math processing, and other sequence modeling tasks, all while maintaining efficient linear scaling and training costs.
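The parity example described above can be sketched in a few lines. The following is a minimal illustration (not taken from the talk, and the function name is hypothetical): a scalar linear RNN whose input-dependent transition can take the value -1, i.e. a state-transition matrix with a negative eigenvalue, tracks the parity of a bit stream exactly. If the transition were restricted to non-negative values, as in LRNNs with eigenvalues in [0, 1], the state could never change sign, so this construction would be impossible.

```python
def linear_rnn_parity(bits):
    """Track parity with a scalar linear RNN: h_t = a(x_t) * h_{t-1}.

    The transition a(x) is -1 when the input bit is 1 and +1 otherwise,
    i.e. the 1x1 state-transition matrix has a negative eigenvalue.
    The state flips sign on every 1, so its sign encodes parity.
    """
    h = 1.0  # h_0
    for x in bits:
        a = -1.0 if x == 1 else 1.0  # input-dependent eigenvalue
        h = a * h
    # h = +1 after an even number of ones, -1 after an odd number
    return 0 if h > 0 else 1
```

For example, `linear_rnn_parity([1, 0, 1, 1])` returns 1 (three ones, odd parity), while `linear_rnn_parity([1, 1])` returns 0.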

Syllabus

Unlocking State-Tracking in Linear RNNs Through Negative Eigenvalues

Taught by

AutoML Seminars
