Overview
Learn how to address catastrophic forgetting in Large Language Models through a conference talk that presents a novel full-parameter continual learning method using adaptive Singular Value Decomposition (SVD). Discover how to dynamically identify and protect task-critical subspaces while constraining updates to orthogonal low-rank directions, enabling models to retain previous knowledge without adding extra parameters. Explore the methodology that achieves up to 7% higher accuracy compared to strong baselines like O-LoRA while maintaining general language capabilities and safety. Examine the theoretical foundations and extensive empirical results that demonstrate a scalable approach toward continually evolving LLMs, presented by Nikhil Shivakumar Nayak at DevConf.US 2025.
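The core idea described above, constraining weight updates to directions orthogonal to a protected singular subspace, can be illustrated with a minimal NumPy sketch. This is a toy projection of the general technique, not the talk's actual method: the function name `protect_subspace_update` and the fixed rank `k` are illustrative assumptions, whereas the presented approach identifies task-critical subspaces adaptively.

```python
import numpy as np

def protect_subspace_update(W, G, k):
    """Project gradient update G onto the orthogonal complement of the
    top-k left singular subspace of weight matrix W, so the update leaves
    the protected (task-critical) directions untouched.

    Toy illustration only; the talk's method chooses subspaces adaptively.
    """
    # SVD of the current weights; columns of U span the output directions
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    U_k = U[:, :k]  # top-k singular directions treated as protected
    # Remove the component of G that lies in the protected column space
    return G - U_k @ (U_k.T @ G)

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 6))   # pretend weight matrix
G = rng.standard_normal((8, 6))   # pretend gradient update
G_safe = protect_subspace_update(W, G, k=2)

# The constrained update has no component along the protected directions
U, _, _ = np.linalg.svd(W, full_matrices=False)
print(np.allclose(U[:, :2].T @ G_safe, 0.0, atol=1e-10))  # True
```

Applying `W + G_safe` then perturbs only the orthogonal complement of the top-k subspace, which is the sense in which full-parameter updates can proceed without overwriting previously learned directions.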
Syllabus
Sculpting Subspaces: Constrained Full Fine-Tuning for Continual Learning in LLMs - DevConf.US 2025
Taught by
DevConf