Sculpting Subspaces - Constrained Full Fine-Tuning for Continual Learning in LLMs

DevConf via YouTube

Overview

Learn how to address catastrophic forgetting in Large Language Models through a conference talk that presents a novel full-parameter continual learning method using adaptive Singular Value Decomposition (SVD). Discover how to dynamically identify and protect task-critical subspaces while constraining updates to orthogonal low-rank directions, enabling models to retain previous knowledge without adding extra parameters. Explore the methodology that achieves up to 7% higher accuracy compared to strong baselines like O-LoRA while maintaining general language capabilities and safety. Examine the theoretical foundations and extensive empirical results that demonstrate a scalable approach toward continually evolving LLMs, presented by Nikhil Shivakumar Nayak at DevConf.US 2025.
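The core idea described above — protecting task-critical subspaces and restricting updates to orthogonal directions — can be sketched with plain NumPy. This is a minimal illustration, not the talk's actual method: the function name `project_update` and the fixed rank `k` are assumptions, and the real approach chooses the protected rank adaptively per layer. The sketch projects a candidate weight update away from the span of a weight matrix's top-k singular vectors, so the update cannot disturb those directions.

```python
import numpy as np

def project_update(W, dW, k):
    """Remove from dW its components lying in the top-k left and right
    singular subspaces of W, leaving only orthogonal low-rank directions.
    (Illustrative sketch; the talk's method selects k adaptively.)"""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    Uk = U[:, :k]          # protected left singular subspace
    Vk = Vt[:k, :].T       # protected right singular subspace
    # Project out the component in the protected column space of W
    dW_safe = dW - Uk @ (Uk.T @ dW)
    # Then project out the component in the protected row space of W
    dW_safe = dW_safe - (dW_safe @ Vk) @ Vk.T
    return dW_safe

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 6))    # toy "pretrained" weight matrix
dW = rng.standard_normal((8, 6))   # toy gradient-style update
dW_safe = project_update(W, dW, k=2)

# The constrained update has no component along the protected directions:
U, _, Vt = np.linalg.svd(W, full_matrices=False)
print(np.allclose(U[:, :2].T @ dW_safe, 0, atol=1e-10))   # True
print(np.allclose(dW_safe @ Vt[:2, :].T, 0, atol=1e-10))  # True
```

Because `dW_safe` is orthogonal to the top singular subspaces, applying it leaves the dominant input-output behavior of `W` largely intact, which is the intuition behind retaining prior-task knowledge during full fine-tuning.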

Syllabus

Sculpting Subspaces: Constrained Full Fine-Tuning for Continual Learning in LLMs - DevConf.US 2025

Taught by

DevConf

