
Continual Post-Training for Large Language Models

Neural Magic via YouTube

Overview

This weekly AI seminar explores continual post-training for large language models, addressing the challenge of teaching LLMs new tasks without compromising existing knowledge. Learn about a practical method for post-training continual learning that enables full-model fine-tuning without increasing model size or degrading general capabilities. The presentation focuses on constraining updates to carefully selected low-rank subspaces, allowing models to adapt while preserving past knowledge. Access the related research paper, blog post, and code repository to implement these techniques in your own AI development work. Part of the "Random Samples" series that bridges cutting-edge AI research with practical applications for developers, data scientists, and researchers.
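The seminar's core idea, restricting fine-tuning updates to a low-rank subspace so new knowledge can be absorbed without overwriting old capabilities, can be illustrated with a small sketch. This is not the presenters' actual method; the function name, matrix sizes, and rank value below are illustrative assumptions, and the projection is done with a plain truncated SVD.

```python
import numpy as np

def low_rank_update(delta_w: np.ndarray, rank: int) -> np.ndarray:
    """Project a dense weight update onto its top-`rank` singular subspace.

    Illustrative sketch only: a continual-learning scheme might apply such a
    constrained update instead of the full delta to limit interference with
    previously learned behavior.
    """
    u, s, vt = np.linalg.svd(delta_w, full_matrices=False)
    return u[:, :rank] @ np.diag(s[:rank]) @ vt[:rank, :]

rng = np.random.default_rng(0)
delta = rng.standard_normal((64, 64))         # stand-in for a full fine-tuning update
constrained = low_rank_update(delta, rank=4)  # keep only a 4-dimensional subspace
print(np.linalg.matrix_rank(constrained))
```

The constrained matrix has the same shape as the original update, so it can be added to the weights directly without growing the model, which is the property the talk highlights.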

Syllabus

Random Samples: Continual Post-Training

Taught by

Neural Magic
