Overview
This weekly AI seminar explores continual post-training for large language models, addressing the challenge of teaching LLMs new tasks without compromising existing knowledge. Learn about a practical method for post-training continual learning that enables full-model fine-tuning without increasing model size or degrading general capabilities. The presentation focuses on constraining updates to carefully selected low-rank subspaces, allowing models to adapt while preserving past knowledge. Access the related research paper, blog post, and code repository to implement these techniques in your own AI development work. Part of the "Random Samples" series that bridges cutting-edge AI research with practical applications for developers, data scientists, and researchers.
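To make the core idea concrete, here is a minimal NumPy sketch of constraining a weight update to a low-rank subspace. The subspace-selection criterion shown (top singular vectors of the pretrained weights) is an illustrative assumption, not necessarily the method presented in the talk; all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy pretrained weight matrix and a candidate full-rank update
# (e.g., a gradient step from fine-tuning on a new task).
W = rng.standard_normal((8, 8))
delta = rng.standard_normal((8, 8)) * 0.1

# Choose a low-rank subspace. Here we use the top-k left singular
# vectors of W; the seminar's actual selection criterion may differ.
k = 2
U, _, _ = np.linalg.svd(W)
U_k = U[:, :k]            # orthonormal basis of the chosen subspace
P = U_k @ U_k.T           # rank-k orthogonal projector

# Project the update onto the subspace before applying it, so the
# applied change has rank at most k and the rest of W is untouched.
constrained_delta = P @ delta
W_new = W + constrained_delta
```

Because the projected update lives in a k-dimensional subspace, the effective change to the model is low-rank even though every entry of `W` is eligible to move, which is the intuition behind full-model fine-tuning without growing the parameter count.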
Syllabus
Random Samples: Continual Post-Training
Taught by
Neural Magic