Overview
Explore a 23-minute research presentation examining shared LoRA (Low-Rank Adaptation) subspaces as a solution for continual learning in artificial intelligence systems. Discover how researchers from Johns Hopkins University approach the challenge of enabling AI models to learn new tasks without forgetting previously acquired knowledge through innovative subspace sharing techniques. Learn about the mathematical foundations behind eigenvector analysis in AI contexts and understand how LoRA adaptations can be structured to maintain performance across multiple learning phases. Examine the practical implications of this approach for developing AI systems that can continuously adapt and expand their capabilities while preserving existing knowledge, addressing one of the fundamental challenges in machine learning known as catastrophic forgetting.
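To make the core idea concrete: LoRA freezes a model's base weights and adds a trainable low-rank update, so new tasks are absorbed by small matrices rather than by overwriting existing knowledge. The sketch below is illustrative only, using NumPy with made-up layer sizes; it is not code from the presentation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen base weight of a linear layer (d_out x d_in); sizes are illustrative.
d_in, d_out, r = 8, 6, 2          # r is the LoRA rank, with r << min(d_in, d_out)
W = rng.normal(size=(d_out, d_in))

# LoRA adds a trainable low-rank update: W_eff = W + B @ A,
# where A is (r x d_in) and B is (d_out x r). Only A and B are trained;
# W stays frozen, which is what protects previously learned behavior.
A = rng.normal(scale=0.01, size=(r, d_in))
B = np.zeros((d_out, r))          # B starts at zero, so W_eff == W before training

def lora_forward(x):
    """Forward pass of a LoRA-adapted linear layer."""
    return x @ (W + B @ A).T

x = rng.normal(size=(4, d_in))
# Before any training the adapter is inert: output matches the frozen layer.
assert np.allclose(lora_forward(x), x @ W.T)
```

In a continual-learning setting, each new task would get (or share) such an A/B pair; the presentation's contribution concerns constraining these adaptations to a shared subspace, analyzed via eigenvectors, so that updates for new tasks interfere less with old ones.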
Syllabus
The Eigenvectors of AI: Shared LoRA Subspaces for Continual Learning
Taught by
Discover AI