Overview
Learn about Bayesian approaches to continual learning in neural networks through this 35-minute conference talk that explores how networks can acquire new knowledge while managing the challenge of catastrophic forgetting. Discover the theoretical foundations of Bayesian continual learning, where uncertainty quantification plays a crucial role in determining which parameters to preserve from previous tasks and which can be adapted for new learning scenarios. Examine practical implementations of Bayesian methods that enable neural networks to maintain performance on previously learned tasks while successfully adapting to new domains or datasets. Understand how probabilistic frameworks can be leveraged to create more robust and flexible learning systems that better mimic human-like learning capabilities, where new knowledge builds upon rather than overwrites existing understanding. Explore the mathematical underpinnings of these approaches and their applications in real-world scenarios where sequential learning is essential.
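The core idea described above — new knowledge building on rather than overwriting old knowledge — has an exact form in conjugate models: the posterior after one task becomes the prior for the next, and sequential updating then matches batch learning, so nothing is forgotten. As a minimal, hedged sketch (not code from the talk, which concerns neural networks where this update can only be approximated), here is recursive Bayesian linear regression; all names and the noise level are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def posterior(m0, L0, X, y, beta=25.0):
    """One Bayesian update for linear regression:
    prior N(m0, L0^-1) -> posterior N(m, L^-1)."""
    L = L0 + beta * X.T @ X  # precision accumulates evidence from the data
    m = np.linalg.solve(L, L0 @ m0 + beta * X.T @ y)
    return m, L

d = 3
w_true = np.array([1.0, -2.0, 0.5])

def make_task(n):
    X = rng.normal(size=(n, d))
    y = X @ w_true + 0.2 * rng.normal(size=n)  # noise std 0.2 -> beta = 25
    return X, y

Xa, ya = make_task(200)  # "task A"
Xb, yb = make_task(200)  # "task B"

# Continual learning: the posterior after task A is the prior for task B.
m0, L0 = np.zeros(d), np.eye(d)
mA, LA = posterior(m0, L0, Xa, ya)
mAB, LAB = posterior(mA, LA, Xb, yb)

# Batch learning: all data seen at once.
m_joint, L_joint = posterior(m0, L0,
                             np.vstack([Xa, Xb]),
                             np.concatenate([ya, yb]))

# Exact sequential Bayes matches the joint posterior: no forgetting.
print(np.allclose(mAB, m_joint))  # True
```

In a neural network the posterior is intractable, so methods covered in talks like this one approximate it (e.g. with a diagonal Gaussian), and the per-parameter posterior precision is what signals which weights are too certain, and too important, to overwrite.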
Syllabus
Bayesian continual learning and forgetting in neural networks - Kellian COTTART
Taught by
IPhT-TV