
Bayesian Continual Learning and Forgetting in Neural Networks

IPhT-TV via YouTube

Overview

Learn about Bayesian approaches to continual learning in neural networks through this 35-minute conference talk that explores how networks can acquire new knowledge while managing the challenge of catastrophic forgetting. Discover the theoretical foundations of Bayesian continual learning, where uncertainty quantification plays a crucial role in determining which parameters to preserve from previous tasks and which can be adapted for new learning scenarios. Examine practical implementations of Bayesian methods that enable neural networks to maintain performance on previously learned tasks while successfully adapting to new domains or datasets. Understand how probabilistic frameworks can be leveraged to create more robust and flexible learning systems that better mimic human-like learning capabilities, where new knowledge builds upon rather than overwrites existing understanding. Explore the mathematical underpinnings of these approaches and their applications in real-world scenarios where sequential learning is essential.
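The core idea described above, using posterior uncertainty to decide which parameters to preserve across tasks, can be sketched in a toy example. The following is an illustrative sketch only (not code from the talk): a diagonal Gaussian posterior is fit per task, and its precision acts as a per-parameter penalty when learning the next task, so parameters the model is certain about resist change while uncertain ones remain free to adapt.

```python
import numpy as np

# Toy sketch of Bayesian continual learning with a diagonal Gaussian
# posterior (names and setup are illustrative assumptions). After task 1
# we keep the posterior mean and precision; on task 2 the precision acts
# as a per-parameter penalty, so confidently learned parameters are
# preserved while uncertain ones adapt to the new task.

rng = np.random.default_rng(0)

def fit_task(X, y, prior_mean, prior_precision, lr=0.1, steps=500):
    """MAP estimate: gradient descent on squared error + Gaussian prior."""
    w = prior_mean.copy()
    n = len(y)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n + prior_precision * (w - prior_mean)
        w -= lr * grad
    # Laplace-style update: posterior precision = prior + data curvature,
    # so parameters the data constrains tightly get high precision.
    posterior_precision = prior_precision + np.diag(X.T @ X) / n
    return w, posterior_precision

d = 2
# Task 1 depends only on feature 0, so the posterior becomes confident
# about w[0] and stays uncertain about w[1].
X1 = np.zeros((200, d)); X1[:, 0] = rng.normal(size=200)
y1 = 3.0 * X1[:, 0]
w1, prec1 = fit_task(X1, y1, np.zeros(d), np.full(d, 1e-2))

# Task 2 depends only on feature 1. The high precision on w[0] keeps it
# near its task-1 value instead of letting it be overwritten.
X2 = np.zeros((200, d)); X2[:, 1] = rng.normal(size=200)
y2 = -2.0 * X2[:, 1]
w2, prec2 = fit_task(X2, y2, w1, prec1)

print(w2)  # w2[0] stays near 3.0 (preserved); w2[1] adapts toward -2.0
```

After both tasks, the old knowledge (w[0] ≈ 3) survives learning the new task (w[1] ≈ -2), which is the behavior the uncertainty-weighted penalty is meant to deliver; without the precision term, plain retraining would drift w[0] back toward the new prior.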

Syllabus

Bayesian continual learning and forgetting in neural networks - Kellian COTTART

Taught by

IPhT-TV

