

Scalable Algorithms for Bayesian Deep Learning via Stochastic Gradient Monte Carlo and Beyond

Institute for Pure & Applied Mathematics (IPAM) via YouTube

Overview

Learn about scalable algorithms for Bayesian deep learning through stochastic gradient Monte Carlo methods in this 32-minute conference talk. Explore why replica exchange Monte Carlo (reMC), the traditional parallel tempering technique, struggles to scale to big data: every swap decision requires evaluating the energy on the full dataset. Understand why naïve mini-batch implementations introduce significant bias, so that reMC cannot be directly extended to the stochastic gradient MCMC (SGMCMC) methods commonly used for deep neural networks. Examine an adaptive replica exchange SGMCMC (reSGMCMC) approach that automatically corrects this bias while maintaining computational efficiency. Analyze the theoretical properties of the method, including the acceleration-accuracy trade-off that arises from numerically discretizing Markov jump processes in a stochastic environment. Review experimental results demonstrating state-of-the-art performance on the CIFAR10, CIFAR100, and SVHN datasets in both supervised and semi-supervised learning tasks, showcasing the practical effectiveness of these sampling techniques for Bayesian deep learning.
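
To make the swap-bias problem concrete, below is a minimal sketch of replica exchange stochastic gradient Langevin dynamics on a toy double-well energy. It is an illustration in the spirit of the talk, not the speaker's implementation: the toy target, step size, temperatures, and the fixed correction constant VAR_PROXY (standing in for a variance term that the actual adaptive method estimates on the fly from the stochastic energies) are all assumptions chosen for demonstration.

import numpy as np

rng = np.random.default_rng(0)

# Toy "dataset": the energy U(x) = mean_i (x^2 - d_i)^2 is a double well,
# and a mini-batch average gives an unbiased estimate of U and its gradient.
data = rng.normal(loc=3.0, scale=1.0, size=1000)

def stoch_energy_grad(x, batch):
    # Mini-batch estimates of U(x) and dU/dx for U(x) = mean_i (x^2 - d_i)^2.
    diff = x ** 2 - batch
    return np.mean(diff ** 2), np.mean(4.0 * x * diff)

TEMPS = (1.0, 10.0)   # one low-temperature and one high-temperature chain
VAR_PROXY = 1.0       # hypothetical fixed stand-in for the adaptively
                      # estimated variance of the stochastic energy difference

def resgld(n_iter=5000, batch_size=32, lr=1e-3):
    x = np.array([2.0, -2.0])  # chain states, one per temperature
    swaps = 0
    for _ in range(n_iter):
        batch = rng.choice(data, size=batch_size)
        # SGLD step for each replica at its own temperature.
        for k, temp in enumerate(TEMPS):
            _, g = stoch_energy_grad(x[k], batch)
            x[k] += -lr * g + np.sqrt(2.0 * lr * temp) * rng.normal()
        # Swap test with stochastic energies. The naive rule would use
        # d_beta * (u_lo - u_hi); subtracting d_beta * VAR_PROXY discounts
        # the noise in the energy estimates to reduce the swap bias.
        u_lo, _ = stoch_energy_grad(x[0], batch)
        u_hi, _ = stoch_energy_grad(x[1], batch)
        d_beta = 1.0 / TEMPS[0] - 1.0 / TEMPS[1]
        log_alpha = d_beta * (u_lo - u_hi - d_beta * VAR_PROXY)
        if np.log(rng.uniform()) < min(0.0, log_alpha):
            x = x[::-1].copy()  # exchange the two replicas
            swaps += 1
    return x, swaps

states, n_swaps = resgld()
print(f"final chain states: {states}, accepted swaps: {n_swaps}")

Dropping the VAR_PROXY term in log_alpha recovers the naïve mini-batch swap test: because the acceptance probability is a convex function of the noisy energy difference, the extra randomness inflates the swap rate and biases the sampled distribution, which is the failure mode the adaptive correction addresses.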

Syllabus

Guang Lin - Scalable algorithms for Bayesian deep learning via stochastic gradient Monte Carlo

Taught by

Institute for Pure & Applied Mathematics (IPAM)

