
Mirrored Langevin Dynamics - Ya-Ping Hsieh

Alan Turing Institute via YouTube

Overview

Explore a 39-minute conference talk on Mirrored Langevin Dynamics presented by Ya-Ping Hsieh at the Alan Turing Institute. Delve into a unified framework for posterior sampling from constrained distributions, with a focus on Latent Dirichlet Allocation (LDA). Discover novel deterministic and stochastic first-order sampling schemes inspired by mirror descent. Learn about the improved convergence rate of O(epsilon^{-2} d) for general target distributions with strongly convex potential, significantly advancing the previous state of the art. Examine the specialized algorithm for sampling from Dirichlet posteriors, featuring the first non-asymptotic O(epsilon^{-2} d^2 R_0) rate for first-order sampling. Explore the extension of the deterministic framework to mini-batch settings and its convergence rates under stochastic gradients. Gain insights into state-of-the-art experimental results for LDA on real datasets, bridging theoretical foundations with practical applications in statistics, probability, and optimization.
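To make the idea concrete, here is a minimal sketch of mirrored Langevin dynamics for a Dirichlet target on the probability simplex, using the entropic mirror map. With that map, the dual (pushforward) potential has a closed-form gradient, so a plain Langevin step in the dual variable suffices. The step size, chain count, and alpha values below are illustrative choices, not parameters from the talk.

```python
import numpy as np

def sample_dirichlet_mld(alpha, n_chains=2000, n_steps=2000, beta=0.01, seed=0):
    """Approximate samples from Dirichlet(alpha) via mirrored Langevin dynamics.

    Uses the entropic mirror map h(x) = sum_i x_i log x_i on the simplex.
    The pushforward potential W in the dual space then satisfies
        grad W(y) = (sum_i alpha_i) * x(y) - alpha   (first d coordinates),
    so ordinary unconstrained Langevin steps on y give samples whose
    mirror image x(y) targets the constrained Dirichlet distribution.
    """
    rng = np.random.default_rng(seed)
    alpha = np.asarray(alpha, dtype=float)
    d = len(alpha) - 1              # free coordinates; the last is implicit
    alpha_sum = alpha.sum()
    y = np.zeros((n_chains, d))     # dual variable, with y_{d+1} fixed at 0
    for _ in range(n_steps):
        # dual -> primal: x_i = exp(y_i) / (1 + sum_j exp(y_j))
        z = np.exp(y)
        x = z / (1.0 + z.sum(axis=1, keepdims=True))
        grad_w = alpha_sum * x - alpha[:d]          # closed-form dual gradient
        # Euler-Maruyama Langevin step in the unconstrained dual space
        y = y - beta * grad_w + np.sqrt(2.0 * beta) * rng.standard_normal(y.shape)
    z = np.exp(y)
    x = z / (1.0 + z.sum(axis=1, keepdims=True))
    return np.column_stack([x, 1.0 - x.sum(axis=1)])  # points on the full simplex

samples = sample_dirichlet_mld([2.0, 3.0, 4.0])
print(samples.mean(axis=0))  # should be near alpha / sum(alpha) = [2/9, 3/9, 4/9]
```

Note that the iterates y are never projected: the mirror map handles the simplex constraint automatically, which is the appeal of the mirrored framework over constrained Langevin schemes.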

Syllabus

Mirrored Langevin Dynamics - Ya-Ping Hsieh

Taught by

Alan Turing Institute

