
YouTube

Guaranteed Bounds on Posterior Distributions of Discrete Probabilistic Programs with Loops

ACM SIGPLAN via YouTube

Overview

This video presents a research talk from POPL 2025 on computing guaranteed bounds on posterior distributions of discrete probabilistic programs that contain loops and conditioning. Discover two methods for tackling Bayesian inference in programs with unbounded support: the residual mass semantics, which provides flat bounds based on the residual probability mass, and the geometric bound semantics, which works with eventually geometric distributions using contraction invariants. Learn how these fully automated techniques produce guaranteed bounds that sandwich the true distribution, a guarantee that sampling-based inference methods cannot provide. The presentation covers theoretical properties, including soundness and convergence, as well as Diabolo, a practical implementation evaluated on a range of benchmarks. The research demonstrates how to bound not just probabilities but also moments and tail asymptotics, which is particularly valuable for programs where exact Bayesian inference is intractable. Presented by researchers from the University of Oxford and Nanyang Technological University, this 20-minute talk includes access to the published article and supplementary materials with reusable artifacts.
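To make the residual-mass idea concrete, here is a minimal Python sketch. It is illustrative only, not the authors' Diabolo tool: the toy program (a geometric loop conditioned on an even iteration count), the function name `residual_mass_bounds`, and the unrolling depth are all assumptions chosen for the example. The key point it demonstrates is how the mass of paths that have not yet terminated ("residual mass") yields a flat interval that sandwiches each unnormalized posterior probability, and how interval arithmetic over the normalizing constant then sandwiches the posterior itself.

```python
# A minimal sketch of the residual-mass idea (hypothetical example, not Diabolo).
# Toy program:
#     n = 0
#     while flip(p_continue): n += 1
#     observe(n % 2 == 0)        # conditioning on an even count
def residual_mass_bounds(p_continue: float, unroll: int):
    """Bound the posterior of N (loop iterations before stopping) after
    unrolling the loop `unroll` times."""
    lower = {}           # lower bounds on unnormalized posterior mass per n
    mass_reaching = 1.0  # probability mass that reaches the current iteration
    for n in range(unroll):
        stop_mass = mass_reaching * (1.0 - p_continue)  # paths stopping at n
        if n % 2 == 0:                                  # survives observe()
            lower[n] = stop_mass
        mass_reaching *= p_continue                     # paths continuing
    residual = mass_reaching  # mass of still-running paths: the bound gap

    # Normalizing constant Z lies in [sum(lower), sum(lower) + residual].
    z_lo = sum(lower.values())
    z_hi = z_lo + residual
    # Posterior bounds via interval arithmetic: take the worst-case end of
    # each interval for numerator and denominator.
    return {n: (lo / z_hi, (lo + residual) / z_lo) for n, lo in lower.items()}

if __name__ == "__main__":
    for n, (lo, hi) in sorted(residual_mass_bounds(0.5, 20).items()):
        print(f"P(N = {n} | even) in [{lo:.6f}, {hi:.6f}]")
```

As described in the talk, these flat bounds shrink as the unrolling depth grows; the geometric bound semantics instead lets the tail decay geometrically (via a contraction invariant) rather than using a single flat residual, which is what enables bounds on moments and tail asymptotics.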

Syllabus

[POPL'25] Guaranteed Bounds on Posterior Distributions of Discrete Probabilistic Programs with Loops

Taught by

ACM SIGPLAN

