Static Posterior Inference of Bayesian Probabilistic Programming via Polynomial Solving

ACM SIGPLAN via YouTube

Overview

Explore a groundbreaking automated approach for deriving guaranteed bounds on normalized posterior distributions in probabilistic programming through polynomial solving. Delve into the innovative method that handles programs with unbounded while loops and continuous distributions with infinite supports. Learn about the classification of programs into 'score-at-end' and 'score-recursive' categories, and understand how a fixed-point theorem and a multiplicative variant of the Optional Stopping Theorem are applied to infer bounds on normalized posterior distributions. Gain insights from the research presented by Peixin Wang, Hongfei Fu, Tengshun Yang, Guanyan Li, and C.-H. Luke Ong in this 11-minute conference talk from ACM SIGPLAN's LAFI'24 event.
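The talk itself does not include code, but the 'score-at-end' category it describes can be illustrated with a small, self-contained sketch: a program with an unbounded while loop followed by a single score (soft conditioning) statement, whose normalized posterior expectation is estimated here by self-normalized importance sampling. All names and the specific model below are hypothetical, chosen only to make the idea concrete.

```python
import random

def score_at_end_program():
    # Hypothetical 'score-at-end' program: run an unbounded while loop
    # (terminates with probability 1), then apply one score at the end.
    x, steps = 0.0, 0
    while x < 1.0:                 # unbounded loop over a random walk
        x += random.uniform(0.0, 0.5)
        steps += 1
    weight = 0.5 ** steps          # score: softly favour shorter runs
    return steps, weight

def posterior_mean(n=20000, seed=0):
    # Self-normalized importance-sampling estimate of the *normalized*
    # posterior expectation of 'steps' (Monte Carlo, not the static
    # polynomial-solving analysis from the talk).
    random.seed(seed)
    num = den = 0.0
    for _ in range(n):
        value, weight = score_at_end_program()
        num += value * weight
        den += weight
    return num / den

estimate = posterior_mean()
```

The sampler above only estimates the posterior mean; the approach presented in the talk instead derives guaranteed bounds on such normalized posterior quantities statically, via polynomial solving, without running the program.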

Syllabus

[LAFI'24] Static Posterior Inference of Bayesian Probabilistic Programming via Polynomial Solving

Taught by

ACM SIGPLAN

