Overview
Explore the Bernstein-von Mises theorem in the context of semiparametric mixture models through this mathematical statistics seminar presented by Dr. Stefan Franssen from CNRS, Université Sorbonne Paris Nord. Delve into the theoretical foundations of Bayesian asymptotic theory as it applies to mixture distributions where some components are parametric while others remain nonparametric. Examine how the classical Bernstein-von Mises theorem, which establishes the asymptotic normality of posterior distributions, extends to these more complex semiparametric settings. Learn about the technical challenges that arise when dealing with infinite-dimensional nuisance parameters in mixture models and discover the conditions under which Bayesian inference maintains its desirable asymptotic properties. Gain insights into the interplay between parametric and nonparametric components in statistical modeling and understand the implications for uncertainty quantification in modern statistical applications. This presentation is part of the "Representing, calibrating & leveraging prediction uncertainty from statistics to machine learning" program at the Isaac Newton Institute.
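The classical parametric result referred to above can be sketched as follows (a standard textbook formulation, not taken from the talk itself):

```latex
% Classical Bernstein-von Mises theorem (finite-dimensional case):
% under regularity conditions on the model {P_theta : theta in Theta},
% the posterior distribution is asymptotically normal, centred at an
% efficient estimator, in total variation:
\[
  \bigl\| \Pi\bigl(\cdot \mid X_1,\dots,X_n\bigr)
        - \mathcal{N}\!\bigl(\hat\theta_n,\; n^{-1} I(\theta_0)^{-1}\bigr)
  \bigr\|_{\mathrm{TV}}
  \;\xrightarrow{\;P_{\theta_0}\;}\; 0,
\]
% where \hat\theta_n is an efficient estimator (e.g. the MLE) and
% I(\theta_0) is the Fisher information at the true parameter \theta_0.
```

In the semiparametric mixture setting discussed in the seminar, the role of \(\theta\) is played by the finite-dimensional component of interest, while the nonparametric mixing distribution acts as an infinite-dimensional nuisance parameter; establishing the analogous statement there is the technical challenge the talk addresses.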
Syllabus
Date: 12th Aug 2025 - 10:30 to 11:30
Taught by: Dr. Stefan Franssen (CNRS, Université Sorbonne Paris Nord)
Venue: INI Seminar Room 2