Probabilistic Operator Learning - Generative Modeling and Uncertainty Quantification
Institute for Pure & Applied Mathematics (IPAM) via YouTube
Overview
Explore a 40-minute conference talk that introduces a probabilistic framework for in-context operator learning networks (ICON) and their application to scientific machine learning. Discover how ICON methods, built on foundation model architectures, learn to map example condition-solution pairs of differential equations to approximations of solution operators by training on diverse datasets of initial and boundary conditions paired with corresponding ODE and PDE solutions. Learn about the theoretical foundation that reveals ICON as implicitly performing Bayesian inference: computing the mean of the posterior predictive distribution over solution operators conditioned on the provided context examples. Understand how modeling the dependence among example pairs through random differential equations formalizes ICON as a point estimate of the posterior predictive distribution over operators. Examine the extension of this probabilistic perspective to generative settings, enabling sampling from posterior predictive distributions over solution operators rather than being limited to point predictions. Gain insight into how this approach captures the underlying uncertainty in solution operators and enables principled uncertainty quantification in operator learning, using generative modeling to produce confidence intervals for predicted solutions.
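As a toy illustration of the posterior-predictive view described above (not the talk's actual ICON architecture), one can shrink the "solution operator" down to a single unknown gain a mapping a condition x to a solution y = a·x. The context pairs here stand in for ICON's example condition-solution pairs, and the exact conjugate-Gaussian update shows what "posterior predictive mean plus a confidence interval" means in the simplest possible setting. All names and numbers below are hypothetical.

```python
def posterior_predictive(pairs, x_query, prior_var=10.0, noise_var=0.1):
    """Exact Bayesian posterior predictive for the toy model y = a * x,
    with prior a ~ N(0, prior_var) and observation noise N(0, noise_var).
    Returns the predictive (mean, variance) at the query condition x_query."""
    # Conjugate Gaussian update for the scalar "operator" a.
    precision = 1.0 / prior_var + sum(x * x for x, _ in pairs) / noise_var
    post_var = 1.0 / precision
    post_mean = post_var * sum(x * y for x, y in pairs) / noise_var
    # Predictive distribution at x_query: propagate posterior + noise.
    mean = post_mean * x_query
    var = post_var * x_query**2 + noise_var
    return mean, var

# Context pairs: noisy samples of the true operator y = 2x (made-up data).
pairs = [(1.0, 2.1), (2.0, 3.9), (-1.0, -2.0)]
mean, var = posterior_predictive(pairs, x_query=3.0)
# A 95% confidence interval for the predicted solution at x_query.
ci = (mean - 1.96 * var**0.5, mean + 1.96 * var**0.5)
```

In this caricature, a standard ICON-style network trained on many such context sets would regress directly to `mean` (a point estimate of the posterior predictive mean), while the generative extension discussed in the talk corresponds to sampling from the full predictive distribution, which is what yields intervals like `ci`.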
Syllabus
Benjamin Zhang - Probabilistic operator learning: generative modeling and uncertainty quantification
Taught by
Institute for Pure & Applied Mathematics (IPAM)