Overview
Explore Bayesian inference through the derivation of Maximum A Posteriori (MAP) estimates in this 59-minute lecture, which focuses on continuous likelihood functions combined with prior distributions. Learn how to work with a normal prior and normally distributed observation errors in the one-dimensional case, and see how maximizing the log-posterior yields a MAP estimate that is a precision-weighted average of the observed data and the prior mean. Master the mathematical foundations of Bayesian statistical analysis by understanding the relationship between the likelihood function, the prior distribution, and the resulting posterior estimate in practical data-analysis scenarios.
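The normal-prior, normal-likelihood setup described above can be sketched numerically. The snippet below is a minimal illustration, not material from the lecture itself: all parameter values (`mu0`, `tau`, `sigma`, the simulated data) are assumptions chosen for demonstration. It computes the closed-form MAP estimate for the one-dimensional normal case and checks it by maximizing the log-posterior on a grid.

```python
import numpy as np

# Hypothetical example (values are assumptions, not from the lecture):
# prior    theta ~ N(mu0, tau^2)
# data     x_i   ~ N(theta, sigma^2), sigma known
rng = np.random.default_rng(0)

mu0, tau = 2.0, 1.0          # prior mean and prior standard deviation
sigma = 0.5                  # known observation noise
data = rng.normal(3.0, sigma, size=20)

n, xbar = len(data), data.mean()

# Closed-form MAP estimate (equal to the posterior mean here, since the
# posterior is also normal): a precision-weighted average of the prior
# mean and the sample mean.
map_closed = (mu0 / tau**2 + n * xbar / sigma**2) / (1 / tau**2 + n / sigma**2)

# Sanity check: maximize the log-posterior (up to an additive constant)
# over a fine grid of candidate theta values.
theta = np.linspace(0.0, 5.0, 100_001)
log_post = (-0.5 * (theta - mu0) ** 2 / tau**2                        # log prior
            - 0.5 * ((data[:, None] - theta) ** 2).sum(axis=0) / sigma**2)  # log likelihood
map_grid = theta[np.argmax(log_post)]

print(map_closed, map_grid)  # the two estimates agree to grid resolution
```

Because the data precision `n / sigma**2` dominates the prior precision `1 / tau**2` in this setup, the MAP estimate lands close to the sample mean; shrinking `n` or `tau` pulls it back toward the prior mean `mu0`, which is the weighted-average behavior the lecture describes.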
Syllabus
UofU | Foundations of Data Analysis | Spring 2026 | L5: Bayesian Inference
Taught by
UofU Data Science