Overview
Explore Bayesian inference through the derivation of Maximum A Posteriori (MAP) estimates in this 59-minute lecture on combining continuous likelihood functions with prior distributions. Learn how to work with a normal prior and normally distributed observation errors in the one-dimensional case, and see how maximizing the log-posterior yields an estimate that is a weighted average of the observed data and the prior knowledge. Master the mathematical foundations of Bayesian statistical analysis by understanding the relationship between the likelihood function, the prior distribution, and the resulting posterior estimate in practical data analysis scenarios.
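The normal-prior, normal-likelihood setting described above has a well-known closed-form MAP estimate: a precision-weighted average of the prior mean and the sample mean. The sketch below illustrates that standard conjugate result in plain Python; the function names, data, and parameter values are illustrative and not taken from the lecture itself.

```python
def map_estimate_normal(data, prior_mean, prior_var, noise_var):
    """Closed-form MAP estimate of a mean, for a 1-D normal likelihood
    combined with a normal prior (the standard conjugate result).

    The posterior mode is a precision-weighted average of the
    prior mean and the sample mean.
    """
    n = len(data)
    prior_prec = 1.0 / prior_var   # precision contributed by the prior
    noise_prec = n / noise_var     # total precision contributed by the data
    sample_mean = sum(data) / n
    return (prior_prec * prior_mean + noise_prec * sample_mean) / (prior_prec + noise_prec)

def log_posterior(mu, data, prior_mean, prior_var, noise_var):
    """Log-posterior up to an additive constant: log-likelihood + log-prior."""
    log_lik = -sum((x - mu) ** 2 for x in data) / (2.0 * noise_var)
    log_prior = -(mu - prior_mean) ** 2 / (2.0 * prior_var)
    return log_lik + log_prior

# Sanity check: a grid search over the log-posterior should agree
# with the closed-form weighted average (toy numbers, chosen for illustration).
data = [2.1, 1.9, 2.4, 2.0]
mu_map = map_estimate_normal(data, prior_mean=0.0, prior_var=1.0, noise_var=0.5)
grid = [i / 1000.0 for i in range(-1000, 4000)]
mu_grid = max(grid, key=lambda m: log_posterior(m, data, 0.0, 1.0, 0.5))
assert abs(mu_map - mu_grid) < 1e-2
```

With a diffuse prior (large `prior_var`) the estimate approaches the sample mean; with very little data it shrinks toward the prior mean, which is exactly the weighted-average behavior the lecture highlights.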
Syllabus
UofU | Foundations of Data Analysis | Spring 2026 | L5: Bayesian Inference
Taught by
UofU Data Science