Introduction to Bayesian Statistics for Data Science

University of Colorado Boulder via Coursera

Overview

This course introduces the theoretical, philosophical, and mathematical foundations of Bayesian statistical inference. Students will learn to apply this foundational knowledge to real-world data science problems. Topics include the use and interpretation of probability theory in Bayesian inference; Bayes' theorem for statistical parameters; conjugate, improper, and objective prior distributions; data science applications of Bayesian inference; and ethical implications of Bayesian statistics.

This course can be taken for academic credit as part of CU Boulder's Master of Science in Data Science (MS-DS) degree offered on the Coursera platform. The MS-DS is an interdisciplinary degree that brings together faculty from CU Boulder's departments of Applied Mathematics, Computer Science, Information Science, and others. With performance-based admissions and no application process, the MS-DS is ideal for individuals with a broad range of undergraduate education and/or professional experience in computer science, information science, mathematics, and statistics. Learn more about the MS-DS program at https://www.coursera.org/degrees/master-of-science-data-science-boulder.

Syllabus

  • Philosophical Underpinnings of Bayesian Statistics
    • This module introduces learners to Bayesian statistics by comparing Bayesian and frequentist methods. The introduction is motivated by an example illustrating how different assumptions about data collection (specifically, stopping rules) can lead to different conclusions when using frequentist methods. Bayesian methods, by contrast, yield the same conclusion regardless of the stopping rule. This example illuminates a key philosophical difference between frequentist and Bayesian methods.
  • Introduction to Bayesian Inference and Prediction
    • This module introduces learners to Bayesian inference through an example using discrete data. The example demonstrates how the posterior distribution is calculated and how uncertainty is quantified in Bayesian statistics. The module also describes methods for summarizing the posterior distribution and introduces learners to the posterior predictive distribution through Monte Carlo simulation. Monte Carlo simulation will be important for advanced computational Bayesian methods.
  • Introduction to Conjugate Families
    • This module introduces learners to methods for conducting Bayesian inference when the likelihood and prior distributions come from a convenient family of distributions, called conjugate families. A conjugate family is a class of prior distributions for which the posterior distribution is in the same class. The module covers the beta-binomial, normal-normal, and inverse gamma-normal conjugate families and includes examples of their application to find posterior distributions in R.
  • Improper and Objective Priors
    • This module motivates, defines, and utilizes improper and so-called "objective" prior distributions in Bayesian statistical inference.
  • Multiparameter Inference
    • In this module, learners will be introduced to Bayesian inference involving more than one unknown parameter. Multiparameter problems are motivated with a simple example: a conjugate prior, two-parameter model involving normally distributed data. From there, we learn to solve more complex problems, including Bayesian linear regression and variance-covariance matrix estimation.
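The conjugate updating and Monte Carlo prediction described in the second and third modules can be sketched briefly. The course itself works in R; the following is an illustrative Python sketch using the beta-binomial family, with an assumed uniform Beta(1, 1) prior and invented data (7 successes in 10 trials):

```python
import numpy as np

# Beta-binomial conjugate update:
# prior theta ~ Beta(a, b), data y successes in n Bernoulli trials
# => posterior theta | y ~ Beta(a + y, b + n - y).
def beta_binomial_posterior(a, b, y, n):
    return a + y, b + (n - y)

# Uniform Beta(1, 1) prior; invented data: 7 successes in 10 trials.
a_post, b_post = beta_binomial_posterior(1, 1, 7, 10)
post_mean = a_post / (a_post + b_post)  # posterior mean of theta

# Monte Carlo approximation of the posterior predictive distribution
# for the number of successes in 5 future trials: draw theta from the
# posterior, then draw a future count given each theta.
rng = np.random.default_rng(0)
theta_draws = rng.beta(a_post, b_post, size=100_000)
y_future = rng.binomial(5, theta_draws)
pred_probs = np.bincount(y_future, minlength=6) / y_future.size
```

Because the beta prior is conjugate to the binomial likelihood, the posterior is available in closed form; the Monte Carlo step is only needed for the predictive distribution, which averages the binomial likelihood over posterior uncertainty in theta.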
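The multiparameter module's Bayesian linear regression admits a similar closed-form sketch under simplifying assumptions: known noise variance and an independent Gaussian prior on the coefficients (the data and hyperparameters below are invented for illustration):

```python
import numpy as np

# Conjugate Bayesian linear regression with known noise variance sigma2
# and prior beta ~ N(0, tau2 * I). The posterior is Gaussian:
#   Sigma_post = (X^T X / sigma2 + I / tau2)^{-1}
#   mu_post    = Sigma_post @ X^T y / sigma2
rng = np.random.default_rng(1)
n, p = 50, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one covariate
true_beta = np.array([1.0, 2.0])                       # invented "truth"
sigma2, tau2 = 0.25, 10.0                              # assumed hyperparameters
y = X @ true_beta + rng.normal(scale=np.sqrt(sigma2), size=n)

Sigma_post = np.linalg.inv(X.T @ X / sigma2 + np.eye(p) / tau2)
mu_post = Sigma_post @ (X.T @ y) / sigma2
```

With a large prior variance `tau2`, the posterior mean is close to the least-squares estimate; shrinking `tau2` pulls the coefficients toward zero, which is how the prior encodes regularization in this model.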

Taught by

Brian Zaharatos
