Counterfactual Inference with Unobserved Confounding via Exponential Family
Harvard CMSA via YouTube
Overview
Watch a 44-minute lecture from Harvard CMSA featuring MIT professor Devavrat Shah on counterfactual inference with unobserved confounding via exponential family modeling. Learn about personalized decision-making challenges in recommender systems, focusing on how to infer a user's engagement under alternative recommendation sequences while accounting for unobserved factors. Discover a computationally efficient method for learning the distribution's parameters, with estimation error scaling linearly with the metric entropy of the parameter class. Explore sufficient conditions under which compactly supported distributions satisfy a logarithmic Sobolev inequality, and see applications to sequential recommender systems, measurement-error imputation, and undirected graphical models. The lecture covers theoretical foundations including maximum likelihood estimation, proper loss functions, and parameter estimation for heterogeneous users observed through a single trajectory each.
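As a minimal illustration of the exponential family ideas the lecture builds on (this toy example is not from the talk itself), maximum likelihood estimation in an exponential family reduces to matching the expected sufficient statistic to its empirical average. The sketch below works this out for a Bernoulli model of binary engagement, where the data and helper name are hypothetical:

```python
import math

# Hedged sketch: MLE for the natural parameter of a Bernoulli
# exponential family, p(x; theta) = exp(theta * x - A(theta)) for
# x in {0, 1}, with log-partition A(theta) = log(1 + e^theta).
# Setting A'(theta) = sample mean (moment matching) gives the MLE
# theta_hat = logit(sample mean).

def bernoulli_mle_natural_param(xs):
    """Closed-form MLE for the natural parameter (hypothetical helper)."""
    m = sum(xs) / len(xs)
    return math.log(m / (1.0 - m))

# Hypothetical binary engagement observations for one user.
data = [1, 1, 0, 1, 0, 1, 1, 0]
theta_hat = bernoulli_mle_natural_param(data)

# Mapping back through the sigmoid recovers the sample mean,
# confirming the moment-matching property of the MLE.
p_hat = 1.0 / (1.0 + math.exp(-theta_hat))
print(round(p_hat, 3))  # 0.625
```

The talk's setting is far richer (unobserved confounders, a single trajectory per user, and an alternative proper loss in place of the likelihood), but the moment-matching structure above is the basic exponential family mechanism those methods generalize.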
Syllabus
Intro
Sequential Recommender System
Challenges
Problem Setup
Our Approach
Inference Tasks
An Application: Imputing Measurement Error
Remainder of the Talk: The Loss Function, and Why It Works
Maximum Likelihood Estimation
An Alternative
A Proper Loss Function
Proof
Undirected Graphical Model
Back To Our Setting
In Summary: Parameter Estimation
Taught by
Harvard CMSA