Differentially Private M-Estimation via Noisy Optimization
USC Probability and Statistics Seminar via YouTube
Overview
Explore a 48-minute research seminar from the USC Probability and Statistics series where Po-Ling Loh presents groundbreaking work on differentially private statistical estimation using noisy composite gradient descent algorithms in high dimensions. Learn about convergence rates for parameter error under local restricted strong convexity and smoothness conditions, with a focus on maintaining Gaussian differential privacy through multivariate noise addition. Discover applications in linear regression and mean estimation, examining M-estimators that effectively downweight individual data points to ensure bounded gradient sensitivity without requiring bounded data domains. Understand how these private estimators enable differentially private confidence intervals for regression coefficients through Lasso debiasing techniques, supported by simulation results demonstrating practical effectiveness. The presentation covers collaborative research conducted with Marco Avella-Medina, Casey Bradshaw, and Zheng Liu.
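To make the core mechanism concrete, here is a minimal sketch of noisy gradient descent for linear regression in Python/NumPy. This is an illustrative stand-in, not the algorithm from the talk: the presented work uses a noisy composite gradient method with M-estimators that downweight observations, whereas this sketch bounds gradient sensitivity by per-example gradient clipping, a common substitute. The function name and all parameters (noisy_gd, eta, T, clip, sigma) are hypothetical, and sigma is a placeholder noise multiplier rather than a calibrated Gaussian-DP parameter.

```python
import numpy as np

def noisy_gd(X, y, eta=0.1, T=200, clip=1.0, sigma=2.0, seed=0):
    """Sketch of differentially private noisy gradient descent for
    least-squares regression. Per-example gradients are clipped to
    norm `clip` so any single data point has bounded influence (the
    talk achieves this via downweighting M-estimators instead), then
    spherical Gaussian noise scaled to that sensitivity is added at
    every iteration. `sigma` is an illustrative noise multiplier, not
    a calibrated privacy parameter."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(T):
        grads = (X @ theta - y)[:, None] * X               # per-example gradients, shape (n, d)
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads *= np.minimum(1.0, clip / np.maximum(norms, 1e-12))  # clip: bounded sensitivity
        noise = rng.normal(0.0, sigma * clip / n, size=d)   # multivariate Gaussian mechanism noise
        theta -= eta * (grads.mean(axis=0) + noise)
    return theta

# Illustrative usage on synthetic data (all values arbitrary).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 5))
    beta = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
    y = X @ beta + rng.normal(size=500)
    print(noisy_gd(X, y))
```

Note that, as in the seminar's setting, no boundedness of the data domain is assumed here; only the per-step gradient contribution of each point is bounded, which is what lets Gaussian noise of fixed scale provide the privacy guarantee.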
Syllabus
Po-Ling Loh: Differentially private M-estimation via noisy optimization (University of Cambridge)
Taught by
USC Probability and Statistics Seminar