Explore a 48-minute research seminar from the USC Probability and Statistics series where Po-Ling Loh presents groundbreaking work on differentially private statistical estimation using noisy composite gradient descent algorithms in high dimensions. Learn about convergence rates for parameter error under local restricted strong convexity and smoothness conditions, with a focus on maintaining Gaussian differential privacy through multivariate noise addition. Discover applications in linear regression and mean estimation, examining M-estimators that effectively downweight individual data points to ensure bounded gradient sensitivity without requiring bounded data domains. Understand how these private estimators enable differentially private confidence intervals for regression coefficients through Lasso debiasing techniques, supported by simulation results demonstrating practical effectiveness. The presentation covers collaborative research conducted with Marco Avella-Medina, Casey Bradshaw, and Zheng Liu.
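To make the core idea concrete, here is a minimal, hypothetical sketch of the kind of noisy gradient descent the talk describes: per-sample gradients are downweighted so their norm is bounded (controlling sensitivity without assuming a bounded data domain), and multivariate Gaussian noise is added before each update. The function name, parameters, and noise calibration here are illustrative assumptions, not the exact algorithm from the paper.

```python
import numpy as np

def noisy_gradient_descent(X, y, steps=300, lr=0.1, clip=2.0, sigma=0.5, seed=0):
    """Illustrative sketch (not the paper's exact algorithm): noisy gradient
    descent for linear regression. Each per-sample gradient is downweighted
    to norm at most `clip`, bounding the sensitivity of the averaged
    gradient; Gaussian noise scaled to that sensitivity is then added."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(steps):
        # Per-sample least-squares gradients, shape (n, d).
        grads = (X @ beta - y)[:, None] * X
        # Downweight so each sample contributes a gradient of norm <= clip.
        norms = np.linalg.norm(grads, axis=1)
        scale = np.minimum(1.0, clip / np.maximum(norms, 1e-12))
        g = (grads * scale[:, None]).mean(axis=0)
        # Multivariate Gaussian noise; std proportional to clip / n is an
        # assumed calibration standing in for the paper's privacy accounting.
        noise = rng.normal(0.0, sigma * clip / n, size=d)
        beta = beta - lr * (g + noise)
    return beta
```

On well-conditioned synthetic data the noisy iterates still land near the least-squares solution, illustrating the talk's point that downweighting individual data points controls sensitivity while preserving statistical accuracy.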