Overview
Explore a groundbreaking system for scalable privacy-preserving machine learning in this IEEE conference talk. Delve into new and efficient protocols for privacy-preserving linear regression, logistic regression, and neural network training using stochastic gradient descent. Discover innovative techniques for secure arithmetic operations on shared decimal numbers and MPC-friendly alternatives to nonlinear functions. Learn how this system, implemented in C++, outperforms state-of-the-art implementations for privacy-preserving linear and logistic regressions, scaling to millions of data samples with thousands of features. Gain insights into the first privacy-preserving system for training neural networks, addressing the critical balance between data utility and privacy concerns in modern machine learning applications.
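The two core ideas mentioned above can be illustrated in plaintext. The sketch below (a simplified illustration, not an actual MPC implementation; the ring size, fractional-bit count, and all function names are assumptions for exposition) shows fixed-point encoding of decimals in an integer ring, additive secret sharing, the local share-truncation trick for multiplication, and a piecewise-linear stand-in for the sigmoid of the kind the talk describes as MPC-friendly.

```python
import random

RING = 2 ** 64          # shares live in the ring Z_{2^64} (an assumed parameter)
FRAC_BITS = 13          # fractional bits in the fixed-point encoding (assumed)

def encode(x: float) -> int:
    """Encode a decimal as a fixed-point ring element."""
    return int(round(x * (1 << FRAC_BITS))) % RING

def decode(v: int) -> float:
    """Decode, treating the upper half of the ring as negative values."""
    if v >= RING // 2:
        v -= RING
    return v / (1 << FRAC_BITS)

def share(v: int):
    """Split v into two additive shares in Z_{2^64}."""
    s0 = random.randrange(RING)
    return s0, (v - s0) % RING

def reconstruct(s0: int, s1: int) -> int:
    return (s0 + s1) % RING

def truncate_share(s: int, party: int) -> int:
    """Each party truncates its own share locally: party 0 shifts its share
    down; party 1 negates, shifts, and negates back.  With high probability
    the reconstruction equals the truncated value up to one unit in the last
    place -- the observation that makes decimal multiplication cheap."""
    if party == 0:
        return s >> FRAC_BITS
    return (-((-s) % RING >> FRAC_BITS)) % RING

# Multiplying two fixed-point encodings doubles the fractional bits, so the
# shares of the product must be truncated back down.  The product itself is
# computed in the clear here only to keep the sketch self-contained.
x, y = 1.5, -2.25
prod = encode(x) * encode(y) % RING
s0, s1 = share(prod)
t0, t1 = truncate_share(s0, 0), truncate_share(s1, 1)
approx = decode(reconstruct(t0, t1))   # close to x * y = -3.375

def mpc_friendly_sigmoid(x: float) -> float:
    """Piecewise-linear replacement for the logistic function: computable
    from comparisons and additions, with no exponentiation."""
    if x < -0.5:
        return 0.0
    if x > 0.5:
        return 1.0
    return x + 0.5
```

Note the design choice the truncation enables: no interaction between the parties is needed to rescale after a multiplication, which is what lets the protocols scale to millions of samples.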
Syllabus
Intro
Privacy-preserving Machine Learning
Our Contributions
Privacy-preserving Linear Regression
Decimal Multiplications in Integer Fields
Truncation on shared values
Effects of Our Technique
Privacy-preserving Logistic Regression
Experimental Results: Linear Regression
Experimental Results: Logistic Regression
Experiments: Neural Networks
Summary
Taught by
IEEE Symposium on Security and Privacy