Abstract
Concerns about data security and confidentiality have been raised when applying machine learning to real-world applications. Differential privacy provides a principled and rigorous privacy guarantee for machine learning models. While it is common to inject noise into a model so that it satisfies a required differential-privacy property, it is generally hard to balance the trade-off between privacy and utility. We show that stochastic gradient Markov chain Monte Carlo (SG-MCMC), a class of scalable Bayesian posterior sampling algorithms, satisfies strong differential privacy when carefully chosen stepsizes are employed. We develop theory on the performance of the proposed differentially-private SG-MCMC method. We conduct experiments to support our analysis, and show that a standard SG-MCMC sampler with minor modifications can reach state-of-the-art performance in terms of both privacy and utility for Bayesian learning.
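To make the connection concrete, the sketch below shows the canonical SG-MCMC sampler, stochastic gradient Langevin dynamics (SGLD), on a toy target. This is an illustrative assumption, not the paper's exact algorithm: each update combines a stepsize-scaled gradient step with Gaussian noise whose variance matches the stepsize, and it is this injected noise, under carefully chosen stepsizes, that the abstract links to differential privacy. The function name `sgld_sample` and the toy standard-normal target are hypothetical choices for the example.

```python
import numpy as np

def sgld_sample(grad_log_post, theta0, stepsize, n_steps, rng):
    """Illustrative SGLD sketch (not the paper's exact method):
    theta <- theta + (h/2) * grad log p(theta) + N(0, h)."""
    theta = theta0
    samples = []
    for _ in range(n_steps):
        # Gaussian noise with variance equal to the stepsize h;
        # the noise scale is tied to the stepsize schedule.
        noise = rng.normal(0.0, np.sqrt(stepsize))
        theta = theta + 0.5 * stepsize * grad_log_post(theta) + noise
        samples.append(theta)
    return np.array(samples)

# Toy target: a standard normal posterior, so grad log p(theta) = -theta.
rng = np.random.default_rng(0)
samples = sgld_sample(lambda t: -t, theta0=0.0, stepsize=0.05,
                      n_steps=50_000, rng=rng)
post_burnin = samples[10_000:]  # discard burn-in before summarizing
```

With a small fixed stepsize the chain's stationary distribution closely approximates the target, so the post-burn-in samples should have mean near 0 and standard deviation near 1.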
| Original language | English |
|---|---|
| State | Published - 2020 |
| Event | 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019 - Naha, Japan Duration: Apr 16 2019 → Apr 18 2019 |
Conference
| Conference | 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019 |
|---|---|
| Country/Territory | Japan |
| City | Naha |
| Period | 04/16/19 → 04/18/19 |
Fingerprint
Research topics of 'On connecting stochastic gradient MCMC and differential privacy'.