Abstract
Many applications of machine learning, for example in health care, would benefit from methods that can guarantee the privacy of data subjects. Differential privacy (DP) has become established as a standard for protecting learning results. Standard DP algorithms, however, either require a single trusted party to have access to the entire data, which is a clear weakness, or add prohibitive amounts of noise. We consider DP Bayesian learning in a distributed setting, where each party holds only a single sample or a few samples of the data. We propose a learning strategy based on a secure multi-party sum function for aggregating summaries from the data holders and the Gaussian mechanism for DP. Our method builds on an asymptotically optimal and practically efficient DP Bayesian inference method, adding only a rapidly diminishing extra cost.
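The core idea in the abstract, combining a secure multi-party sum with distributed Gaussian-mechanism noise, can be illustrated with a small simulation. This is a hedged sketch, not the paper's actual protocol: the pairwise-mask secure sum, the scalar summaries, and all parameter values below are illustrative assumptions. Each party masks its summary with pairwise random values that cancel in the aggregate, and adds its own share of the DP noise so that the total noise matches a central Gaussian mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 10        # number of data holders (illustrative)
sigma = 1.0   # total Gaussian-mechanism noise std; in practice set by (epsilon, delta)

# Each party holds a bounded data summary (hypothetical scalar values here).
summaries = rng.uniform(0, 1, size=M)

# --- Secure multi-party sum via pairwise additive masks (one simple realisation) ---
# Parties i and j share a random mask r_ij; i adds it, j subtracts it, so all
# masks cancel in the aggregate and no single message reveals a party's summary.
upper = np.triu(rng.normal(0, 100, size=(M, M)), k=1)
pair_masks = upper - upper.T          # antisymmetric matrix: masks cancel in total

# Each party also adds its share of the DP noise, N(0, sigma^2 / M),
# so the aggregated noise is N(0, sigma^2), matching a central Gaussian mechanism.
dp_shares = rng.normal(0, sigma / np.sqrt(M), size=M)

messages = summaries + pair_masks.sum(axis=1) + dp_shares
noisy_sum = messages.sum()

# The pairwise masks cancel: the aggregate equals the true sum plus the DP noise.
print(np.isclose(noisy_sum, summaries.sum() + dp_shares.sum()))
```

Individual `messages` look like noise because of the large masks, yet their sum reveals only the DP-protected total, which is the property the distributed setting in the paper relies on.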
Original language | English |
---|---|
Title of host publication | Advances in Neural Information Processing Systems 30 |
Subtitle of host publication | Proceedings of NIPS 2017 |
Publisher | Curran Associates, Inc. |
Pages | 3227–3236 |
Number of pages | 10 |
Publication status | Published - 2017 |
MoE publication type | A4 Article in a conference publication |
Event | Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, United States. Duration: 4 Dec 2017 → 9 Dec 2017. Conference number: 31 |
Publication series
Name | Advances in Neural Information Processing Systems |
---|---|
Publisher | Curran Associates |
Volume | 30 |
ISSN (Print) | 1049-5258 |
Conference
Conference | Conference on Neural Information Processing Systems |
---|---|
Abbreviated title | NIPS |
Country | United States |
City | Long Beach |
Period | 04/12/2017 → 09/12/2017 |