Volume 2019 (2019): Issue 2 (April 2019)

Journal Details
Format: Journal
First Published: 16 Apr 2015
Publication timeframe: 4 times per year
Languages: English
Access type: Open Access

Privacy Loss Classes: The Central Limit Theorem in Differential Privacy

Published Online: 04 May 2019
Page range: 245 - 269
Received: 31 Aug 2018
Accepted: 16 Dec 2018

Quantifying the privacy loss of a privacy-preserving mechanism on potentially sensitive data is a complex and well-researched topic; the de facto standard privacy measures are ε-differential privacy (DP) and its versatile relaxation, (ε, δ)-approximate differential privacy (ADP). Recently, novel variants of (A)DP have focused on giving tighter privacy bounds under continual observation. In this paper we unify many previous works via the privacy loss distribution (PLD) of a mechanism. We show that for non-adaptive mechanisms, the privacy loss under sequential composition undergoes a convolution and converges to a Gauss distribution (the central limit theorem for DP). This yields several relevant insights: we can now characterize mechanisms by their privacy loss class, i.e., by the Gauss distribution to which their PLD converges, which allows us to give novel ADP bounds for mechanisms based on their privacy loss class; we derive exact analytical guarantees for the approximate randomized response mechanism, and an exact analytical closed formula for the Gauss mechanism that, given ε, calculates δ such that the mechanism is (ε, δ)-ADP (not an over-approximating bound).
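
As a concrete illustration of the last claim, the following minimal Python sketch evaluates a closed-form δ(ε) guarantee for the Gauss mechanism: δ(ε) = Φ(s/(2σ) − εσ/s) − e^ε Φ(−s/(2σ) − εσ/s), where s is the query sensitivity, σ the noise scale, and Φ the standard normal CDF. The function name gaussian_adp_delta and the example parameters are our own illustrative choices, not taken from the paper; the formula is the well-known analytic Gaussian-mechanism bound, which, to our understanding, is the kind of exact expression the abstract refers to.

from math import erf, exp, sqrt

def std_normal_cdf(x):
    # Standard normal CDF expressed through the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def gaussian_adp_delta(eps, sensitivity, sigma):
    # Tight delta of the Gauss mechanism at privacy level eps:
    # delta(eps) = Phi(s/(2*sigma) - eps*sigma/s)
    #              - exp(eps) * Phi(-s/(2*sigma) - eps*sigma/s)
    a = sensitivity / (2.0 * sigma)
    b = eps * sigma / sensitivity
    return std_normal_cdf(a - b) - exp(eps) * std_normal_cdf(-a - b)

# Illustrative (hypothetical) parameters: a sensitivity-1 query, noise sigma = 4, eps = 1.
print(gaussian_adp_delta(eps=1.0, sensitivity=1.0, sigma=4.0))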


