Algorithms with More Granular Differential Privacy Guarantees

Authors: Badih Ghazi, Ravi Kumar, Pasin Manurangsi, Thomas Steinke




Author Details

Badih Ghazi
  • Google, Mountain View, CA, USA
Ravi Kumar
  • Google, Mountain View, CA, USA
Pasin Manurangsi
  • Google, Mountain View, CA, USA
Thomas Steinke
  • Google, Mountain View, CA, USA

Cite As

Badih Ghazi, Ravi Kumar, Pasin Manurangsi, and Thomas Steinke. Algorithms with More Granular Differential Privacy Guarantees. In 14th Innovations in Theoretical Computer Science Conference (ITCS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 251, pp. 54:1-54:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023) https://doi.org/10.4230/LIPIcs.ITCS.2023.54

Abstract

Differential privacy is often applied with a privacy parameter that is larger than the theory suggests is ideal; various informal justifications for tolerating large privacy parameters have been proposed. In this work, we consider partial differential privacy (DP), which allows quantifying the privacy guarantee on a per-attribute basis. We study several basic data analysis and learning tasks in this framework, and design algorithms whose per-attribute privacy parameter is smaller than the best possible privacy parameter for the entire record of a person (i.e., all the attributes).
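The gap between per-attribute and whole-record privacy parameters can be illustrated with a minimal sketch (not the paper's algorithms): if a mechanism answers one counting query per attribute with the standard Laplace mechanism, changing a single attribute of one record perturbs only one query, while changing the entire record can perturb all of them, so basic composition makes the whole-record parameter a factor of d larger. All names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def per_attribute_counts(data, eps_per_query):
    """Answer one counting query per attribute with Laplace noise.

    data: n x d binary matrix (rows are records, columns are attributes).
    Each counting query has sensitivity 1, so Laplace noise with scale
    1/eps_per_query makes each query eps_per_query-DP on its own.
    """
    d = data.shape[1]
    return data.sum(axis=0) + rng.laplace(scale=1.0 / eps_per_query, size=d)

d = 10
eps_q = 0.1

# Changing one attribute of one record affects only one query,
# so the per-attribute privacy parameter is eps_q.
eps_attribute = eps_q

# Changing a whole record can affect all d queries; by basic
# composition the whole-record privacy parameter is d * eps_q.
eps_record = d * eps_q
```

Under this (assumed) product structure, the per-attribute guarantee (0.1 here) is a factor of d stronger than the whole-record guarantee (1.0), which is the kind of granularity gap the partial-DP framework is designed to quantify.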

ACM Subject Classification
  • Theory of computation → Theory of database privacy and security
Keywords
  • Differential Privacy
  • Algorithms
  • Per-Attribute Privacy

