The Power of Synergy in Differential Privacy: Combining a Small Curator with Local Randomizers

Authors: Amos Beimel, Aleksandra Korolova, Kobbi Nissim, Or Sheffet, Uri Stemmer


Author Details

Amos Beimel
  • Dept. of Computer Science, Ben-Gurion University, Beer-Sheva, Israel
Aleksandra Korolova
  • Dept. of Computer Science, University of Southern California, Los Angeles, CA, USA
Kobbi Nissim
  • Dept. of Computer Science, Georgetown University, Washington, DC, USA
Or Sheffet
  • Faculty of Engineering, Bar-Ilan University, Ramat Gan, Israel
Uri Stemmer
  • Dept. of Computer Science, Ben-Gurion University, Beer-Sheva, Israel
  • Google Research


We thank Adam Smith for suggesting the select-then-estimate task discussed in the introduction.

Cite As

Amos Beimel, Aleksandra Korolova, Kobbi Nissim, Or Sheffet, and Uri Stemmer. The Power of Synergy in Differential Privacy: Combining a Small Curator with Local Randomizers. In 1st Conference on Information-Theoretic Cryptography (ITC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 163, pp. 14:1-14:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Motivated by the desire to bridge the utility gap between the local and trusted-curator models of differential privacy for practical applications, we initiate the theoretical study of a hybrid model introduced by "Blender" [Avent et al., USENIX Security '17], in which differentially private protocols of n agents working in the local model are assisted by a differentially private curator that has access to the data of m additional users. We focus on the regime where m ≪ n and study the new capabilities of this (m,n)-hybrid model. We show that, although the hybrid model adds no significant new capabilities for the basic task of simple hypothesis testing, there are many other tasks (under a wide range of parameters) that can be solved in the hybrid model yet cannot be solved either by the curator or by the local users separately. Moreover, we exhibit additional tasks where at least one round of interaction between the curator and the local users is necessary: no hybrid-model protocol without such interaction can solve these tasks. Taken together, our results show that the combination of the local model with a small curator can become part of a promising toolkit for designing and implementing differential privacy.
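The contrast between the two components can be illustrated with a toy mean-estimation sketch. This is not one of the paper's actual protocols: the function names and the inverse-variance combination rule are illustrative assumptions. The curator runs the central-model Laplace mechanism over its m users, the n local users apply randomized response, and the two unbiased estimates are merged.

```python
import math
import random

def curator_mean(bits, eps):
    """Central-model DP mean of m bits via the Laplace mechanism."""
    m = len(bits)
    # Sample Laplace noise with scale 1/(m*eps) (sensitivity of the mean is 1/m).
    u = random.random() - 0.5
    noise = -(1.0 / (m * eps)) * math.copysign(math.log(1 - 2 * abs(u)), u)
    return sum(bits) / m + noise

def local_randomizer(bit, eps):
    """Randomized response: report the true bit with prob. e^eps/(e^eps+1)."""
    p = math.exp(eps) / (math.exp(eps) + 1)
    return bit if random.random() < p else 1 - bit

def local_mean(bits, eps):
    """Aggregate n randomized-response reports into an unbiased mean estimate."""
    p = math.exp(eps) / (math.exp(eps) + 1)
    reports = [local_randomizer(b, eps) for b in bits]
    # Debias: E[report] = (1-p) + (2p-1) * true_mean.
    return (sum(reports) / len(reports) - (1 - p)) / (2 * p - 1)

def hybrid_mean(curator_bits, local_bits, eps):
    """Combine the two unbiased estimates by inverse-variance weighting."""
    est_c = curator_mean(curator_bits, eps)
    est_l = local_mean(local_bits, eps)
    m, n = len(curator_bits), len(local_bits)
    p = math.exp(eps) / (math.exp(eps) + 1)
    var_c = 2.0 / (m * eps) ** 2                   # Laplace noise variance
    var_l = p * (1 - p) / (n * (2 * p - 1) ** 2)   # approx. RR noise variance
    w = var_l / (var_c + var_l)
    return w * est_c + (1 - w) * est_l
```

The sketch mirrors the m ≪ n regime: per-sample, the curator's estimate is far more accurate (noise scale 1/(m·eps) versus the 1/√n-rate, constant-factor-inflated randomized-response error), so even a small curator can dominate the combined weight.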

Subject Classification

ACM Subject Classification
  • Security and privacy → Privacy-preserving protocols

Keywords and Phrases
  • differential privacy
  • hybrid model
  • private learning
  • local model




References

  1. Apple. Learning with privacy at scale. Apple Machine Learning Journal, 1, 2017.
  2. Brendan Avent, Aleksandra Korolova, David Zeber, Torgeir Hovden, and Benjamin Livshits. BLENDER: Enabling local search with a hybrid differential privacy model. In 26th USENIX Security Symposium (USENIX Security 17), pages 747-764. USENIX Association, 2017.
  3. C.-A. Azencott. Machine learning and genomics: precision medicine versus patient privacy. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 376(2128):20170350, 2018.
  4. Mitali Bafna and Jonathan Ullman. The price of selection in differential privacy. In Satyen Kale and Ohad Shamir, editors, Proceedings of the 2017 Conference on Learning Theory, volume 65 of Proceedings of Machine Learning Research, pages 151-168. PMLR, 2017.
  5. Borja Balle, James Bell, Adrià Gascón, and Kobbi Nissim. Differentially private summation with multi-message shuffling. CoRR, abs/1906.09116, 2019.
  6. Borja Balle, James Bell, Adrià Gascón, and Kobbi Nissim. The privacy blanket of the shuffle model. In Alexandra Boldyreva and Daniele Micciancio, editors, Advances in Cryptology - CRYPTO 2019 - 39th Annual International Cryptology Conference, Santa Barbara, CA, USA, August 18-22, 2019, Proceedings, Part II, volume 11693 of Lecture Notes in Computer Science, pages 638-667. Springer, 2019.
  7. Raef Bassily, Kobbi Nissim, Adam D. Smith, Thomas Steinke, Uri Stemmer, and Jonathan Ullman. Algorithmic stability for adaptive data analysis. In Proceedings of the 48th Annual ACM SIGACT Symposium on Theory of Computing, STOC 2016, pages 1046-1059, 2016.
  8. Raef Bassily, Kobbi Nissim, Uri Stemmer, and Abhradeep Guha Thakurta. Practical locally private heavy hitters. In Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, pages 2288-2296, 2017.
  9. Raef Bassily and Adam D. Smith. Local, private, efficient protocols for succinct histograms. In Proceedings of the Forty-Seventh Annual ACM on Symposium on Theory of Computing, STOC 2015, pages 127-135, 2015.
  10. Amos Beimel, Aleksandra Korolova, Kobbi Nissim, Or Sheffet, and Uri Stemmer. The power of synergy in differential privacy: Combining a small curator with local randomizers. CoRR, abs/1912.08951, 2019.
  11. Amos Beimel, Kobbi Nissim, and Eran Omri. Distributed private data analysis: Simultaneously solving how and what. In Advances in Cryptology - CRYPTO 2008, 28th Annual International Cryptology Conference, pages 451-468, 2008.
  12. Amos Beimel, Kobbi Nissim, and Uri Stemmer. Characterizing the sample complexity of private learners. In ITCS, pages 97-110. ACM, 2013.
  13. Amos Beimel, Kobbi Nissim, and Uri Stemmer. Private learning and sanitization: Pure vs. approximate differential privacy. Theory of Computing, 12(1):1-61, 2016.
  14. Bonnie Berger and Hyunghoon Cho. Emerging technologies towards enhancing privacy in genomic data sharing. Genome Biology, 20, 2019.
  15. Andrea Bittau, Úlfar Erlingsson, Petros Maniatis, Ilya Mironov, Ananth Raghunathan, David Lie, Mitch Rudominer, Ushasree Kode, Julien Tinnes, and Bernhard Seefeld. Prochlo: Strong privacy for analytics in the crowd. In Proceedings of the 26th Symposium on Operating Systems Principles, SOSP '17, pages 441-459. ACM, 2017.
  16. Charlotte Bonte, Eleftheria Makri, Amin Ardeshirdavani, Jaak Simm, Yves Moreau, and Frederik Vercauteren. Towards practical privacy-preserving genome-wide association study. BMC Bioinformatics, 19(1):537, 2018.
  17. Mark Bun, Jelani Nelson, and Uri Stemmer. Heavy hitters and the structure of local privacy. In Jan Van den Bussche and Marcelo Arenas, editors, Proceedings of the 37th ACM SIGMOD-SIGACT-SIGAI Symposium on Principles of Database Systems, pages 435-447. ACM, 2018.
  18. Mark Bun, Kobbi Nissim, Uri Stemmer, and Salil P. Vadhan. Differentially private release and learning of threshold functions. In FOCS, pages 634-649, 2015.
  19. Clément L. Canonne, Gautam Kamath, Audra McMillan, Adam D. Smith, and Jonathan Ullman. The structure of optimal private tests for simple hypotheses. In STOC, pages 310-321, 2019.
  20. Albert Cheu, Adam D. Smith, Jonathan Ullman, David Zeber, and Maxim Zhilyaev. Distributed differential privacy via shuffling. In Yuval Ishai and Vincent Rijmen, editors, Advances in Cryptology - EUROCRYPT 2019 - 38th Annual International Conference on the Theory and Applications of Cryptographic Techniques, Darmstadt, Germany, May 19-23, 2019, Proceedings, Part I, volume 11476 of Lecture Notes in Computer Science, pages 375-403. Springer, 2019.
  21. Catalin Cimpanu. Capital One hacker took data from more than 30 companies, new court docs reveal. ZDNet, August 14, 2019.
  22. Privacy Rights Clearinghouse. Data Breaches, 2019.
  23. Bolin Ding, Janardhan Kulkarni, and Sergey Yekhanin. Collecting telemetry data privately. In I. Guyon, U. V. Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, and R. Garnett, editors, Advances in Neural Information Processing Systems 30, pages 3571-3580. Curran Associates, Inc., 2017.
  24. Aryeh Dvoretzky, Jack Kiefer, and Jacob Wolfowitz. Asymptotic minimax character of the sample distribution function and of the classical multinomial estimator. The Annals of Mathematical Statistics, 27(3):642-669, 1956.
  25. Cynthia Dwork, Vitaly Feldman, Moritz Hardt, Toniann Pitassi, Omer Reingold, and Aaron Roth. Preserving statistical validity in adaptive data analysis. In ACM Symposium on the Theory of Computing (STOC). ACM, June 2015.
  26. Cynthia Dwork, Frank McSherry, Kobbi Nissim, and Adam Smith. Calibrating noise to sensitivity in private data analysis. In TCC, volume 3876 of Lecture Notes in Computer Science, pages 265-284. Springer, 2006.
  27. Úlfar Erlingsson, Vitaly Feldman, Ilya Mironov, Ananth Raghunathan, Kunal Talwar, and Abhradeep Thakurta. Amplification by shuffling: From local to central differential privacy via anonymity. In Proceedings of the Thirtieth Annual ACM-SIAM Symposium on Discrete Algorithms, SODA '19, pages 2468-2479. Society for Industrial and Applied Mathematics, 2019.
  28. Úlfar Erlingsson, Vasyl Pihur, and Aleksandra Korolova. RAPPOR: Randomized aggregatable privacy-preserving ordinal response. In Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security, CCS '14, pages 1054-1067, New York, NY, USA, 2014. ACM.
  29. Vitaly Feldman and Thomas Steinke. Generalization for adaptively-chosen estimators via stable median. In Proceedings of the 30th Conference on Learning Theory, COLT 2017, pages 728-757, 2017.
  30. Vitaly Feldman and David Xiao. Sample complexity bounds on differentially private learning via communication complexity. SIAM J. Comput., 44(6):1740-1764, 2015.
  31. Marco Gaboardi, Ryan Rogers, and Or Sheffet. Locally private mean estimation: Z-test and tight confidence intervals. CoRR, abs/1810.08054, 2018.
  32. Badih Ghazi, Rasmus Pagh, and Ameya Velingker. Scalable and differentially private distributed aggregation in the shuffled model. CoRR, abs/1906.08320, 2019.
  33. Jeremy Ginsberg, Matthew H Mohebbi, Rajan S Patel, Lynnette Brammer, Mark S Smolinski, and Larry Brilliant. Detecting influenza epidemics using search engine query data. Nature, 457(7232):1012-1014, 2009.
  34. Google Research Blog. Tackling urban mobility with technology, Nov 18, 2015.
  35. Andy Greenberg. Apple's differential privacy is about collecting your data - but not your data. Wired, June 13, 2016.
  36. Andy Greenberg. How one of Apple's key privacy safeguards falls short. Wired, Sep 15, 2017.
  37. Alex Hern. Uber employees 'spied on ex-partners, politicians and Beyoncé'. The Guardian, Dec 13, 2016.
  38. Yuval Ishai, Eyal Kushilevitz, Rafail Ostrovsky, and Amit Sahai. Cryptography from anonymity. In 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS 2006), 21-24 October 2006, Berkeley, California, USA, Proceedings, pages 239-248. IEEE Computer Society, 2006.
  39. Aaron Johnson and Vitaly Shmatikov. Privacy-preserving data exploration in genome-wide association studies. In Proceedings of the 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 1079-1087. ACM, 2013.
  40. Matthew Joseph, Jieming Mao, Seth Neel, and Aaron Roth. The role of interactivity in local differential privacy. CoRR, abs/1904.03564, 2019.
  41. Matthew Joseph, Jieming Mao, and Aaron Roth. Exponential separations in local differential privacy through communication complexity. CoRR, abs/1907.00813, 2019.
  42. Matthew Joseph, Aaron Roth, Jonathan Ullman, and Bo Waggoner. Local differential privacy for evolving data. In Advances in Neural Information Processing Systems 31: Annual Conference on Neural Information Processing Systems 2018, NeurIPS 2018, pages 2381-2390, 2018.
  43. Shiva Prasad Kasiviswanathan, Homin K. Lee, Kobbi Nissim, Sofya Raskhodnikova, and Adam D. Smith. What can we learn privately? SIAM J. Comput., 40(3):793-826, 2011.
  44. Aaron Mak. Do Tech Companies Really Need to Snoop Into Private Conversations to Improve Their A.I.? Slate, Feb 19, 2019.
  45. Pascal Massart. The tight constant in the Dvoretzky-Kiefer-Wolfowitz inequality. The Annals of Probability, pages 1269-1283, 1990.
  46. Andrew McGregor, Ilya Mironov, Toniann Pitassi, Omer Reingold, Kunal Talwar, and Salil P. Vadhan. The limits of two-party differential privacy. In 51st Annual IEEE Symposium on Foundations of Computer Science, FOCS 2010, pages 81-90. IEEE Computer Society, 2010.
  47. Chris Merriman. Microsoft reminds privacy-concerned Windows 10 beta testers that they're volunteers. The Inquirer, Oct 7, 2014.
  48. Microsoft. Windows Insider Program Agreement, Sep 15, 2017.
  49. Ilya Mironov. Beyond software: Hardware-based security, May 8, 2019.
  50. Mozilla. Firefox Privacy Notice, June 4, 2019.
  51. Kobbi Nissim and Uri Stemmer. Personal communication, 2017.
  52. NYC Department of Transportation. New York City Mobility Report, 2016.
  53. Phillip Rogaway. The moral character of cryptographic work. IACR Cryptology ePrint Archive, 2015:1162, 2015.
  54. Stephen Shankland. How Google tricks itself to protect Chrome user privacy. CNET, Oct 31, 2014.
  55. Elaine Shi, T.-H. Hubert Chan, Eleanor G. Rieffel, and Dawn Song. Distributed private data analysis: Lower bounds and practical constructions. ACM Trans. Algorithms, 13(4):50:1-50:38, 2017.
  56. Sean Simmons and Bonnie Berger. Realizing privacy preserving genome-wide association studies. Bioinformatics, 32(9):1293-1300, 2016.
  57. Thomas Steinke and Jonathan Ullman. Tight lower bounds for differentially private selection. In 2017 IEEE 58th Annual Symposium on Foundations of Computer Science (FOCS), pages 552-563. IEEE, 2017.
  58. Uri Stemmer. Locally private k-means clustering. In SODA, pages 548-559. SIAM, 2020.
  59. Jun Tang, Aleksandra Korolova, Xiaolong Bai, Xueqiang Wang, and XiaoFeng Wang. Privacy loss in Apple's implementation of differential privacy on MacOS 10.12. arXiv preprint arXiv:1709.02753, 2017.
  60. Jonathan Ullman. Tight lower bounds for locally differentially private selection. Technical Report abs/1802.02638, arXiv, 2018.
  61. Salil Vadhan. The Complexity of Differential Privacy. Unpublished manuscript, 2016.
  62. L. G. Valiant. A theory of the learnable. Commun. ACM, 27(11):1134-1142, November 1984.
  63. Giridhari Venkatadri, Elena Lucherini, Piotr Sapiezyński, and Alan Mislove. Investigating sources of PII used in Facebook's targeted advertising. In Proceedings of the Privacy Enhancing Technologies Symposium (PETS'19), July 2019.
  64. Rui Wang, Yong Fuga Li, XiaoFeng Wang, Haixu Tang, and Xiaoyong Zhou. Learning your identity and disease from research papers: information leaks in genome wide association study. In Proceedings of the 16th ACM Conference on Computer and Communications Security, pages 534-544. ACM, 2009.
  65. Stanley L. Warner. Randomized response: A survey technique for eliminating evasive answer bias. Journal of the American Statistical Association, 60(309):63-69, 1965.
  66. Fei Yu and Zhanglong Ji. Scalable privacy-preserving data sharing methodology for genome-wide association studies: an application to iDASH healthcare privacy protection challenge. BMC Medical Informatics and Decision Making, 14(1):S3, 2014.