Clément L. Canonne, Abigail Gentle, Vikrant Singhal
Creative Commons Attribution 4.0 International license
We initiate the study of distribution testing under user-level local differential privacy, where each of n users contributes m samples from the unknown underlying distribution. This setting, albeit very natural, is significantly more challenging than the usual locally private setting, as for the same parameter ε the privacy guarantee must now apply to a full batch of m data points. While some recent work considers distribution learning in this user-level setting, nothing was known for even the most fundamental testing task, uniformity testing (and its generalization, identity testing). We address this gap by providing (nearly) sample-optimal user-level LDP algorithms for uniformity and identity testing. Motivated by practical considerations, our main focus is on the private-coin, symmetric setting, which requires users neither to share a common random seed nor to have been assigned a globally unique identifier.
@InProceedings{canonne_et_al:LIPIcs.ITCS.2026.33,
author = {Canonne, Cl\'{e}ment L. and Gentle, Abigail and Singhal, Vikrant},
title = {{Uniformity Testing Under User-Level Local Privacy}},
booktitle = {17th Innovations in Theoretical Computer Science Conference (ITCS 2026)},
pages = {33:1--33:24},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
ISBN = {978-3-95977-410-9},
ISSN = {1868-8969},
year = {2026},
volume = {362},
editor = {Saraf, Shubhangi},
publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
address = {Dagstuhl, Germany},
URL = {https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2026.33},
URN = {urn:nbn:de:0030-drops-253201},
doi = {10.4230/LIPIcs.ITCS.2026.33},
annote = {Keywords: Differential Privacy, Local Differential Privacy, Uniformity Testing, Identity Testing, Hypothesis Testing, User-Level Differential Privacy, Person-Level Differential Privacy}
}