Bias Reduction for Sum Estimation

Authors: Talya Eden, Jakob Bæk Tejs Houen, Shyam Narayanan, Will Rosenbaum, and Jakub Tětek




File

LIPIcs.APPROX-RANDOM.2023.62.pdf
  • Filesize: 0.81 MB
  • 21 pages

Document Identifiers
  • DOI: 10.4230/LIPIcs.APPROX/RANDOM.2023.62

Author Details

Talya Eden
  • Bar-Ilan University, Ramat Gan, Israel
Jakob Bæk Tejs Houen
  • BARC, University of Copenhagen, Denmark
Shyam Narayanan
  • MIT, Cambridge, MA, USA
Will Rosenbaum
  • Amherst College, MA, USA
Jakub Tětek
  • BARC, University of Copenhagen, Denmark

Acknowledgements

We thank the anonymous peer reviewers, whose feedback helped improve our manuscript.

Cite As

Talya Eden, Jakob Bæk Tejs Houen, Shyam Narayanan, Will Rosenbaum, and Jakub Tětek. Bias Reduction for Sum Estimation. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 275, pp. 62:1-62:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023) https://doi.org/10.4230/LIPIcs.APPROX/RANDOM.2023.62

Abstract

In classical statistics and distribution testing, it is often assumed that elements can be sampled exactly from some distribution 𝒫, and that when an element x is sampled, the probability 𝒫(x) of sampling x is also known. In this setting, recent work in distribution testing has shown that many algorithms are robust in the sense that they still produce correct output if the elements are drawn from any distribution 𝒬 that is sufficiently close to 𝒫. This phenomenon raises interesting questions: under what conditions is a "noisy" distribution 𝒬 sufficient, and what is the algorithmic cost of coping with this noise?
In this paper, we investigate these questions for the problem of estimating the sum of a multiset of N real values x_1, …, x_N. This problem is well-studied in the statistical literature in the case 𝒫 = 𝒬, where the Hansen-Hurwitz estimator [Annals of Mathematical Statistics, 1943] is frequently used. We assume that for some (known) distribution 𝒫, values are sampled from a distribution 𝒬 that is pointwise close to 𝒫. That is, there is a parameter γ < 1 such that for all x_i, (1 - γ) 𝒫(i) ≤ 𝒬(i) ≤ (1 + γ) 𝒫(i). For every positive integer k we define an estimator ζ_k for μ = ∑_i x_i whose bias is proportional to γ^k (where our ζ₁ reduces to the classical Hansen-Hurwitz estimator). As a special case, we show that if 𝒬 is pointwise γ-close to uniform and all x_i ∈ {0, 1}, for any ε > 0, we can estimate μ to within additive error ε N using m = Θ(N^{1-1/k}/ε^{2/k}) samples, where k = ⌈lg ε/lg γ⌉. We then show that this sample complexity is essentially optimal. Interestingly, our upper and lower bounds show that the sample complexity need not vary uniformly with the desired error parameter ε: for some values of ε, perturbations in its value have no asymptotic effect on the sample complexity, while for other values, any decrease in its value results in an asymptotically larger sample complexity.
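
To make the setting concrete, the following is a minimal Python sketch of the k = 1 case, i.e. the classical Hansen-Hurwitz estimator evaluated with the nominal probabilities 𝒫 while the samples actually come from a pointwise γ-close distribution 𝒬. The experimental setup (an adversarial 𝒬 that undersamples the 1-valued items, and the particular values of N, γ, ε) is an illustrative assumption, not the paper's construction, and the bias-reduced estimators ζ_k for k ≥ 2 are not reconstructed here; the last lines simply evaluate the sample bound k = ⌈lg ε / lg γ⌉ and m = Θ(N^{1-1/k}/ε^{2/k}) from the abstract.

    import math
    import random

    def hansen_hurwitz_sum(sampled, p):
        """Classical Hansen-Hurwitz estimator (the paper's zeta_1):
        average x_i / p(i) over m indices drawn with replacement.
        `sampled` is a list of (index, value) pairs; p(i) is the *nominal*
        probability of index i under the known distribution P."""
        return sum(xi / p(i) for i, xi in sampled) / len(sampled)

    # Illustrative experiment (hypothetical parameters, not from the paper):
    # N values x_i in {0, 1}, nominal P = uniform, but samples actually come
    # from a Q that is pointwise gamma-close to uniform and undersamples the
    # 1-valued items, so zeta_1 is biased downward by roughly a gamma factor.
    random.seed(0)
    N, gamma = 10_000, 0.1
    x = [random.randint(0, 1) for _ in range(N)]
    true_sum = sum(x)

    weights = [1 - gamma if xi == 1 else 1 + gamma for xi in x]
    total_w = sum(weights)
    q = [w / total_w for w in weights]   # Q(i), approximately gamma-close to 1/N

    m = 2_000                            # matches the bound computed below
    idx = random.choices(range(N), weights=q, k=m)
    est = hansen_hurwitz_sum([(i, x[i]) for i in idx], p=lambda i: 1.0 / N)
    print(f"true sum = {true_sum}, zeta_1 estimate = {est:.1f}")
    # Expect an estimate near (1 - gamma) * true_sum; the paper's zeta_k
    # estimators reduce this bias to O(gamma^k) at the cost of more samples.

    # Sample-complexity bookkeeping from the abstract's {0,1} special case:
    # k = ceil(lg eps / lg gamma), m = Theta(N^(1 - 1/k) / eps^(2/k)).
    eps = 0.05
    k = math.ceil(math.log2(eps) / math.log2(gamma))   # here k = 2
    m_bound = N ** (1 - 1 / k) / eps ** (2 / k)        # here 2000
    print(f"k = {k}, m = Theta({m_bound:.0f}) samples (up to constants)")

Under this adversarial 𝒬 a run should typically report an estimate close to (1 - γ) times the true sum, i.e. an additive error on the order of γN, which is precisely the bias that the higher-order estimators ζ_k are designed to reduce to order γ^k.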

Subject Classification

ACM Subject Classification
  • Mathematics of computing → Probabilistic algorithms
  • Theory of computation → Sample complexity and generalization bounds
  • Theory of computation → Streaming, sublinear and near linear time algorithms
  • Theory of computation → Lower bounds and information complexity
Keywords
  • bias reduction
  • sum estimation
  • sublinear time algorithms
  • sample complexity

References

  1. Petra Berenbrink, Bruce Krayenhoff, and Frederik Mallmann-Trenn. Estimating the number of connected components in sublinear time. Information Processing Letters, 114(11):639-642, 2014.
  2. Lorenzo Beretta and Jakub Tětek. Better sum estimation via weighted sampling. In Proceedings of the 2022 Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pages 2303-2338. SIAM, 2022.
  3. Clément Canonne and Ronitt Rubinfeld. Testing probability distributions underlying aggregated data. In International Colloquium on Automata, Languages, and Programming, pages 283-295. Springer, 2014.
  4. Clément L. Canonne, Themis Gouleakis, and Ronitt Rubinfeld. Sampling correctors. SIAM J. Comput., 47(4):1373-1423, 2018. URL: https://doi.org/10.1137/16M1076666.
  5. Edith Cohen, Nick Duffield, Haim Kaplan, Carsten Lund, and Mikkel Thorup. Algorithms and estimators for summarization of unaggregated data streams. Journal of Computer and System Sciences, 80(7):1214-1244, 2014.
  6. Talya Eden, Piotr Indyk, Shyam Narayanan, Ronitt Rubinfeld, Sandeep Silwal, and Tal Wagner. Learning-based support estimation in sublinear time. In 9th International Conference on Learning Representations, ICLR 2021, Virtual Event, Austria, May 3-7, 2021. OpenReview.net, 2021. URL: https://openreview.net/forum?id=tilovEHA3YS.
  7. Talya Eden, Saleet Mossel, and Ronitt Rubinfeld. Sampling multiple edges efficiently. In Mary Wootters and Laura Sanità, editors, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques, APPROX/RANDOM 2021, August 16-18, 2021, University of Washington, Seattle, Washington, USA (Virtual Conference), volume 207 of LIPIcs, pages 51:1-51:15. Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2021. URL: https://doi.org/10.4230/LIPIcs.APPROX/RANDOM.2021.51.
  8. Talya Eden, Dana Ron, and Will Rosenbaum. The arboricity captures the complexity of sampling edges. In Christel Baier, Ioannis Chatzigiannakis, Paola Flocchini, and Stefano Leonardi, editors, 46th International Colloquium on Automata, Languages, and Programming, ICALP 2019, July 9-12, 2019, Patras, Greece, volume 132 of LIPIcs, pages 52:1-52:14. Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2019. URL: https://doi.org/10.4230/LIPIcs.ICALP.2019.52.
  9. Talya Eden, Dana Ron, and Will Rosenbaum. Almost optimal bounds for sublinear-time sampling of k-cliques in bounded arboricity graphs. In Mikolaj Bojanczyk, Emanuela Merelli, and David P. Woodruff, editors, 49th International Colloquium on Automata, Languages, and Programming, ICALP 2022, July 4-8, 2022, Paris, France, volume 229 of LIPIcs, pages 56:1-56:19. Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2022. URL: https://doi.org/10.4230/LIPIcs.ICALP.2022.56.
  10. Talya Eden and Will Rosenbaum. On sampling edges almost uniformly. In Raimund Seidel, editor, 1st Symposium on Simplicity in Algorithms, SOSA 2018, January 7-10, 2018, New Orleans, LA, USA, volume 61 of OASIcs, pages 7:1-7:9. Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2018. URL: https://doi.org/10.4230/OASIcs.SOSA.2018.7.
  11. Oded Goldreich. Introduction to Property Testing. Cambridge University Press, 2017.
  12. Morris H. Hansen and William N. Hurwitz. On the theory of sampling from finite populations. The Annals of Mathematical Statistics, 14(4):333-362, 1943.
  13. Jonathan Hermon. On sensitivity of uniform mixing times. In Annales de l'Institut Henri Poincaré, Probabilités et Statistiques, volume 54, pages 234-248. Institut Henri Poincaré, 2018.
  14. Jacob Holm and Jakub Tětek. Massively parallel computation and sublinear-time algorithms for embedded planar graphs. arXiv preprint, 2022. URL: https://arxiv.org/abs/2204.09035.
  15. D. G. Horvitz and D. J. Thompson. A generalization of sampling without replacement from a finite universe. Journal of the American Statistical Association, 47(260):663-685, 1952. URL: https://doi.org/10.1080/01621459.1952.10483446.
  16. Ben Morris and Yuval Peres. Evolving sets and mixing. In Lawrence L. Larmore and Michel X. Goemans, editors, Proceedings of the 35th Annual ACM Symposium on Theory of Computing, June 9-11, 2003, San Diego, CA, USA, pages 279-286. ACM, 2003. URL: https://doi.org/10.1145/780542.780585.
  17. Rajeev Motwani, Rina Panigrahy, and Ying Xu. Estimating sum by weighted sampling. In International Colloquium on Automata, Languages, and Programming, pages 53-64. Springer, 2007.
  18. Krzysztof Onak and Xiaorui Sun. Probability-revealing samples. In Amos J. Storkey and Fernando Pérez-Cruz, editors, International Conference on Artificial Intelligence and Statistics, AISTATS 2018, 9-11 April 2018, Playa Blanca, Lanzarote, Canary Islands, Spain, volume 84 of Proceedings of Machine Learning Research, pages 2018-2026. PMLR, 2018. URL: http://proceedings.mlr.press/v84/onak18a.html.
  19. Sofya Raskhodnikova, Dana Ron, Amir Shpilka, and Adam Smith. Strong lower bounds for approximating distribution support size and the distinct elements problem. SIAM Journal on Computing, 39(3):813-842, 2009.
  20. Jakub Tětek and Mikkel Thorup. Edge sampling and graph parameter estimation via vertex neighborhood accesses. In Stefano Leonardi and Anupam Gupta, editors, STOC '22: 54th Annual ACM SIGACT Symposium on Theory of Computing, Rome, Italy, June 20-24, 2022, pages 1116-1129. ACM, 2022. URL: https://doi.org/10.1145/3519935.3520059.