Hyperbolic Concentration, Anti-Concentration, and Discrepancy

Authors Zhao Song, Ruizhe Zhang




File
  • LIPIcs.APPROX-RANDOM.2022.10.pdf (0.79 MB, 19 pages)

Author Details

Zhao Song
  • Adobe Research, Seattle, WA, USA
Ruizhe Zhang
  • The University of Texas at Austin, TX, USA

Acknowledgements

We thank the anonymous reviewers for their helpful comments. We would like to thank Petter Brändén and James Renegar for many useful discussions about the literature on hyperbolic polynomials; Yin Tat Lee, James Renegar, and Scott Aaronson for encouraging us to work on this topic; and Dana Moshkovitz for comments on the draft.

Cite As

Zhao Song and Ruizhe Zhang. Hyperbolic Concentration, Anti-Concentration, and Discrepancy. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 245, pp. 10:1-10:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022) https://doi.org/10.4230/LIPIcs.APPROX/RANDOM.2022.10

Abstract

The Chernoff bound is a fundamental tool in theoretical computer science, used extensively in randomized algorithm design and stochastic analysis. Discrepancy theory, which deals with finding a bi-coloring of a set system such that each set is colored as evenly as possible, has numerous applications in the design of approximation algorithms. The Chernoff bound [Che52] implies that a random bi-coloring of any set system with n sets and n elements has discrepancy O(√{n log n}) with high probability, while the famous result of Spencer [Spe85] shows that there always exists a coloring with discrepancy O(√n).
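As a concrete illustration of that gap (not taken from the paper), here is a minimal Python sketch that draws a random set system with n sets over n elements and measures the discrepancy of a uniformly random ±1 coloring; the helper name and the density parameter p are our own choices.

```python
import numpy as np

def random_coloring_discrepancy(n: int, p: float = 0.5, seed: int = 0) -> int:
    """Discrepancy of a uniformly random +/-1 coloring of a random set system.

    A is the n x n incidence matrix (A[i, j] = 1 iff element j belongs to set i);
    the discrepancy of a coloring x in {-1, +1}^n is max_i |sum_{j in S_i} x_j|,
    i.e. the infinity norm of A @ x.
    """
    rng = np.random.default_rng(seed)
    A = (rng.random((n, n)) < p).astype(int)   # random set system
    x = rng.choice([-1, 1], size=n)            # uniformly random bi-coloring
    return int(np.max(np.abs(A @ x)))

if __name__ == "__main__":
    n = 4096
    disc = random_coloring_discrepancy(n)
    # Hoeffding/Chernoff plus a union bound over the n sets give
    # disc = O(sqrt(n log n)) with high probability, whereas Spencer's
    # theorem guarantees that SOME coloring achieves O(sqrt(n)).
    print(f"n = {n}, random coloring discrepancy = {disc}")
    print(f"sqrt(n log n) ~ {np.sqrt(n * np.log(n)):.1f}, sqrt(n) = {np.sqrt(n):.1f}")
```

For a run of this size one would expect the reported value to sit near the √(n log n) scale rather than near √n, which is exactly the gap that Spencer's theorem (and its algorithmic successors) closes.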
The study of hyperbolic polynomials dates back to the early 20th century, when Gårding used them in the study of partial differential equations [Går59]. In recent years, further applications have been found in control theory, optimization, real algebraic geometry, and beyond. In particular, the breakthrough result of Marcus, Spielman, and Srivastava [MSS15] uses the theory of hyperbolic polynomials to prove the Kadison-Singer conjecture [KS59], which is closely related to discrepancy theory.
In this paper, we present a list of new results for hyperbolic polynomials:  
- We show two nearly optimal hyperbolic Chernoff bounds: one for Rademacher sums of arbitrary vectors and another for random vectors in the hyperbolic cone. 
- We show a hyperbolic anti-concentration bound. 
- We generalize the hyperbolic Kadison-Singer theorem [Brä18] for vectors in sub-isotropic position, and prove a hyperbolic Spencer theorem for any constant hyperbolic rank vectors. 
The classical matrix Chernoff and discrepancy results are based on the determinant polynomial, which is a special case of a hyperbolic polynomial (see the definitions sketched below). To the best of our knowledge, this is the first work to show either concentration or anti-concentration results for hyperbolic polynomials. We hope our findings provide more insight into hyperbolic polynomials and discrepancy theory.
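For readers outside this literature, the following brief LaTeX sketch records the standard definitions (Gårding's, not restated in the abstract above) of a hyperbolic polynomial and its hyperbolicity cone, together with the determinant specialization referred to in the last paragraph.

```latex
% A homogeneous polynomial p \in \mathbb{R}[x_1,\dots,x_m] of degree d is
% hyperbolic with respect to a direction e \in \mathbb{R}^m with p(e) \neq 0
% if every restriction to a line in direction e has only real roots:
\[
  t \;\mapsto\; p(t e - x) \ \text{has only real roots}\
  \lambda_1(x) \ge \cdots \ge \lambda_d(x)
  \qquad \text{for every } x \in \mathbb{R}^m .
\]
% The (closed) hyperbolicity cone collects the points whose "hyperbolic
% eigenvalues" are all nonnegative:
\[
  \Lambda_+(p, e) \;=\; \{\, x \in \mathbb{R}^m : \lambda_d(x) \ge 0 \,\}.
\]
% Determinant specialization: take p(X) = \det(X) on n x n symmetric matrices
% and e = I. Then the \lambda_i(X) are the ordinary eigenvalues and
% \Lambda_+(\det, I) is the positive semidefinite cone, so matrix Chernoff and
% matrix discrepancy statements correspond to the case p = \det.
```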

Subject Classification

ACM Subject Classification
  • Theory of computation → Randomness, geometry and discrete structures
Keywords
  • Hyperbolic polynomial
  • Chernoff bound
  • Concentration
  • Discrepancy theory
  • Anti-concentration


References

  1. Emmanuel Abbe, Amir Shpilka, and Avi Wigderson. Reed-Muller codes for random erasures and errors. IEEE Transactions on Information Theory, 61(10):5229-5252, 2015. Google Scholar
  2. Radosław Adamczak, Rafał Latała, and Rafał Meller. Moments of Gaussian chaoses in Banach spaces. Electronic Journal of Probability, 26:1-36, 2021. Google Scholar
  3. Radosław Adamczak and Paweł Wolff. Concentration inequalities for non-Lipschitz functions with bounded derivatives of higher order. Probability Theory and Related Fields, 162(3):531-586, 2015. Google Scholar
  4. Radosław Adamczak and Rafał Latała. Tail and moment estimates for chaoses generated by symmetric random variables with logarithmically concave tails. Annales de l'Institut Henri Poincaré, Probabilités et Statistiques, 48(4):1103-1136, 2012. URL: https://doi.org/10.1214/11-AIHP441.
  5. Rudolf Ahlswede and Andreas Winter. Strong converse for identification via quantum channels. IEEE Transactions on Information Theory, 48(3):569-579, 2002. Google Scholar
  6. Kasra Alishahi and Milad Barzegar. Paving property for real stable polynomials and strongly Rayleigh processes. arXiv preprint, 2020. URL: http://arxiv.org/abs/2006.13923.
  7. Nima Amini. Spectrahedrality of hyperbolicity cones of multivariate matching polynomials. Journal of Algebraic Combinatorics, 50(2):165-190, 2019. Google Scholar
  8. Nima Anari and Shayan Oveis Gharan. The Kadison-Singer problem for strongly Rayleigh measures and applications to asymmetric TSP. arXiv preprint, 2014. URL: http://arxiv.org/abs/1412.1143.
  9. Nima Anari and Shayan Oveis Gharan. Effective-resistance-reducing flows, spectrally thin trees, and asymmetric TSP. In 2015 IEEE 56th Annual Symposium on Foundations of Computer Science (FOCS), pages 20-39. IEEE, 2015. Google Scholar
  10. Richard Aoun, Marwa Banna, and Pierre Youssef. Matrix Poincaré inequalities and concentration, 2020. In Advances in Mathematics, volume 371. URL: http://arxiv.org/abs/1910.13797.
  11. Srinivasan Arunachalam and Penghui Yao. Positive spectrahedrons: Geometric properties, invariance principles and pseudorandom generators. arXiv preprint, 2021. URL: http://arxiv.org/abs/2101.08141.
  12. Nikhil Bansal. Constructive algorithms for discrepancy minimization. In 2010 IEEE 51st Annual Symposium on Foundations of Computer Science, pages 3-10. IEEE, 2010. Google Scholar
  13. Nikhil Bansal, Daniel Dadush, Shashwat Garg, and Shachar Lovett. The Gram-Schmidt walk: a cure for the Banaszczyk blues. In Proceedings of the 50th Annual ACM SIGACT Symposium on Theory of Computing, pages 587-597, 2018. Google Scholar
  14. Heinz H Bauschke, Osman Güler, Adrian S Lewis, and Hristo S Sendov. Hyperbolic polynomials and convex analysis. Canadian Journal of Mathematics, 53(3):470-488, 2001. Google Scholar
  15. Sergei Bernstein. On a modification of Chebyshev’s inequality and of the error formula of Laplace. Ann. Sci. Inst. Sav. Ukraine, Sect. Math, 1(4):38-49, 1924. Google Scholar
  16. Aditya Bhaskara, Moses Charikar, Ankur Moitra, and Aravindan Vijayaraghavan. Smoothed analysis of tensor decompositions. In Proceedings of the forty-sixth annual ACM symposium on Theory of computing, pages 594-603, 2014. Google Scholar
  17. Petter Brändén. Hyperbolicity cones of elementary symmetric polynomials are spectrahedral. Optimization Letters, 8(5):1773-1782, 2014. Google Scholar
  18. Petter Brändén. Hyperbolic polynomials and the Kadison-Singer problem. arXiv preprint, 2018. URL: http://arxiv.org/abs/1809.03255.
  19. Peter G Casazza and Janet C Tremain. Consequences of the Marcus/Spielman/Srivastava solution of the Kadison-Singer problem. In New Trends in Applied Harmonic Analysis, pages 191-213. Springer, 2016. Google Scholar
  20. Herman Chernoff. A measure of asymptotic efficiency for tests of a hypothesis based on the sum of observations. The Annals of Mathematical Statistics, pages 493-507, 1952. Google Scholar
  21. Michael Cohen. Improved spectral sparsification and Kadison-Singer for sums of higher-rank matrices. In Banff International Research Station for Mathematical Innovation and Discovery. https://open.library.ubc.ca/cIRcle/collections/48630/items/1.0340957, 2016.
  22. Kevin P Costello, Terence Tao, and Van Vu. Random symmetric matrices are almost surely nonsingular. Duke Mathematical Journal, 135(2):395-413, 2006. Google Scholar
  23. Daniel Dadush, Haotian Jiang, and Victor Reis. A new framework for matrix discrepancy: Partial coloring bounds via mirror descent. arXiv preprint, 2021. URL: http://arxiv.org/abs/2111.03171.
  24. Daniel Dadush, Aleksandar Nikolov, Kunal Talwar, and Nicole Tomczak-Jaegermann. Balancing vectors in any norm. In 2018 IEEE 59th Annual Symposium on Foundations of Computer Science (FOCS), pages 1-10. IEEE, 2018. Google Scholar
  25. Ronen Eldan and Mohit Singh. Efficient algorithms for discrepancy minimization in convex sets. Random Structures & Algorithms, 53(2):289-307, 2018. Google Scholar
  26. Paul Erdös. On a lemma of Littlewood and Offord. Bulletin of the American Mathematical Society, 51(12):898-902, 1945. Google Scholar
  27. Lars Gårding. Linear hyperbolic partial differential equations with constant coefficients. Acta Mathematica, 85:1-62, 1951. Google Scholar
  28. Lars Gårding. An inequality for hyperbolic polynomials. Journal of Mathematics and Mechanics, pages 957-965, 1959. Google Scholar
  29. Ankit Garg, Yin-Tat Lee, Zhao Song, and Nikhil Srivastava. A matrix expander Chernoff bound, 2018. In STOC. URL: http://arxiv.org/abs/1704.03864.
  30. Shayan Oveis Gharan. Proof of Kadison-Singer conjecture and the extensions, 2015. Google Scholar
  31. Sidney Golden. Lower bounds for the Helmholtz function. Physical Review, 137(4B):B1127, 1965. Google Scholar
  32. Osman Güler. Hyperbolic polynomials and interior point methods for convex programming. Mathematics of Operations Research, 22(2):350-377, 1997. Google Scholar
  33. Leonid Gurvits. Combinatorics hidden in hyperbolic polynomials and related topics. arXiv preprint, 2004. URL: http://arxiv.org/abs/math/0402088.
  34. Leonid Gurvits. Hyperbolic polynomials approach to Van der Waerden/Schrijver-Valiant like conjectures: sharper bounds, simpler proofs and algorithmic applications. In Proceedings of the thirty-eighth annual ACM symposium on Theory of computing, pages 417-426, 2006. Google Scholar
  35. Leonid Gurvits. Van der Waerden/Schrijver-Valiant like conjectures and stable (aka hyperbolic) homogeneous polynomials: one theorem for all. arXiv preprint, 2007. URL: http://arxiv.org/abs/0711.3496.
  36. J William Helton and Victor Vinnikov. Linear matrix inequality representation of sets. Communications on Pure and Applied Mathematics: A Journal Issued by the Courant Institute of Mathematical Sciences, 60(5):654-674, 2007. Google Scholar
  37. Wassily Hoeffding. Probability inequalities for sums of bounded random variables. In The collected works of Wassily Hoeffding, pages 409-426. Springer, 1994. Google Scholar
  38. Samuel B Hopkins, Prasad Raghavendra, and Abhishek Shetty. Matrix discrepancy from quantum communication. arXiv preprint, 2021. URL: http://arxiv.org/abs/2110.10099.
  39. Lars Hörmander. The analysis of linear partial differential operators II. Grundlehren, 257, 1983. Google Scholar
  40. He Jia, Aditi Laddha, Yin Tat Lee, and Santosh S Vempala. Reducing isotropy and volume to KLS: An O^*(n³ψ²) volume algorithm. arXiv preprint, 2020. URL: http://arxiv.org/abs/2008.02146.
  41. Richard V Kadison and Isadore M Singer. Extensions of pure states. American journal of mathematics, 81(2):383-400, 1959. Google Scholar
  42. N.V. Krylov. On the general notion of fully nonlinear second-order elliptic equations. Transactions of the American Mathematical Society, 347(3):857-895, 1995. Google Scholar
  43. Mario Kummer, Daniel Plaumann, and Cynthia Vinzant. Hyperbolic polynomials, interlacers, and sums of squares. Mathematical Programming, 153(1):223-245, 2015. Google Scholar
  44. Rasmus Kyng, Kyle Luh, and Zhao Song. Four deviations suffice for rank 1 matrices, 2020. In Advances in Mathematics. URL: http://arxiv.org/abs/1901.06731.
  45. Rasmus Kyng and Zhao Song. A matrix Chernoff bound for strongly Rayleigh distributions and spectral sparsifiers from a few random spanning trees, 2018. In FOCS. URL: http://arxiv.org/abs/1810.08345.
  46. Rafał Latała. Estimates of moments and tails of Gaussian chaoses. The Annals of Probability, 34(6):2315-2331, 2006. Google Scholar
  47. Lap Chi Lau and Hong Zhou. A spectral approach to network design. In Proceedings of the 52nd Annual ACM SIGACT Symposium on Theory of Computing (STOC), pages 826-839, 2020. Google Scholar
  48. Peter D Lax. Differential equations, difference equations and matrix theory. Technical report, New York Univ., New York. Atomic Energy Commission Computing and Applied, 1957. Google Scholar
  49. Michel Ledoux and Michel Talagrand. Probability in Banach Spaces: isoperimetry and processes. Springer Science & Business Media, 2013. Google Scholar
  50. Joseph Lehec. Moments of the Gaussian chaos. In Séminaire de Probabilités XLIII, pages 327-340. Springer, 2011. Google Scholar
  51. Avi Levy, Harishchandra Ramadas, and Thomas Rothvoss. Deterministic discrepancy minimization via the multiplicative weight update method. In International Conference on Integer Programming and Combinatorial Optimization, pages 380-391. Springer, 2017. Google Scholar
  52. Adrian Lewis, Pablo Parrilo, and Motakuri Ramana. The Lax conjecture is true. Proceedings of the American Mathematical Society, 133(9):2495-2499, 2005. Google Scholar
  53. John Edensor Littlewood and Albert Cyril Offord. On the number of real roots of a random algebraic equation (iii). Rec. Math. [Mat. Sbornik] N.S., 12(3):277-286, 1943. Google Scholar
  54. Shachar Lovett and Raghu Meka. Constructive discrepancy minimization by walking on the edges. SIAM Journal on Computing, 44(5):1573-1582, 2015. Google Scholar
  55. Lester Mackey, Michael I Jordan, Richard Y Chen, Brendan Farrell, and Joel A Tropp. Matrix concentration inequalities via the method of exchangeable pairs. The Annals of Probability, 42(3):906-945, 2014. Google Scholar
  56. Adam W Marcus, Daniel A Spielman, and Nikhil Srivastava. Interlacing families II: Mixed characteristic polynomials and the Kadison-Singer problem, 2015. URL: http://arxiv.org/abs/1306.3969.
  57. Adam W Marcus, Daniel A Spielman, and Nikhil Srivastava. Interlacing families IV: Bipartite Ramanujan graphs of all sizes. SIAM Journal on Computing, 47(6):2488-2509, 2018. Google Scholar
  58. Adam W Marcus and Nikhil Srivastava. The solution of the Kadison-Singer problem, 2016. In Current Developments in Mathematics. URL: http://arxiv.org/abs/1712.08874.
  59. Jiří Matoušek. Geometric discrepancy: An illustrated guide, volume 18. Springer Science & Business Media, 2009. Google Scholar
  60. Raghu Meka. Discrepancy and beating the union bound. In Windows on theory, a research blog. https://windowsontheory.org/2014/02/07/discrepancy-and-beating-the-union-bound/, 2014.
  61. Raghu Meka, Oanh Nguyen, and Van Vu. Anti-concentration for polynomials of independent random variables. In Theory Of Computing. arXiv preprint, 2017. URL: http://arxiv.org/abs/1507.00829.
  62. Raghu Meka and David Zuckerman. Pseudorandom generators for polynomial threshold functions. SIAM Journal on Computing, 42(3):1275-1301, 2013. Google Scholar
  63. Stanislav Minsker. On some extensions of Bernstein’s inequality for self-adjoint operators. Statistics & Probability Letters, 127:111-119, 2017. Google Scholar
  64. Tor Myklebust and Levent Tunçel. Interior-point algorithms for convex optimization based on primal-dual metrics. arXiv preprint, 2014. URL: http://arxiv.org/abs/1411.2129.
  65. Simone Naldi and Daniel Plaumann. Symbolic computation in hyperbolic programming. Journal of Algebra and Its Applications, 17(10):1850192, 2018. Google Scholar
  66. Assaf Naor, Shravas Rao, and Oded Regev. Concentration of Markov chains with bounded moments. In Annales de l'Institut Henri Poincaré, Probabilités et Statistiques, volume 56, pages 2270-2280. Institut Henri Poincaré, 2020. Google Scholar
  67. Ryan O'Donnell, Rocco A Servedio, and Li-Yang Tan. Fooling polytopes. In Proceedings of the 51st Annual ACM SIGACT Symposium on Theory of Computing, pages 614-625, 2019. Google Scholar
  68. Roberto Imbuzeiro Oliveira. Concentration of the adjacency matrix and of the Laplacian in random graphs with independent edges. arXiv preprint, 2009. URL: http://arxiv.org/abs/0911.0600.
  69. Prasad Raghavendra, Nick Ryder, Nikhil Srivastava, and Benjamin Weitz. Exponential lower bounds on spectrahedral representations of hyperbolicity cones. In Proceedings of the Thirtieth Annual ACM-SIAM Symposium on Discrete Algorithms, pages 2322-2332. SIAM, 2019. Google Scholar
  70. Victor Reis and Thomas Rothvoss. Vector balancing in Lebesgue spaces. arXiv preprint, 2020. URL: http://arxiv.org/abs/2007.05634.
  71. James Renegar. Hyperbolic programs, and their derivative relaxations. Foundations of Computational Mathematics, 6(1):59-79, 2006. Google Scholar
  72. James Renegar. “Efficient” subgradient methods for general convex optimization. SIAM Journal on Optimization, 26(4):2649-2676, 2016. Google Scholar
  73. James Renegar. Accelerated first-order methods for hyperbolic programming. Mathematical Programming, 173(1-2):1-35, 2019. Google Scholar
  74. James Renegar. Personal communication, 2019. Google Scholar
  75. James Renegar and Mutiara Sondjaja. A polynomial-time affine-scaling method for semidefinite and hyperbolic programming. arXiv preprint, 2014. URL: http://arxiv.org/abs/1410.6734.
  76. Thomas Rothvoss. Constructive discrepancy minimization for convex sets. SIAM Journal on Computing, 46(1):224-234, 2017. Google Scholar
  77. Mark Rudelson. Random vectors in the isotropic position. Journal of Functional Analysis, 164(1):60-72, 1999. Google Scholar
  78. Mark Rudelson and Roman Vershynin. Sampling from large matrices: An approach through geometric functional analysis. Journal of the ACM (JACM), 54(4), 2007. Google Scholar
  79. Mark Rudelson and Roman Vershynin. Hanson-Wright inequality and sub-Gaussian concentration. Electronic Communications in Probability, 18, 2013. Google Scholar
  80. James Saunderson. A spectrahedral representation of the first derivative relaxation of the positive semidefinite cone. Optimization Letters, 12(7):1475-1486, 2018. Google Scholar
  81. James Saunderson. Certifying polynomial nonnegativity via hyperbolic optimization. SIAM Journal on Applied Algebra and Geometry, 3(4):661-690, 2019. Google Scholar
  82. Zhao Song and Ruizhe Zhang. Hyperbolic concentration, anti-concentration, and discrepancy, 2020. URL: http://arxiv.org/abs/2008.09593.
  83. Joel Spencer. Six standard deviations suffice. Transactions of the American Mathematical Society, 289(2):679-706, 1985. Google Scholar
  84. Colin J Thompson. Inequality with applications in statistical mechanics. Journal of Mathematical Physics, 6(11):1812-1813, 1965. Google Scholar
  85. Joel A Tropp. User-friendly tail bounds for sums of random matrices. Foundations of computational mathematics, 12(4):389-434, 2012. Google Scholar
  86. Joel A Tropp. An introduction to matrix concentration inequalities. Foundations and Trends in Machine Learning, 8(1-2):1-230, 2015. Google Scholar
  87. Joel A Tropp. Second-order matrix concentration inequalities. Applied and Computational Harmonic Analysis, 44(3):700-736, 2018. Google Scholar
  88. Roman Vershynin. Concentration inequalities for random tensors. Bernoulli, 26(4):3139-3162, 2020. Google Scholar
  89. Ruizhe Zhang and Xinzhi Zhang. A real stable generalization of Anari, Oveis Gharan and Kyng, Luh, Song. manuscript, 2021. Google Scholar