Renyi Entropy Estimation Revisited

Authors Maciej Obremski, Maciej Skorski





Cite As

Maciej Obremski and Maciej Skorski. Renyi Entropy Estimation Revisited. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 81, pp. 20:1-20:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017) https://doi.org/10.4230/LIPIcs.APPROX-RANDOM.2017.20

Abstract

We revisit the problem of estimating the entropy of discrete distributions from independent samples, studied recently by Acharya, Orlitsky, Suresh, and Tyagi (SODA 2015), and improve their upper and lower bounds on the necessary sample size n. For estimating the Rényi entropy of order α, up to constant accuracy and error probability, we show the following:

* Upper bounds n = O(1) · 2^{(1-1/α)H_α} for integer α > 1, as the worst case over distributions with Rényi entropy equal to H_α.

* Lower bounds n = Ω(1) · K^{1-1/α} for any real α > 1, with the constant being an inverse polynomial in the accuracy, as the worst case over all distributions on K elements.

Our upper bounds essentially replace the alphabet size by a factor exponential in the entropy, which offers improvements especially in the low- and medium-entropy regimes (of interest, for example, in anomaly detection). As for the lower bounds, our proof explicitly shows how the complexity depends on both the alphabet size and the accuracy, partially solving the open problem posed in previous works. The argument for the upper bounds derives a clean identity for the variance of the falling-power sum of a multinomial distribution. Our approach to the lower bounds uses convex optimization to search for a distribution that is as hard to estimate as possible, and may be of independent interest as a tool for applying Le Cam's two-point method.
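
To make both techniques concrete, here are two minimal Python sketches. They illustrate the standard constructions the abstract alludes to, not the paper's exact estimator or hard instances; all function names and numerical constants (alphabet sizes, sample sizes, the log guard) are ours.

The first sketch is the classical bias-corrected estimator for integer α, built on the identity that the variance computation concerns: for multinomial counts N_i out of n samples, E[N_i^{(α)}] = n^{(α)} · p_i^α, where x^{(α)} = x(x-1)···(x-α+1) is the falling power.

```python
import numpy as np

def falling_power(x, k):
    """Elementwise falling power x^{(k)} = x(x-1)...(x-k+1)."""
    out = np.ones_like(x, dtype=float)
    for j in range(k):
        out = out * (x - j)
    return out

def renyi_entropy_estimate(samples, alpha):
    """Bias-corrected estimate (in bits) of H_alpha for integer alpha > 1.

    sum_i N_i^{(alpha)} / n^{(alpha)} is an unbiased estimator of the
    power sum p_alpha = sum_i p_i^alpha, and H_alpha = log2(p_alpha)/(1-alpha).
    """
    n = len(samples)
    _, counts = np.unique(samples, return_counts=True)
    n_falling = float(np.prod([n - j for j in range(alpha)]))
    p_alpha_hat = falling_power(counts, alpha).sum() / n_falling
    p_alpha_hat = max(p_alpha_hat, 1e-12)  # guard against log(0) on tiny samples
    return np.log2(p_alpha_hat) / (1 - alpha)

# Collision entropy (alpha = 2) of the uniform distribution on 64 symbols:
rng = np.random.default_rng(0)
samples = rng.integers(0, 64, size=2000)
print(renyi_entropy_estimate(samples, alpha=2))  # close to log2(64) = 6.0
```

The second sketch shows the flavor of a two-point argument for α = 2 (a generic construction, not the convex-optimization pair from the paper): planting a single symbol of mass about K^{-1/2} in the uniform distribution on K elements shifts the collision entropy by a constant number of bits, yet the Bhattacharyya coefficient shows the two distributions remain statistically close until n reaches roughly √K = K^{1-1/α}.

```python
import numpy as np

def renyi(p, alpha):
    """Renyi entropy (bits) of a probability vector p, alpha != 1."""
    return np.log2((p ** alpha).sum()) / (1 - alpha)

def tv_product_bound(p, q, n):
    """Upper bound on TV(P^n, Q^n): the Bhattacharyya coefficient
    tensorizes, BC(P^n, Q^n) = BC(P, Q)^n, and TV <= sqrt(1 - BC^2)."""
    bc = np.sqrt(p * q).sum()
    return np.sqrt(1.0 - bc ** (2 * n))

K = 1_000_000
eps = 2 / np.sqrt(K)                     # one planted symbol of mass ~ K^{-1/2}
p = np.full(K, 1.0 / K)
q = np.full(K, (1.0 - eps) / (K - 1))
q[0] = eps

print(renyi(p, 2) - renyi(q, 2))         # entropy gap of about 2.3 bits
for n in (50, 100, 1000):
    print(n, tv_product_bound(p, q, n))  # < 1/2 for small n; nears 1 as n approaches sqrt(K)
```

Whenever TV(P^n, Q^n) < 1/2, no test can reliably tell P from Q with n samples, so any estimator with accuracy better than half the entropy gap must fail on one of them; this is how a two-point argument forces n = Ω(K^{1-1/α}).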

Keywords
  • Renyi entropy
  • entropy estimation
  • sample complexity
  • convex optimization


References

  1. Jayadev Acharya, Alon Orlitsky, Ananda Theertha Suresh, and Himanshu Tyagi. The complexity of estimating Rényi entropy. In Proceedings of the Twenty-Sixth Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2015, San Diego, CA, USA, January 4-6, 2015, pages 1855-1869, 2015. URL: http://dx.doi.org/10.1137/1.9781611973730.124.
  2. Jayadev Acharya, Alon Orlitsky, Ananda Theertha Suresh, and Himanshu Tyagi. Estimating Rényi entropy of discrete distributions. IEEE Trans. Information Theory, 63(1):38-56, 2017. URL: http://dx.doi.org/10.1109/TIT.2016.2620435.
  3. Erdal Arikan. An inequality on guessing and its application to sequential decoding. IEEE Trans. Information Theory, 42(1):99-105, 1996. URL: http://dx.doi.org/10.1109/18.481781.
  4. Boaz Barak, Yevgeniy Dodis, Hugo Krawczyk, Olivier Pereira, Krzysztof Pietrzak, François-Xavier Standaert, and Yu Yu. Leftover hash lemma, revisited. In Advances in Cryptology - CRYPTO 2011 - 31st Annual Cryptology Conference, Santa Barbara, CA, USA, August 14-18, 2011. Proceedings, pages 1-20, 2011. URL: http://dx.doi.org/10.1007/978-3-642-22792-9_1.
  5. Tugkan Batu, Sanjoy Dasgupta, Ravi Kumar, and Ronitt Rubinfeld. The complexity of approximating entropy. In Proceedings on 34th Annual ACM Symposium on Theory of Computing, May 19-21, 2002, Montréal, Québec, Canada, pages 678-687, 2002. URL: http://dx.doi.org/10.1145/509907.510005.
  6. Tugkan Batu, Lance Fortnow, Ronitt Rubinfeld, Warren D. Smith, and Patrick White. Testing closeness of discrete distributions. J. ACM, 60(1):4:1-4:25, 2013. URL: http://dx.doi.org/10.1145/2432622.2432626.
  7. Charles H. Bennett, Gilles Brassard, Claude Crépeau, and Ueli M. Maurer. Generalized privacy amplification. IEEE Trans. Information Theory, 41(6):1915-1923, 1995. URL: http://dx.doi.org/10.1109/18.476316.
  8. Yevgeniy Dodis and Yu Yu. Overcoming weak expectations. In Theory of Cryptography - 10th Theory of Cryptography Conference, TCC 2013, Tokyo, Japan, March 3-6, 2013. Proceedings, pages 1-22, 2013. URL: http://dx.doi.org/10.1007/978-3-642-36594-2_1.
  9. Oded Goldreich and Dana Ron. On testing expansion in bounded-degree graphs. In Studies in Complexity and Cryptography. Miscellanea on the Interplay between Randomness and Computation - In Collaboration with Lidor Avigad, Mihir Bellare, Zvika Brakerski, Shafi Goldwasser, Shai Halevi, Tali Kaufman, Leonid Levin, Noam Nisan, Dana Ron, Madhu Sudan, Luca Trevisan, Salil Vadhan, Avi Wigderson, David Zuckerman, pages 68-75. 2011. URL: http://dx.doi.org/10.1007/978-3-642-22670-0_9.
  10. Manjesh Kumar Hanawal and Rajesh Sundaresan. Guessing revisited: A large deviations approach. IEEE Trans. Information Theory, 57(1):70-78, 2011. URL: http://dx.doi.org/10.1109/TIT.2010.2090221.
  11. Russell Impagliazzo and David Zuckerman. How to recycle random bits. In 30th Annual Symposium on Foundations of Computer Science, Research Triangle Park, North Carolina, USA, 30 October - 1 November 1989, pages 248-253, 1989. URL: http://dx.doi.org/10.1109/SFCS.1989.63486.
  12. R. Jenssen, K. E. Hild, D. Erdogmus, J. C. Principe, and T. Eltoft. Clustering using Rényi's entropy. In Proceedings of the International Joint Conference on Neural Networks 2003, volume 1, pages 523-528, July 2003. URL: http://dx.doi.org/10.1109/IJCNN.2003.1223401.
  13. Petr Jizba, Hagen Kleinert, and Mohammad Shefaat. Rényi’s information transfer between financial time series. Physica A: Statistical Mechanics and its Applications, 391(10):2971-2989, 2012. URL: http://dx.doi.org/10.1016/j.physa.2011.12.064.
  14. Donald E. Knuth. The Art of Computer Programming, Volume 3: Sorting and Searching (2nd ed.). Addison Wesley Longman Publishing Co., Inc., Redwood City, CA, USA, 1998.
  15. Ke Li, Wanlei Zhou, Shui Yu, and Bo Dai. Effective ddos attacks detection using generalized entropy metric. In Algorithms and Architectures for Parallel Processing, 9th International Conference, ICA3PP 2009, Taipei, Taiwan, June 8-11, 2009. Proceedings, pages 266-280, 2009. URL: http://dx.doi.org/10.1007/978-3-642-03095-6_27.
  16. Bing Ma, Alfred O. Hero III, John D. Gorman, and Olivier J. J. Michel. Image registration with minimum spanning tree algorithm. In Proceedings of the 2000 International Conference on Image Processing, ICIP 2000, Vancouver, BC, Canada, September 10-13, 2000, pages 481-484, 2000. URL: http://dx.doi.org/10.1109/ICIP.2000.901000.
  17. Yishay Mansour, Mehryar Mohri, and Afshin Rostamizadeh. Multiple source adaptation and the rényi divergence. In UAI 2009, Proceedings of the Twenty-Fifth Conference on Uncertainty in Artificial Intelligence, Montreal, QC, Canada, June 18-21, 2009, pages 367-374, 2009. URL: https://dslpitt.org/uai/displayArticleDetails.jsp?mmnu=1&smnu=2&article_id=1600&proceeding_id=25.
  18. Albert W. Marshall, Ingram Olkin, and Barry C. Arnold. Inequalities: Theory of Majorization and Its Applications. Springer Science+Business Media, LLC, New York, 2011.
  19. Abolfazl S. Motahari, Guy Bresler, and David N. C. Tse. Information theory of DNA shotgun sequencing. IEEE Trans. Information Theory, 59(10):6273-6289, 2013. URL: http://dx.doi.org/10.1109/TIT.2013.2270273.
  20. Huzefa Neemuchwala, Alfred O. Hero III, Sakina Zabuawala, and Paul L. Carson. Image registration methods in high-dimensional space. Int. J. Imaging Systems and Technology, 16(5):130-145, 2006. URL: http://dx.doi.org/10.1002/ima.20079.
  21. Dávid Pál, Barnabás Póczos, and Csaba Szepesvári. Estimation of Rényi entropy and mutual information based on generalized nearest-neighbor graphs. In Proceedings of the 23rd International Conference on Neural Information Processing Systems, NIPS'10, pages 1849-1857, USA, 2010. Curran Associates Inc. URL: http://dl.acm.org/citation.cfm?id=2997046.2997102.
  22. Liam Paninski. Estimation of entropy and mutual information. Neural Comput., 15(6):1191-1253, June 2003. URL: http://dx.doi.org/10.1162/089976603321780272.
  23. Liam Paninski. A coincidence-based test for uniformity given very sparsely sampled discrete data. IEEE Trans. Information Theory, 54(10):4750-4755, 2008. URL: http://dx.doi.org/10.1109/TIT.2008.928987.
  24. C. E. Pfister and W. G. Sullivan. Rényi entropy, guesswork moments, and large deviations. IEEE Trans. Information Theory, 50(11):2794-2800, 2004. URL: http://dx.doi.org/10.1109/TIT.2004.836665.
  25. A. Rényi. On measures of information and entropy. In Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics and Probability, pages 547-561, 1960. URL: http://digitalassets.lib.berkeley.edu/math/ucb/text/math_s4_v1_article-27.pdf.
  26. Prasanna K. Sahoo and Gurdial Arora. A thresholding method based on two-dimensional Rényi's entropy. Pattern Recognition, 37(6):1149-1161, 2004. URL: http://dx.doi.org/10.1016/j.patcog.2003.10.008.
  27. C. E. Shannon. A mathematical theory of communication. SIGMOBILE Mob. Comput. Commun. Rev., 5(1):3-55, January 2001. URL: http://dx.doi.org/10.1145/584091.584093.
  28. Gregory Valiant and Paul Valiant. Estimating the unseen: an n/log(n)-sample estimator for entropy and support size, shown optimal via new clts. In Proceedings of the 43rd ACM Symposium on Theory of Computing, STOC 2011, San Jose, CA, USA, 6-8 June 2011, pages 685-694, 2011. URL: http://dx.doi.org/10.1145/1993636.1993727.
  29. Paul C. van Oorschot and Michael J. Wiener. Parallel collision search with cryptanalytic applications. J. Cryptology, 12(1):1-28, 1999. URL: http://dx.doi.org/10.1007/PL00003816.
  30. Dongxin Xu. Energy, Entropy and Information Potential for Neural Computation. PhD thesis, University of Florida, Gainesville, FL, USA, 1998. AAI9935317.
  31. Dongxin Xu and Deniz Erdogmus. Rényi's Entropy, Divergence and Their Nonparametric Estimators, pages 47-102. Springer New York, New York, NY, 2010. URL: http://dx.doi.org/10.1007/978-1-4419-1570-2_2.