Beating Fredman-Komlós for Perfect k-Hashing

Authors: Venkatesan Guruswami, Andrii Riazanov



File

LIPIcs.ICALP.2019.92.pdf
  • Filesize: 484 kB
  • 14 pages

Document Identifiers
  • DOI: 10.4230/LIPIcs.ICALP.2019.92

Author Details

Venkatesan Guruswami
  • Computer Science Department, Carnegie Mellon University, 5000 Forbes Ave, Pittsburgh, PA, USA, 15213
Andrii Riazanov
  • Computer Science Department, Carnegie Mellon University, 5000 Forbes Ave, Pittsburgh, PA, USA, 15213

Acknowledgements

Some of this work was done when the first author was visiting the School of Physical and Mathematical Sciences, Nanyang Technological University, Singapore and the Center of Mathematical Sciences and Applications, Harvard University.

Cite As

Venkatesan Guruswami and Andrii Riazanov. Beating Fredman-Komlós for Perfect k-Hashing. In 46th International Colloquium on Automata, Languages, and Programming (ICALP 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 132, pp. 92:1-92:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019) https://doi.org/10.4230/LIPIcs.ICALP.2019.92

Abstract

We say a subset C ⊆ {1,2,...,k}^n is a k-hash code (also called k-separated) if for every subset of k codewords from C, there exists a coordinate where all these codewords have distinct values. Understanding the largest possible rate (in bits), defined as (log_2 |C|)/n, of a k-hash code is a classical problem. It arises in two equivalent contexts: (i) the smallest size possible for a perfect hash family that maps a universe of N elements into {1,2,...,k}, and (ii) the zero-error capacity for decoding with lists of size less than k for a certain combinatorial channel.
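As a quick illustration of this definition (not taken from the paper), the following Python sketch checks by brute force over all k-subsets whether a small code is k-separated and reports its rate (log_2 |C|)/n; the code C below is a hypothetical toy example.

```python
from itertools import combinations
from math import log2

def is_k_hash_code(code, k):
    """Return True if every set of k codewords has some coordinate
    on which all k of them take pairwise distinct values."""
    n = len(code[0])
    for subset in combinations(code, k):
        # Look for a separating coordinate for this particular k-subset.
        if not any(len({word[i] for word in subset}) == k for i in range(n)):
            return False
    return True

def rate(code):
    """Rate in bits: (log_2 |C|) / n."""
    return log2(len(code)) / len(code[0])

# Hypothetical toy code over the alphabet {1,2,3} with n = 3.
C = [(1, 2, 3), (2, 3, 1), (3, 1, 2), (1, 1, 1)]
print(is_k_hash_code(C, k=3))  # True: every triple of codewords is separated
print(round(rate(C), 3))       # 0.667 = log2(4)/3
```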
 A general upper bound of k!/k^{k-1} on the rate of a k-hash code (in the limit of large n) was obtained by Fredman and Komlós in 1984 for any k >= 4. While better bounds have been obtained for k=4, their original bound has remained the best known for each k >= 5. In this work, we present a method to obtain the first improvement to the Fredman-Komlós bound for every k >= 5, and we apply this method to give explicit numerical bounds for k=5, 6.
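For concreteness (this computation is not part of the paper), here is a tiny evaluation of the Fredman-Komlós expression k!/k^{k-1} from the abstract for the values of k discussed there; the printed figures are nothing more than numerical values of that formula.

```python
from math import factorial

# Evaluate the Fredman-Komlos expression k!/k^(k-1) for the values
# of k mentioned in the abstract.
for k in (4, 5, 6):
    bound = factorial(k) / k ** (k - 1)
    print(f"k = {k}: k!/k^(k-1) = {bound:.4f}")
# k = 4: 24/64   = 0.3750
# k = 5: 120/625 = 0.1920
# k = 6: 720/7776 = 0.0926
```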

Subject Classification

ACM Subject Classification
  • Theory of computation → Error-correcting codes
  • Mathematics of computing → Coding theory
Keywords
  • Coding theory
  • perfect hashing
  • hash family
  • graph entropy
  • zero-error information theory

References

  1. E. Arikan. A Bound on the Zero-Error List Coding Capacity. In Proceedings. IEEE International Symposium on Information Theory, pages 152-152, 1993. URL: http://dx.doi.org/10.1109/ISIT.1993.748467.
  2. E. Arikan. An upper bound on the zero-error list-coding capacity. IEEE Transactions on Information Theory, 40(4):1237-1240, 1994. URL: http://dx.doi.org/10.1109/18.335947.
  3. M. Dalai, V. Guruswami, and J. Radhakrishnan. An improved bound on the zero-error list-decoding capacity of the 4/3 channel. In 2017 IEEE International Symposium on Information Theory (ISIT), pages 1658-1662, 2017. URL: http://dx.doi.org/10.1109/ISIT.2017.8006811.
  4. Peter Elias. Zero error capacity under list decoding. IEEE Trans. Information Theory, 34(5):1070-1074, 1988. URL: http://dx.doi.org/10.1109/18.21233.
  5. Michael L. Fredman and János Komlós. On the Size of Separating Systems and Families of Perfect Hash Functions. SIAM Journal on Algebraic Discrete Methods, 5(1):61-68, 1984. URL: http://dx.doi.org/10.1137/0605009.
  6. G. Hansel. Nombre minimal de contacts de fermature nécessaires pour réaliser une fonction booléenne symétrique de n variables. C. R. Acad. Sci. Paris, pages 6037-6040, 1964.
  7. Wolfram Research, Inc. Mathematica, Version 11.3. Champaign, IL, 2018.
  8. J. Körner. Coding of an information source having ambiguous alphabet and the entropy of graphs. 6th Prague Conference on Information Theory, pages 411-425, 1973.
  9. J. Körner and K. Marton. New Bounds for Perfect Hashing via Information Theory. European Journal of Combinatorics, 9(6):523-530, 1988. URL: http://dx.doi.org/10.1016/s0195-6698(88)80048-9.
  10. János Körner. Fredman–Komlós bounds and information theory. SIAM Journal on Algebraic Discrete Methods, 7(4):560-570, 1986. URL: http://dx.doi.org/10.1137/0607062.
  11. A. Nilli. Perfect Hashing and Probability. Combinatorics, Probability and Computing, 3(03):407-409, 1994. URL: http://dx.doi.org/10.1017/s0963548300001280.
  12. Jaikumar Radhakrishnan. Entropy and Counting, 2001. URL: http://www.tcs.tifr.res.in/~jaikumar/Papers/EntropyAndCounting.pdf.