An Almost Optimal Algorithm for Unbounded Search with Noisy Information

Authors: Junhao Gan, Anthony Wirth, Xin Zhang


Author Details

Junhao Gan
  • School of Computing and Information Systems, The University of Melbourne, Australia
Anthony Wirth
  • School of Computing and Information Systems, The University of Melbourne, Australia
Xin Zhang
  • School of Computing and Information Systems, The University of Melbourne, Australia


Acknowledgements

We thank Przemysław Uznański and Dariusz Dereniowski for discussions on noisy search, particularly for highlighting the nuances of the binary and ternary comparison models.

Cite As

Junhao Gan, Anthony Wirth, and Xin Zhang. An Almost Optimal Algorithm for Unbounded Search with Noisy Information. In 18th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 227, pp. 25:1-25:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Abstract

Given a sequence of integers 𝒮 = s₁, s₂, … in ascending order, called the search domain, and an integer t, called the target, the predecessor problem asks for the target index N such that s_N is the largest integer in 𝒮 satisfying s_N ≤ t. We consider solving the predecessor problem with the fewest possible queries to a binary comparison oracle: for each queried index i, the oracle returns whether s_i ≤ t or s_i > t. Specifically, we study the predecessor problem in the UnboundedNoisy setting, where (i) the search domain 𝒮 is unbounded, i.e., n = |𝒮| is unknown or infinite, and (ii) the binary comparison oracle is noisy. We denote the former condition by Unbounded and the latter by Noisy. In Noisy, the oracle independently returns a wrong answer to each query with a fixed constant probability 0 < p < 1/2; in particular, even two queries on the same index i may receive different answers. With a noisy oracle, the goal is to return the correct target index with probability at least 1 − Q, where 0 < Q < 1/2 is the failure probability.

Our first result is an algorithm for Noisy, called NoS, that improves the result of Ben-Or and Hassidim [FOCS 2008] from an expected query-complexity bound to a worst-case bound. We also achieve an expected query-complexity bound whose leading term has an optimal constant factor, matching the lower bound of Ben-Or and Hassidim. Building on NoS, we propose the NoSU algorithm, which correctly solves the predecessor problem in the UnboundedNoisy setting. We prove that the query complexity of NoSU is ∑_{i = 1}^k (log^{(i)} N)/(1 − H(p)) + o(log N) when log Q^{-1} ∈ o(log N), where N is the target index, k = log^* N is the iterated logarithm, log^{(i)} denotes the logarithm iterated i times, and H(p) is the binary entropy function. This improves the previous bound of O(log(N/Q)/(1 − H(p))) by reducing the coefficient of the leading term from a large constant to 1.
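For intuition about the Noisy model, a naive baseline repeats each comparison enough times that a majority vote is reliable, paying a multiplicative Θ(log(log n / Q)) overhead that the 1/(1 − H(p)) rate of NoS avoids. The sketch below is illustrative only: the function names, the repetition count, and the Hoeffding-style constant are ours, not the paper's NoS algorithm (here the domain is bounded, with n known).

```python
import math
import random

def noisy_predecessor(S, t, p=0.1, Q=0.05, rng=None):
    """Return the largest 1-based index i with S[i-1] <= t, assuming
    S is ascending and S[0] <= t, using a comparison oracle that errs
    independently with probability p.  Naive strategy: repeat each
    comparison and take a majority vote so that all ~log2(n) votes are
    simultaneously correct with probability at least 1 - Q."""
    rng = rng or random.Random()
    n = len(S)
    rounds = max(1, math.ceil(math.log2(max(n, 2))))  # binary-search steps
    # Repetitions per comparison, via a Hoeffding bound: each majority
    # vote then fails with probability at most Q / rounds.
    reps = math.ceil(2 * math.log(rounds / Q) / (1 - 2 * p) ** 2)

    def oracle(i):  # noisy answer to "is S[i-1] <= t?"
        truth = S[i - 1] <= t
        return truth if rng.random() >= p else (not truth)

    def majority(i):
        yes = sum(oracle(i) for _ in range(reps))
        return 2 * yes > reps

    lo, hi = 1, n  # invariant: S[lo-1] <= t, target index in [lo, hi]
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if majority(mid):
            lo = mid
        else:
            hi = mid - 1
    return lo
```

With p = 0 this degenerates to an exact binary search; with p > 0 each run succeeds with probability at least 1 − Q, at the cost of `reps` oracle calls per comparison.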
Moreover, we show that the upper bound on the query complexity of NoSU can be further improved to (1 − Q) ∑_{i = 1}^k (log^{(i)} N)/(1 − H(p)) + o(log N) in expectation, reducing the constant in the leading term to 1 − Q. Finally, we prove an information-theoretic lower bound: in UnboundedNoisy, the expected query cost of any algorithm for the predecessor problem is at least (1 − Q)(∑_{i = 1}^k log^{(i)} N − 2k)/(1 − H(p)) − 10. This implies that the constant factor in the leading term of our expected upper bound is indeed optimal.
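On the Unbounded side, the classic doubling strategy of Bentley and Yao [5] brackets the unknown target index N with exponentially growing probes and then binary-searches the bracket, using roughly 2 log₂ N exact queries; NoSU extends this idea to a noisy oracle while driving the leading cost down to the iterated-logarithm sum above. A noiseless sketch under our own naming, not the paper's pseudocode:

```python
def unbounded_predecessor(leq):
    """Predecessor search over an unbounded ascending sequence s_1, s_2, ...
    given an exact oracle leq(i) answering "is s_i <= t?".  Assumes
    leq(1) is True, so the target index N >= 1 exists.  A doubling phase
    brackets N between consecutive powers of two, then a standard binary
    search pins it down, for about 2 * log2(N) queries in total."""
    hi = 1
    while leq(hi):      # doubling: find the first power of two past N
        hi *= 2
    lo = hi // 2        # now s_lo <= t < s_hi, i.e. N is in [lo, hi)
    while lo + 1 < hi:  # invariant: leq(lo) is True, leq(hi) is False
        mid = (lo + hi) // 2
        if leq(mid):
            lo = mid
        else:
            hi = mid
    return lo
```

For example, with the implicit sequence s_i = 2i and target t = 10, the probes 1, 2, 4, 8 bracket N in [4, 8), and the binary search returns N = 5. Swapping `leq` for a majority-voted noisy comparison (with per-vote failure probability tuned to the current bracket size) yields a naive UnboundedNoisy baseline, though far from the leading constant NoSU achieves.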

Subject Classification

ACM Subject Classification
  • Theory of computation → Predecessor queries
  • Theory of computation → Sorting and searching

Keywords
  • Fault-tolerant search
  • Noisy binary search
  • Query complexity


References

  1. Javed A Aslam and Aditi Dhagat. Searching in the presence of linearly bounded errors. In Proc. 23rd ACM Symposium on Theory of Computing, pages 486-493, 1991. Google Scholar
  2. Paul Beame and Faith E Fich. Optimal bounds for the predecessor problem and related problems. Journal of Computer and System Sciences, 65(1):38-72, 2002. Google Scholar
  3. Richard Beigel. Unbounded searching algorithms. SIAM Journal on Computing, 19(3):522-537, 1990. Google Scholar
  4. Michael Ben-Or and Avinatan Hassidim. The Bayesian learner is optimal for noisy binary search (and pretty good for quantum as well). In Proc. 49th IEEE Symposium on Foundations of Computer Science, pages 221-230, 2008. Google Scholar
  5. Jon Louis Bentley and Andrew Chi-Chih Yao. An almost optimal algorithm for unbounded searching. Information Processing Letters, 5(3):82-87, 1976. Google Scholar
  6. Ryan S Borgstrom and S Rao Kosaraju. Comparison-based search in the presence of errors. In Proc. 25th ACM Symposium on Theory of Computing, pages 130-136, 1993. Google Scholar
  7. Ferdinando Cicalese. Fault-Tolerant Search Algorithms: Reliable Computation with Unreliable Information. Springer, 2013. Google Scholar
  8. Sanjoy Dasgupta. Analysis of a greedy active learning strategy. In Advances in Neural Information Processing Systems, volume 17, pages 337-344, 2005. Google Scholar
  9. Christian Deppe. Coding with feedback and searching with lies. In Imre Csiszár, Gyula OH Katona, and Gábor Tardos, editors, Entropy, Search, Complexity, pages 27-70. Springer, 2007. Google Scholar
  10. Dariusz Dereniowski, Aleksander Łukasiewicz, and Przemysław Uznański. Noisy searching: simple, fast and correct. arXiv preprint, 2021.
  11. Dariusz Dereniowski, Stefan Tiegel, Przemysław Uznański, and Daniel Wolleb-Graf. A framework for searching in graphs in the presence of errors. arXiv preprint, 2018.
  12. Dariusz Dereniowski, Stefan Tiegel, Przemysław Uznański, and Daniel Wolleb-Graf. A framework for searching in graphs in the presence of errors. In Proc. 2nd Symposium on Simplicity in Algorithms, volume 69, pages 4:1-4:17, 2019. Google Scholar
  13. Aditi Dhagat, Peter Gács, and Peter Winkler. On playing "twenty questions" with a liar. In Proc. 3rd ACM/SIAM Symposium on Discrete Algorithms, volume 92, pages 16-22, 1992. Google Scholar
  14. Ehsan Emamjomeh-Zadeh and David Kempe. A general framework for robust interactive learning. In Proc. 31st International Conference on Neural Information Processing Systems, pages 7082-7091, 2017. Google Scholar
  15. Ehsan Emamjomeh-Zadeh, David Kempe, and Vikrant Singhal. Deterministic and probabilistic binary search in graphs. In Proc. 48th ACM Symposium on Theory of Computing, pages 519-532, 2016. Google Scholar
  16. Narthana S Epa, Junhao Gan, and Anthony Wirth. Result-sensitive binary search with noisy information. In Proc. 30th International Symposium on Algorithms and Computation, pages 60:1-60:15, 2019. Google Scholar
  17. Uriel Feige, Prabhakar Raghavan, David Peleg, and Eli Upfal. Computing with noisy information. SIAM Journal on Computing, 23(5):1001-1018, 1994. Google Scholar
  18. Daniel Golovin, Andreas Krause, and Debajyoti Ray. Near-optimal bayesian active learning with noisy observations. In Advances in Neural Information Processing Systems, volume 23, pages 766-774, 2010. Google Scholar
  19. Richard M Karp and Robert Kleinberg. Noisy binary search and its applications. In Proc. 18th ACM-SIAM Symposium on Discrete Algorithms, pages 881-890, 2007. Google Scholar
  20. Yong Sin Kim and Roland Winston. Unbounded binary search for a fast and accurate maximum power point tracking. In AIP Conference Proceedings, volume 1407, pages 289-292, 2011. Google Scholar
  21. Amos Korman, Jean-Sébastien Sereni, and Laurent Viennot. Toward more localized local algorithms: removing assumptions concerning global knowledge. Distributed Computing, 26(5-6):289-308, 2013. Google Scholar
  22. Robert Nowak. Generalized binary search. In Proc. 46th IEEE Allerton Conference on Communication, Control, and Computing, pages 568-574, 2008. Google Scholar
  23. Mihai Pătraşcu and Mikkel Thorup. Time-space trade-offs for predecessor search. In Proc. 38th ACM Symposium on Theory of Computing, pages 232-240, 2006. Google Scholar
  24. Andrzej Pelc. Searching with known error probability. Theoretical Computer Science, 63(2):185-202, 1989. Google Scholar
  25. Andrzej Pelc. Searching games with errors - fifty years of coping with liars. Theoretical Computer Science, 270(1-2):71-109, 2002. Google Scholar
  26. Alfréd Rényi. On a problem of information theory. MTA Mat. Kut. Int. Kozl. B, 6:505-516, 1961. Google Scholar
  27. Alfréd Rényi. On the foundations of information theory. Revue de l'Institut International de Statistique, pages 1-14, 1965. Google Scholar
  28. Alfréd Rényi and Zsuzsanna Makkai-Bencsáth. A Diary on Information Theory. Akadémiai Kiadó Budapest, 1984. Google Scholar
  29. Ronald L. Rivest, Albert R. Meyer, Daniel J. Kleitman, Karl Winklmann, and Joel Spencer. Coping with errors in binary search procedures. Journal of Computer and System Sciences, 20(3):396-404, 1980. Google Scholar
  30. Michael Saks and Avi Wigderson. Probabilistic Boolean decision trees and the complexity of evaluating game trees. In Proc. 27th IEEE Symposium on Foundations of Computer Science, pages 29-38, 1986. Google Scholar
  31. Pranab Sen. Lower bounds for predecessor searching in the cell probe model. In Proc. 18th IEEE Conference on Computational Complexity, pages 73-83, 2003. Google Scholar
  32. Sanjay Kumar Singh and Amarjeet Singh. Software testing. Vandana Publications, 2012. Google Scholar
  33. John A Stankovic. Research directions for the internet of things. IEEE Internet of Things Journal, 1(1):3-9, 2014. Google Scholar
  34. Stanislaw M Ulam. Adventures of a Mathematician. University of California Press, 1991. Google Scholar