We prove the first polynomial separation between randomized and deterministic time-space tradeoffs of multi-output functions. In particular, we present a total function that, on an input of n elements in [n], outputs O(n) elements, such that:
- There exists a randomized oblivious algorithm with space O(log n), time O(n log n), and one-way access to randomness, that computes the function with probability 1 - O(1/n);
- Any deterministic oblivious branching program with space S and time T that computes the function must satisfy T²S ≥ Ω(n^{2.5}/log n).

This implies that logspace randomized algorithms for multi-output functions cannot be black-box derandomized without an Ω̃(n^{1/4}) overhead in time. Since all previous polynomial time-space tradeoffs for multi-output functions were proved via the Borodin-Cook method, a probabilistic method that inherently gives the same lower bound for randomized and deterministic branching programs, our lower bound proof is intrinsically different from previous works. We also examine other natural candidates for proving such separations, and show that any polynomial separation for these problems would resolve the long-standing open problem of proving an n^{1+Ω(1)} time lower bound for decision problems with polylog(n) space.
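The Ω̃(n^{1/4}) overhead follows from simple arithmetic on the two stated bounds; the sketch below is not taken from the paper, and it assumes the derandomized simulation keeps space S = polylog(n):

T^2 S \ge \Omega\!\left(\frac{n^{2.5}}{\log n}\right),\quad S = \mathrm{polylog}(n)
\;\Longrightarrow\; T \ge \widetilde{\Omega}(n^{1.25})
\;\Longrightarrow\; \frac{T}{\widetilde{O}(n)} \ge \widetilde{\Omega}(n^{1/4}),

where Õ(n) is the running time of the randomized logspace algorithm, so any black-box derandomization must lose at least an Ω̃(n^{1/4}) factor in time.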