10 Search Results for "Bitansky, Nir"


Document
Key-Agreement with Perfect Completeness from Random Oracles

Authors: Noam Mazor

Published in: LIPIcs, Volume 343, 6th Conference on Information-Theoretic Cryptography (ITC 2025)


Abstract
In the Random Oracle Model (ROM) all parties have oracle access to a common random function, and the parties are limited in the number of queries they can make to the oracle. Merkle’s Puzzles, introduced by Merkle [CACM '78], is a key-agreement protocol in the ROM with a quadratic gap between the query complexity of the honest parties and that of the eavesdropper. This quadratic gap is known to be optimal, by the works of Impagliazzo and Rudich [STOC ’89] and Barak and Mahmoody [Crypto ’09]. When the oracle function is injective or a permutation, Merkle’s Puzzles has perfect completeness: the protocol is certain to result in agreement between the parties. However, without such an assumption on the random function, there is a small error probability, and the parties may end up holding different keys. This raises the question: is there a key-agreement protocol with perfect completeness and super-linear security in the ROM? In this paper we give a positive answer, showing that changes to the query distribution of the parties in Merkle’s Puzzles yield a protocol with perfect completeness and roughly the same security.
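
As a rough illustration of the protocol the paper modifies, the following minimal Python sketch runs plain Merkle’s Puzzles over a simulated random oracle (not the paper’s perfectly complete variant). The class, function, and parameter names (RandomOracle, merkle_puzzles, domain_size, num_queries) are illustrative assumptions; the final matching step is exactly where the completeness error can occur when the oracle is not injective.

import secrets


class RandomOracle:
    """Lazily sampled random function f: [N] -> [N] (not injective in general)."""

    def __init__(self, size):
        self.size = size
        self.table = {}
        self.queries = 0

    def query(self, x):
        self.queries += 1
        if x not in self.table:
            self.table[x] = secrets.randbelow(self.size)
        return self.table[x]


def merkle_puzzles(domain_size, num_queries):
    """One run of plain Merkle's Puzzles; returns (alice_key, bob_key)."""
    oracle = RandomOracle(domain_size)
    rng = secrets.SystemRandom()

    # Alice queries the oracle on random points and publishes the outputs.
    alice_points = [rng.randrange(domain_size) for _ in range(num_queries)]
    published = {oracle.query(x): x for x in alice_points}

    # Bob queries random points until one of his outputs matches a published one.
    for _ in range(num_queries):
        y = rng.randrange(domain_size)
        fy = oracle.query(y)
        if fy in published:
            # Both parties take the preimage they know of the matching output.
            # If f is not injective these preimages may differ -- exactly the
            # completeness error that the modified protocol is meant to remove.
            return published[fy], y
    return None, None  # no match found in this run


if __name__ == "__main__":
    a, b = merkle_puzzles(domain_size=10_000, num_queries=150)
    print("agree" if a is not None and a == b else "disagree/fail", a, b)

With domain size N and roughly √N queries per party, the birthday bound makes a match likely, while an eavesdropper who sees only the published outputs must spend on the order of N queries to recover the shared input — the quadratic gap discussed in the abstract.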

Cite as

Noam Mazor. Key-Agreement with Perfect Completeness from Random Oracles. In 6th Conference on Information-Theoretic Cryptography (ITC 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 343, pp. 12:1-12:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{mazor:LIPIcs.ITC.2025.12,
  author =	{Mazor, Noam},
  title =	{{Key-Agreement with Perfect Completeness from Random Oracles}},
  booktitle =	{6th Conference on Information-Theoretic Cryptography (ITC 2025)},
  pages =	{12:1--12:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-385-0},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{343},
  editor =	{Gilboa, Niv},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITC.2025.12},
  URN =		{urn:nbn:de:0030-drops-243628},
  doi =		{10.4230/LIPIcs.ITC.2025.12},
  annote =	{Keywords: Key-Agreement, Random Oracle, Merkle’s Puzzles, Perfect Completeness}
}
Document
Powerful Primitives in the Bounded Quantum Storage Model

Authors: Mohammed Barhoush and Louis Salvail

Published in: LIPIcs, Volume 343, 6th Conference on Information-Theoretic Cryptography (ITC 2025)


Abstract
The bounded quantum storage model aims to achieve security against computationally unbounded adversaries that are restricted only with respect to their quantum memories. In this work, we provide the following contributions in this model: 1) We build one-time programs and utilize them to construct CCA1-secure symmetric key encryption and message authentication codes. These schemes require no quantum memory from honest users, yet they provide information-theoretic security against adversaries with arbitrarily large quantum memories, as long as the transmission length is suitably large. 2) We introduce the notion of k-time program broadcast which is a form of program encryption that allows multiple users to each learn a single evaluation of the encrypted program, while preventing any one user from learning more than k evaluations of the program. We build this primitive unconditionally and employ it to construct CCA1-secure asymmetric key encryption, encryption tokens, signatures, and signature tokens. All these schemes are information-theoretically secure against adversaries with roughly e^√m quantum memory where m is the quantum memory required for the honest user. All of the constructions additionally satisfy disappearing security, essentially preventing an adversary from storing and using a transmission later on.

Cite as

Mohammed Barhoush and Louis Salvail. Powerful Primitives in the Bounded Quantum Storage Model. In 6th Conference on Information-Theoretic Cryptography (ITC 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 343, pp. 2:1-2:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{barhoush_et_al:LIPIcs.ITC.2025.2,
  author =	{Barhoush, Mohammed and Salvail, Louis},
  title =	{{Powerful Primitives in the Bounded Quantum Storage Model}},
  booktitle =	{6th Conference on Information-Theoretic Cryptography (ITC 2025)},
  pages =	{2:1--2:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-385-0},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{343},
  editor =	{Gilboa, Niv},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITC.2025.2},
  URN =		{urn:nbn:de:0030-drops-243523},
  doi =		{10.4230/LIPIcs.ITC.2025.2},
  annote =	{Keywords: Quantum Cryptography, Bounded Quantum Storage Model, Information-Theoretic Security}
}
Document
Track A: Algorithms, Complexity and Games
Boosting SNARKs and Rate-1 Barrier in Arguments of Knowledge

Authors: Jiaqi Cheng and Rishab Goyal

Published in: LIPIcs, Volume 334, 52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025)


Abstract
We design a generic compiler to boost any non-trivial succinct non-interactive argument of knowledge (SNARK) to full succinctness. Our results come in two flavors: 1) For any constant ε > 0, any SNARK with proof size |π| < |ω|/(λ^ε) + poly(λ, |x|) can be upgraded to a fully succinct SNARK, where all system parameters (such as proof/CRS sizes and setup/verifier run-times) grow as fixed polynomials in λ, independent of witness size. 2) Under the additional assumption that the underlying SNARK has an efficient knowledge extractor, we further improve our result to upgrade any non-trivial SNARK. For example, we show how to design fully succinct SNARKs from SNARKs with proofs of length |ω| - Ω(λ), or |ω|/(1+ε) + poly(λ, |x|) for any constant ε > 0. Our result reduces the long-standing challenge of designing fully succinct SNARKs to designing arguments of knowledge that beat the trivial construction. It also establishes the optimality of rate-1 arguments of knowledge (such as NIZKs [Gentry-Groth-Ishai-Peikert-Sahai-Smith; JoC'15] and BARGs [Devadas-Goyal-Kalai-Vaikuntanathan, Paneth-Pass; FOCS'22]), and suggests that any further improvement is tantamount to designing fully succinct SNARKs, and thus requires bypassing established black-box barriers [Gentry-Wichs; STOC'11].
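
For readability, the two proof-size regimes stated in the abstract can be written out as follows (a plain restatement, with |π| the proof length, |ω| the witness length, and x the statement):

\[
\text{(1)}\quad |\pi| \;<\; \frac{|\omega|}{\lambda^{\varepsilon}} + \mathrm{poly}(\lambda, |x|) \quad\text{for some constant } \varepsilon > 0,
\]
\[
\text{(2)}\quad |\pi| \;\le\; |\omega| - \Omega(\lambda) \quad\text{or}\quad |\pi| \;\le\; \frac{|\omega|}{1+\varepsilon} + \mathrm{poly}(\lambda, |x|) \quad\text{for any constant } \varepsilon > 0,
\]

where the second, weaker regime additionally assumes the underlying SNARK has an efficient knowledge extractor.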

Cite as

Jiaqi Cheng and Rishab Goyal. Boosting SNARKs and Rate-1 Barrier in Arguments of Knowledge. In 52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 334, pp. 56:1-56:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{cheng_et_al:LIPIcs.ICALP.2025.56,
  author =	{Cheng, Jiaqi and Goyal, Rishab},
  title =	{{Boosting SNARKs and Rate-1 Barrier in Arguments of Knowledge}},
  booktitle =	{52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025)},
  pages =	{56:1--56:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-372-0},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{334},
  editor =	{Censor-Hillel, Keren and Grandoni, Fabrizio and Ouaknine, Jo\"{e}l and Puppis, Gabriele},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2025.56},
  URN =		{urn:nbn:de:0030-drops-234339},
  doi =		{10.4230/LIPIcs.ICALP.2025.56},
  annote =	{Keywords: SNARGs, RAM Delegation}
}
Document
Accumulation Without Homomorphism

Authors: Benedikt Bünz, Pratyush Mishra, Wilson Nguyen, and William Wang

Published in: LIPIcs, Volume 325, 16th Innovations in Theoretical Computer Science Conference (ITCS 2025)


Abstract
Accumulation schemes are a simple yet powerful primitive that enables highly efficient constructions of incrementally verifiable computation (IVC). Unfortunately, all prior accumulation schemes rely on homomorphic vector commitments whose security is based on public-key assumptions. It is an interesting open question to construct efficient accumulation schemes that avoid the need for such assumptions. In this paper, we answer this question affirmatively by constructing an accumulation scheme from non-homomorphic vector commitments, which can be realized solely from symmetric-key assumptions (e.g., Merkle trees). We overcome the need for homomorphisms by instead performing spot-checks over error-correcting encodings of the committed vectors. Unlike prior accumulation schemes, our scheme only supports a bounded number of accumulation steps. We show that such bounded-depth accumulation still suffices to construct proof-carrying data (a generalization of IVC). We also demonstrate several optimizations to our PCD construction that greatly improve concrete efficiency.
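
The symmetric-key building block mentioned above, a Merkle-tree vector commitment with single-position openings (the kind of opening used for spot-checks), can be sketched in a few lines of Python. This is only the standard non-homomorphic commitment, not the paper’s accumulation or PCD construction, and the helper names (commit, open_position, verify) are illustrative.

import hashlib


def _h(*parts: bytes) -> bytes:
    """SHA-256 as the (non-homomorphic) compression function."""
    return hashlib.sha256(b"".join(parts)).digest()


def commit(leaves):
    """Merkle-tree vector commitment: returns (root, all tree levels)."""
    level = [_h(leaf) for leaf in leaves]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:                              # duplicate last node on odd levels
            level = level + [level[-1]]
        level = [_h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return level[0], levels


def open_position(levels, index):
    """Authentication path for one committed position."""
    path = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        path.append(level[index ^ 1])                   # sibling at this depth
        index //= 2
    return path


def verify(root, index, leaf, path):
    """Recompute the root from the leaf and its authentication path."""
    node = _h(leaf)
    for sibling in path:
        node = _h(node, sibling) if index % 2 == 0 else _h(sibling, node)
        index //= 2
    return node == root


if __name__ == "__main__":
    data = [bytes([i]) * 4 for i in range(8)]
    root, levels = commit(data)
    proof = open_position(levels, 5)
    print(verify(root, 5, data[5], proof))              # True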

Cite as

Benedikt Bünz, Pratyush Mishra, Wilson Nguyen, and William Wang. Accumulation Without Homomorphism. In 16th Innovations in Theoretical Computer Science Conference (ITCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 325, pp. 23:1-23:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{bunz_et_al:LIPIcs.ITCS.2025.23,
  author =	{B\"{u}nz, Benedikt and Mishra, Pratyush and Nguyen, Wilson and Wang, William},
  title =	{{Accumulation Without Homomorphism}},
  booktitle =	{16th Innovations in Theoretical Computer Science Conference (ITCS 2025)},
  pages =	{23:1--23:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-361-4},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{325},
  editor =	{Meka, Raghu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2025.23},
  URN =		{urn:nbn:de:0030-drops-226510},
  doi =		{10.4230/LIPIcs.ITCS.2025.23},
  annote =	{Keywords: Proof-carrying data, incrementally verifiable computation, accumulation schemes}
}
Document
Incompressible Functional Encryption

Authors: Rishab Goyal, Venkata Koppula, Mahesh Sreekumar Rajasree, and Aman Verma

Published in: LIPIcs, Volume 325, 16th Innovations in Theoretical Computer Science Conference (ITCS 2025)


Abstract
Incompressible encryption (Dziembowski, Crypto'06; Guan, Wichs, Zhandry, Eurocrypt'22) protects against attackers that learn the entire decryption key but cannot store the full ciphertext. In incompressible encryption, the attacker must try to compress a ciphertext to within a pre-specified memory bound S before receiving the secret key. In this work, we generalize the notion of incompressibility to functional encryption. In incompressible functional encryption, the adversary can corrupt non-distinguishing keys at any point, but receives the distinguishing keys only after compressing the ciphertext to within S bits. An important efficiency measure for incompressible encryption is the ciphertext rate (i.e., rate = |m|/|ct|). We give many new results for incompressible functional encryption for circuits, from the minimal assumption of (non-incompressible) functional encryption, with 1) ciphertext rate 1/2 and short secret keys, and 2) ciphertext rate 1 and large secret keys. Along the way, we also give a new incompressible attribute-based encryption scheme for circuits from standard assumptions, with ciphertext rate 1/2 and short secret keys. Our results achieve optimal efficiency, as incompressible attribute-based/functional encryption with ciphertext rate 1 as well as short secret keys faces strong barriers to provable security from standard assumptions. Moreover, our assumptions are minimal, as incompressible attribute-based/functional encryption is strictly stronger than its non-incompressible counterpart.

Cite as

Rishab Goyal, Venkata Koppula, Mahesh Sreekumar Rajasree, and Aman Verma. Incompressible Functional Encryption. In 16th Innovations in Theoretical Computer Science Conference (ITCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 325, pp. 56:1-56:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{goyal_et_al:LIPIcs.ITCS.2025.56,
  author =	{Goyal, Rishab and Koppula, Venkata and Rajasree, Mahesh Sreekumar and Verma, Aman},
  title =	{{Incompressible Functional Encryption}},
  booktitle =	{16th Innovations in Theoretical Computer Science Conference (ITCS 2025)},
  pages =	{56:1--56:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-361-4},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{325},
  editor =	{Meka, Raghu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2025.56},
  URN =		{urn:nbn:de:0030-drops-226849},
  doi =		{10.4230/LIPIcs.ITCS.2025.56},
  annote =	{Keywords: functional encryption, attribute-based encryption, incompressible encryption}
}
Document
Single-Round Proofs of Quantumness from Knowledge Assumptions

Authors: Petia Arabadjieva, Alexandru Gheorghiu, Victor Gitton, and Tony Metger

Published in: LIPIcs, Volume 325, 16th Innovations in Theoretical Computer Science Conference (ITCS 2025)


Abstract
A proof of quantumness is an efficiently verifiable interactive test that an efficient quantum computer can pass, but all efficient classical computers cannot (under some cryptographic assumption). Such protocols play a crucial role in the certification of quantum devices. Existing single-round protocols based solely on a cryptographic hardness assumption (like asking the quantum computer to factor a large number) require large quantum circuits, whereas multi-round ones use smaller circuits but require experimentally challenging mid-circuit measurements. In this work, we construct efficient single-round proofs of quantumness based on existing knowledge assumptions. While knowledge assumptions have not been previously considered in this context, we show that they provide a natural basis for separating classical and quantum computation. Our work also helps in understanding the interplay between black-box/white-box reductions and cryptographic assumptions in the design of proofs of quantumness. Specifically, we show that multi-round protocols based on Decisional Diffie-Hellman (DDH) or Learning With Errors (LWE) can be "compiled" into single-round protocols using a knowledge-of-exponent assumption [Bitansky et al., 2012] or knowledge-of-lattice-point assumption [Loftus et al., 2012], respectively. We also prove an adaptive hardcore-bit statement for a family of claw-free functions based on DDH, which might be of independent interest.

Cite as

Petia Arabadjieva, Alexandru Gheorghiu, Victor Gitton, and Tony Metger. Single-Round Proofs of Quantumness from Knowledge Assumptions. In 16th Innovations in Theoretical Computer Science Conference (ITCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 325, pp. 8:1-8:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{arabadjieva_et_al:LIPIcs.ITCS.2025.8,
  author =	{Arabadjieva, Petia and Gheorghiu, Alexandru and Gitton, Victor and Metger, Tony},
  title =	{{Single-Round Proofs of Quantumness from Knowledge Assumptions}},
  booktitle =	{16th Innovations in Theoretical Computer Science Conference (ITCS 2025)},
  pages =	{8:1--8:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-361-4},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{325},
  editor =	{Meka, Raghu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2025.8},
  URN =		{urn:nbn:de:0030-drops-226364},
  doi =		{10.4230/LIPIcs.ITCS.2025.8},
  annote =	{Keywords: Proofs of quantumness, Knowledge assumptions, Learning with errors, Decisional Diffie-Hellman}
}
Document
The More the Merrier! On Total Coding and Lattice Problems and the Complexity of Finding Multicollisions

Authors: Huck Bennett, Surendra Ghentiyala, and Noah Stephens-Davidowitz

Published in: LIPIcs, Volume 325, 16th Innovations in Theoretical Computer Science Conference (ITCS 2025)


Abstract
We show a number of connections between two types of search problems: (1) the problem of finding an L-wise multicollision in the output of a function; and (2) the problem of finding two codewords in a code (or two vectors in a lattice) that are within distance d of each other. Specifically, we study these problems in the total regime, in which L and d are chosen so that such a solution is guaranteed to exist, though it might be hard to find. In more detail, we study the total search problem in which the input is a function 𝒞 : [A] → [B] (represented as a circuit) and the goal is to find L ≤ ⌈A/B⌉ distinct elements x_1, …, x_L ∈ [A] such that 𝒞(x_1) = ⋯ = 𝒞(x_L). The associated complexity classes Polynomial Multi-Pigeonhole Principle ((A,B)-PMPP^L) consist of all problems that reduce to this problem. We show close connections between (A,B)-PMPP^L and many celebrated upper bounds on the minimum distance of a code or lattice (and on the list-decoding radius). In particular, we show that the associated computational problems (i.e., the problem of finding two distinct codewords or lattice points that are close to each other) are in (A,B)-PMPP^L, with a more-or-less smooth tradeoff between the distance d and the parameters A, B, and L. These connections are particularly rich in the case of codes, in which case we show that multiple incomparable bounds on the minimum distance lie in seemingly incomparable complexity classes. Surprisingly, we also show that the computational problems associated with some bounds on the minimum distance of codes are actually hard for these classes (for codes represented by arbitrary circuits). In fact, we show that finding two vectors within a certain distance d is actually hard for the important (and well-studied) class PWPP = (B²,B)-PMPP² in essentially all parameter regimes for which an efficient algorithm is not known, so that our hardness results are essentially tight. In fact, for some d (depending on the block length, message length, and alphabet size), we obtain both hardness and containment. We therefore completely settle the complexity of this problem for such parameters and add coding problems to the short list of problems known to be complete for PWPP. We also study (A,B)-PMPP^L as an interesting family of complexity classes in its own right, and we uncover a rich structure. Specifically, we use recent techniques from the cryptographic literature on multicollision-resistant hash functions to (1) show inclusions of the form (A,B)-PMPP^L ⊆ (A',B')-PMPP^L' for certain non-trivial parameters; (2) show black-box separations between such classes in different parameter regimes; and (3) give a non-black-box proof that (A,B)-PMPP^L ∈ FP if (A',B')-PMPP^L' ∈ FP for yet another parameter regime. We also show that (A,B)-PMPP^L lies in the recently introduced complexity class Polynomial Long Choice for some parameters.
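
To make the totality of the multicollision problem concrete, here is a brute-force Python sketch: whenever L ≤ ⌈A/B⌉, the pigeonhole principle guarantees a solution, and the search below finds one in time linear in A. The paper's interest is in the complexity of this task when 𝒞 is given only as a circuit, where such exhaustive search is infeasible; the function and variable names here are illustrative.

from collections import defaultdict
from math import ceil


def find_multicollision(f, A, B, L):
    """Find L distinct inputs in [A] on which f collides.

    By the pigeonhole principle a solution exists whenever L <= ceil(A / B),
    which is the totality condition for (A,B)-PMPP^L; this exhaustive search
    just makes that explicit.
    """
    assert L <= ceil(A / B), "outside the total regime"
    buckets = defaultdict(list)
    for x in range(A):
        y = f(x)
        buckets[y].append(x)
        if len(buckets[y]) == L:
            return buckets[y]
    raise AssertionError("unreachable when L <= ceil(A / B)")


if __name__ == "__main__":
    A, B, L = 20, 6, 3                       # L = 3 <= ceil(20 / 6) = 4
    print(find_multicollision(lambda x: (7 * x + 3) % B, A, B, L))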

Cite as

Huck Bennett, Surendra Ghentiyala, and Noah Stephens-Davidowitz. The More the Merrier! On Total Coding and Lattice Problems and the Complexity of Finding Multicollisions. In 16th Innovations in Theoretical Computer Science Conference (ITCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 325, pp. 14:1-14:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{bennett_et_al:LIPIcs.ITCS.2025.14,
  author =	{Bennett, Huck and Ghentiyala, Surendra and Stephens-Davidowitz, Noah},
  title =	{{The More the Merrier! On Total Coding and Lattice Problems and the Complexity of Finding Multicollisions}},
  booktitle =	{16th Innovations in Theoretical Computer Science Conference (ITCS 2025)},
  pages =	{14:1--14:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-361-4},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{325},
  editor =	{Meka, Raghu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2025.14},
  URN =		{urn:nbn:de:0030-drops-226424},
  doi =		{10.4230/LIPIcs.ITCS.2025.14},
  annote =	{Keywords: Multicollisions, Error-correcting codes, Lattices}
}
Document
Bootstrapping Homomorphic Encryption via Functional Encryption

Authors: Nir Bitansky and Tomer Solomon

Published in: LIPIcs, Volume 251, 14th Innovations in Theoretical Computer Science Conference (ITCS 2023)


Abstract
Homomorphic encryption is a central object in modern cryptography, with far-reaching applications. Constructions supporting homomorphic evaluation of arbitrary Boolean circuits have been known for over a decade, based on standard lattice assumptions. However, these constructions are leveled, meaning that they only support circuits up to some a-priori bounded depth. These leveled constructions can be bootstrapped into fully homomorphic ones, but this requires additional circular security assumptions, which are construction-dependent, and where reductions to standard lattice assumptions are no longer known. Alternative constructions are known based on indistinguishability obfuscation, which has been recently constructed under standard assumptions. However, this alternative requires subexponential hardness of the underlying primitives. We prove a new bootstrapping theorem based on functional encryption, which is known based on standard polynomial hardness assumptions. As a result we obtain the first fully homomorphic encryption scheme that avoids both circular security assumptions and super-polynomial hardness assumptions. The construction is secure against uniform adversaries, and can be made non-uniformly secure assuming a generalization of the time-hierarchy theorem, which follows for example from non-uniform ETH. At the heart of the construction is a new proof technique based on cryptographic puzzles and decomposable obfuscation. Unlike most cryptographic reductions, our security reduction does not fully treat the adversary as a black box, but rather makes explicit use of its running time (or circuit size).

Cite as

Nir Bitansky and Tomer Solomon. Bootstrapping Homomorphic Encryption via Functional Encryption. In 14th Innovations in Theoretical Computer Science Conference (ITCS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 251, pp. 17:1-17:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bitansky_et_al:LIPIcs.ITCS.2023.17,
  author =	{Bitansky, Nir and Solomon, Tomer},
  title =	{{Bootstrapping Homomorphic Encryption via Functional Encryption}},
  booktitle =	{14th Innovations in Theoretical Computer Science Conference (ITCS 2023)},
  pages =	{17:1--17:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-263-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{251},
  editor =	{Tauman Kalai, Yael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2023.17},
  URN =		{urn:nbn:de:0030-drops-175200},
  doi =		{10.4230/LIPIcs.ITCS.2023.17},
  annote =	{Keywords: Fully Homomorphic Encryption, Polynomial Assumptions, Cryptographic Puzzles}
}
Document
On the Cryptographic Hardness of Local Search

Authors: Nir Bitansky and Idan Gerichter

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
We show new hardness results for the class of Polynomial Local Search problems (PLS): 1) hardness of PLS based on a falsifiable assumption on bilinear groups introduced by Kalai, Paneth, and Yang (STOC 2019) and the Exponential Time Hypothesis for randomized algorithms; previous standard-model constructions relied on non-falsifiable and non-standard assumptions. 2) Hardness of PLS relative to random oracles; the construction is essentially different from previous constructions, and in particular is unconditionally secure. The construction also demonstrates the hardness of parallelizing local search. The core observation behind the results is that the unique-proofs property of incrementally verifiable computations, previously used to demonstrate hardness in PLS, can be traded for a simple incremental completeness property.

Cite as

Nir Bitansky and Idan Gerichter. On the Cryptographic Hardness of Local Search. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 6:1-6:29, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{bitansky_et_al:LIPIcs.ITCS.2020.6,
  author =	{Bitansky, Nir and Gerichter, Idan},
  title =	{{On the Cryptographic Hardness of Local Search}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{6:1--6:29},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.6},
  URN =		{urn:nbn:de:0030-drops-116918},
  doi =		{10.4230/LIPIcs.ITCS.2020.6},
  annote =	{Keywords: Cryptography, PLS, Lower Bounds, Incremental Computation}
}
Document
On Oblivious Amplification of Coin-Tossing Protocols

Authors: Nir Bitansky and Nathan Geier

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
We consider the problem of amplifying two-party coin-tossing protocols: given a protocol where it is possible to bias the common output by at most ρ, we aim to obtain a new protocol where the output can be biased by at most ρ* < ρ. We rule out the existence of a natural type of amplifiers called oblivious amplifiers for every ρ* < ρ. Such amplifiers ignore the way that the underlying ρ-bias protocol works and can only invoke an oracle that provides ρ-bias bits. We provide two proofs of this impossibility. The first is by a reduction to the impossibility of deterministic randomness extraction from Santha-Vazirani sources. The second is a direct proof that is more general and also rules out certain types of asymmetric amplification. In addition, it gives yet another proof for the Santha-Vazirani impossibility.
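
The reduction in the first proof goes through the impossibility of deterministic extraction from Santha-Vazirani sources. The Python sketch below illustrates that impossibility for the simplest extractor (XOR), with an adversary that biases only the last bit; this is a toy instance of the general phenomenon, not the paper's argument, and the names (sv_adversary_vs_xor, delta, trials) are illustrative.

import random


def sv_adversary_vs_xor(n, delta, trials=200_000):
    """A delta-Santha-Vazirani source that biases XOR extraction by delta.

    The first n-1 bits are uniform; the last bit is chosen with probability
    1/2 + delta so that the parity equals 1. Each bit, conditioned on the
    prefix, stays within [1/2 - delta, 1/2 + delta], so this is a valid SV
    source, yet the extracted (XOR) bit ends up delta-biased.
    """
    ones = 0
    for _ in range(trials):
        prefix = [random.getrandbits(1) for _ in range(n - 1)]
        parity = sum(prefix) % 2
        want = 1 ^ parity                                # last-bit value forcing XOR = 1
        last = want if random.random() < 0.5 + delta else 1 - want
        ones += parity ^ last
    return ones / trials                                 # roughly 1/2 + delta


if __name__ == "__main__":
    print(sv_adversary_vs_xor(n=10, delta=0.1))          # about 0.6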

Cite as

Nir Bitansky and Nathan Geier. On Oblivious Amplification of Coin-Tossing Protocols. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 30:1-30:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{bitansky_et_al:LIPIcs.ITCS.2020.30,
  author =	{Bitansky, Nir and Geier, Nathan},
  title =	{{On Oblivious Amplification of Coin-Tossing Protocols}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{30:1--30:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.30},
  URN =		{urn:nbn:de:0030-drops-117150},
  doi =		{10.4230/LIPIcs.ITCS.2020.30},
  annote =	{Keywords: Coin Tossing, Amplification, Lower Bound, Santha Vazirani}
}
  • Refine by Type
  • 10 Document/PDF
  • 7 Document/HTML

  • Refine by Publication Year
  • 7 2025
  • 1 2023
  • 2 2020

  • Refine by Author
  • 3 Bitansky, Nir
  • 2 Goyal, Rishab
  • 1 Arabadjieva, Petia
  • 1 Barhoush, Mohammed
  • 1 Bennett, Huck
  • Show More...

  • Refine by Series/Journal
  • 10 LIPIcs

  • Refine by Classification
  • 4 Theory of computation → Cryptographic protocols
  • 3 Theory of computation → Computational complexity and cryptography
  • 2 Security and privacy → Information-theoretic techniques
  • 1 Security and privacy → Public key encryption
  • 1 Theory of computation → Oracles and decision trees

  • Refine by Keyword
  • 1 Amplification
  • 1 Bounded Quantum Storage Model
  • 1 Coin Tossing
  • 1 Cryptographic Puzzles
  • 1 Cryptography
  • Show More...
