10 Search Results for "Mahmoody, Mohammad"


New Algorithmic Directions in Optimal Transport and Applications for Product Spaces

Authors: Salman Beigi, Omid Etesami, Mohammad Mahmoody, and Amir Najafi

Published in: LIPIcs, Volume 359, 36th International Symposium on Algorithms and Computation (ISAAC 2025)


Abstract
We consider the problem of optimal transport between two high-dimensional distributions μ, ν in ℝⁿ from a new algorithmic perspective, in which we are given a sample x ∼ μ and must find a close y ∼ ν while running in poly(n) time, where n is the size/dimension of x, y. In other words, we are interested in making the running time bounded in the dimension of the spaces rather than in the total size of the representations of the two distributions. Our main result is a general algorithmic transport result between any product distribution μ and an arbitrary distribution ν of total cost Δ + δ under the 𝓁_p^p cost; here Δ is the cost of the so-called Knothe–Rosenblatt transport from μ to ν, while δ is a computational error that goes to zero as the transport algorithm is given more running time. For this result, we need ν to be "sequentially samplable" with a "bounded average sampling cost", a novel but natural notion of independent interest. In addition, we prove the following.
- We prove an algorithmic version of Talagrand’s celebrated inequality for transporting the standard Gaussian distribution Φⁿ to an arbitrary ν under the Euclidean-squared cost. When ν is Φⁿ conditioned on a set S of measure ε, we show how to implement the needed sequential sampler for ν in expected time poly(n/ε), using membership oracle access to S. Hence, we obtain an algorithmic transport that maps Φⁿ to Φⁿ|S in time poly(n/ε) and expected Euclidean-squared distance O(log 1/ε), which is optimal for a general set S of measure ε.
- As a corollary, we obtain the first computational concentration result (Etesami et al., SODA 2020) for the Gaussian measure under the Euclidean distance with a dimension-independent transportation cost, resolving a question of Etesami et al. More precisely, for any set S of Gaussian measure ε, we map most samples of Φⁿ to S within Euclidean distance O(√{log 1/ε}) in time poly(n/ε).
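The Knothe–Rosenblatt map mentioned above handles coordinates sequentially; when both distributions are products, it reduces to an independent one-dimensional quantile coupling per coordinate. A minimal sketch of that special case, with a Gaussian-to-exponential pairing chosen by us purely for concreteness (this is not the paper's algorithm, which handles a general sequentially samplable ν):

```python
import math
import random

# Sketch of the Knothe-Rosenblatt transport in the product-to-product
# special case: each coordinate is mapped independently via
#     y_i = F_nu_i^{-1}( F_mu_i(x_i) ).
# Here mu = Phi^n (standard Gaussian product) and nu = Exp(1)^n,
# an illustrative pairing only.

def gaussian_cdf(x: float) -> float:
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def exp_inverse_cdf(u: float) -> float:
    return -math.log(1.0 - u)  # quantile function of Exp(1)

def kr_transport(x: list[float]) -> list[float]:
    # Map a sample x ~ Phi^n to y ~ Exp(1)^n coordinate by coordinate.
    return [exp_inverse_cdf(gaussian_cdf(xi)) for xi in x]

x = [random.gauss(0.0, 1.0) for _ in range(5)]
y = kr_transport(x)
assert all(yi >= 0.0 for yi in y)  # Exp(1) samples are nonnegative
```

The monotone coordinate-wise map is exactly what makes the transport computable from a single sample x in time polynomial in the dimension.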

Cite as

Salman Beigi, Omid Etesami, Mohammad Mahmoody, and Amir Najafi. New Algorithmic Directions in Optimal Transport and Applications for Product Spaces. In 36th International Symposium on Algorithms and Computation (ISAAC 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 359, pp. 10:1-10:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{beigi_et_al:LIPIcs.ISAAC.2025.10,
  author =	{Beigi, Salman and Etesami, Omid and Mahmoody, Mohammad and Najafi, Amir},
  title =	{{New Algorithmic Directions in Optimal Transport and Applications for Product Spaces}},
  booktitle =	{36th International Symposium on Algorithms and Computation (ISAAC 2025)},
  pages =	{10:1--10:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-408-6},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{359},
  editor =	{Chen, Ho-Lin and Hon, Wing-Kai and Tsai, Meng-Tsung},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2025.10},
  URN =		{urn:nbn:de:0030-drops-249187},
  doi =		{10.4230/LIPIcs.ISAAC.2025.10},
  annote =	{Keywords: Optimal transport, Randomized algorithms, Concentration bounds}
}
APPROX
A Randomized Rounding Approach for DAG Edge Deletion

Authors: Sina Kalantarzadeh, Nathan Klein, and Victor Reis

Published in: LIPIcs, Volume 353, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025)


Abstract
In the DAG Edge Deletion problem, we are given an edge-weighted directed acyclic graph and a parameter k, and the goal is to delete a minimum-weight set of edges so that the resulting graph has no path of length k. This problem, which has applications to scheduling, was introduced in 2015 by Kenkre, Pandit, Purohit, and Saket. They gave a k-approximation and showed that, for any constant k ≥ 4, it is UGC-hard to approximate the problem better than ⌊0.5k⌋, via a result of Svensson from 2012. The approximation ratio was improved to (2/3)(k+1) by Klein and Wexler in 2016. In this work, we introduce a randomized rounding framework based on distributions over vertex labels in [0,1]. The most natural choice is to sample labels independently from the uniform distribution over [0,1]. We show this leads to a (2-√2)(k+1) ≈ 0.585(k+1)-approximation. Using a modified (but still independent) label distribution, we obtain a 0.549(k+1)-approximation for the problem, and we also show that no independent distribution over labels can improve our analysis to below 0.542(k+1). Finally, we give a 0.5(k+1)-approximation for bipartite graphs and for instances with structured LP solutions. Whether this ratio can be obtained in general remains open.
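As a toy illustration of label-based rounding (a sketch under the simplest uniform-label distribution; the deletion rule and the instance below are ours for illustration, not the paper's tuned distribution or analysis): bucket each vertex's label into k classes and delete every edge that fails to strictly increase the bucket, so that every surviving path has at most k-1 edges.

```python
import random

# Illustrative label rounding for DAG Edge Deletion: independent uniform
# labels in [0,1) are bucketed into k classes; deleting every edge whose
# head's bucket is not strictly larger than its tail's guarantees the
# surviving graph has no path with k edges.

def label_rounding(edges, vertices, k, rng):
    bucket = {v: int(k * rng.random()) for v in vertices}  # class in {0..k-1}
    deleted = [(u, v) for (u, v) in edges if bucket[u] >= bucket[v]]
    return deleted, bucket

# Toy instance: the directed path 0 -> 1 -> ... -> 6, forbidding paths of length k = 3.
vertices = range(7)
edges = [(i, i + 1) for i in range(6)]
deleted, bucket = label_rounding(edges, vertices, k=3, rng=random.Random(0))
kept = [e for e in edges if e not in deleted]

# Every kept edge strictly increases its bucket, so kept paths have < 3 edges.
assert all(bucket[u] < bucket[v] for (u, v) in kept)
```

The paper's contribution is the analysis of the expected weight deleted under such label distributions; this sketch only shows why the rounded solution is feasible.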

Cite as

Sina Kalantarzadeh, Nathan Klein, and Victor Reis. A Randomized Rounding Approach for DAG Edge Deletion. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 353, pp. 18:1-18:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{kalantarzadeh_et_al:LIPIcs.APPROX/RANDOM.2025.18,
  author =	{Kalantarzadeh, Sina and Klein, Nathan and Reis, Victor},
  title =	{{A Randomized Rounding Approach for DAG Edge Deletion}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2025)},
  pages =	{18:1--18:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-397-3},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{353},
  editor =	{Ene, Alina and Chattopadhyay, Eshan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2025.18},
  URN =		{urn:nbn:de:0030-drops-243840},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2025.18},
  annote =	{Keywords: Approximation Algorithms, Randomized Algorithms, Linear Programming, Graph Algorithms, Scheduling}
}
Key-Agreement with Perfect Completeness from Random Oracles

Authors: Noam Mazor

Published in: LIPIcs, Volume 343, 6th Conference on Information-Theoretic Cryptography (ITC 2025)


Abstract
In the Random Oracle Model (ROM) all parties have oracle access to a common random function, and the parties are limited in the number of queries they can make to the oracle. The Merkle’s Puzzles protocol, introduced by Merkle [CACM '78], is a key-agreement protocol in the ROM with a quadratic gap between the query complexity of the honest parties and the eavesdropper. This quadratic gap is known to be optimal, by the works of Impagliazzo and Rudich [STOC ’89] and Barak and Mahmoody [Crypto ’09]. When the oracle function is injective or a permutation, Merkle’s Puzzles has perfect completeness. That is, it is certain that the protocol results in agreement between the parties. However, without such an assumption on the random function, there is a small error probability, and the parties may end up holding different keys. This fact raises the question: Is there a key-agreement protocol with perfect completeness and super-linear security in the ROM? In this paper we give a positive answer to the above question, showing that changes to the query distribution of the parties in Merkle’s Puzzles yield a protocol with perfect completeness and roughly the same security.
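The birthday-bound core of Merkle's Puzzles can be sketched as follows (SHA-256 stands in for the random oracle, the protocol flow is collapsed to its collision-finding essence, and the parameters are hypothetical; the paper's modified protocol is not shown):

```python
import hashlib
import random

# Toy Merkle's Puzzles in the ROM: over a domain of size N, each party
# queries the "oracle" (SHA-256 here) on about 2*sqrt(N) random points;
# by the birthday bound they share a point with constant probability,
# and the shared point's hash serves as the key. When the sets do not
# intersect the run fails -- the completeness gap discussed above.

def oracle(x: int) -> str:
    return hashlib.sha256(str(x).encode()).hexdigest()

def merkle_puzzles(N: int, rng: random.Random):
    q = int(2 * N ** 0.5)  # ~2*sqrt(N) queries per party
    alice = {rng.randrange(N) for _ in range(q)}
    bob = {rng.randrange(N) for _ in range(q)}
    common = alice & bob
    if not common:
        return None  # agreement failed: imperfect completeness
    return oracle(min(common))  # both parties derive the same key

rng = random.Random(7)
key = None
while key is None:  # retry on the (rare) completeness failure
    key = merkle_puzzles(N=10_000, rng=rng)
```

An eavesdropper seeing only the transcript must search a quadratically larger portion of the domain, which is the gap shown optimal by Impagliazzo-Rudich and Barak-Mahmoody.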

Cite as

Noam Mazor. Key-Agreement with Perfect Completeness from Random Oracles. In 6th Conference on Information-Theoretic Cryptography (ITC 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 343, pp. 12:1-12:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{mazor:LIPIcs.ITC.2025.12,
  author =	{Mazor, Noam},
  title =	{{Key-Agreement with Perfect Completeness from Random Oracles}},
  booktitle =	{6th Conference on Information-Theoretic Cryptography (ITC 2025)},
  pages =	{12:1--12:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-385-0},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{343},
  editor =	{Gilboa, Niv},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITC.2025.12},
  URN =		{urn:nbn:de:0030-drops-243628},
  doi =		{10.4230/LIPIcs.ITC.2025.12},
  annote =	{Keywords: Key-Agreement, Random Oracle, Merkle’s Puzzles, Perfect Completeness}
}
Backdoor Defense, Learnability and Obfuscation

Authors: Paul Christiano, Jacob Hilton, Victor Lecomte, and Mark Xu

Published in: LIPIcs, Volume 325, 16th Innovations in Theoretical Computer Science Conference (ITCS 2025)


Abstract
We introduce a formal notion of defendability against backdoors using a game between an attacker and a defender. In this game, the attacker modifies a function to behave differently on a particular input known as the "trigger", while behaving the same almost everywhere else. The defender then attempts to detect the trigger at evaluation time. If the defender succeeds with high enough probability, then the function class is said to be defendable. The key constraint on the attacker that makes defense possible is that the attacker’s strategy must work for a randomly chosen trigger. Our definition is simple and does not explicitly mention learning, yet we demonstrate that it is closely connected to learnability. In the computationally unbounded setting, we use a voting algorithm of [Hanneke et al., 2022] to show that defendability is essentially determined by the VC dimension of the function class, in much the same way as PAC learnability. In the computationally bounded setting, we use a similar argument to show that efficient PAC learnability implies efficient defendability, but not conversely. On the other hand, we use indistinguishability obfuscation to show that the class of polynomial-size circuits is not efficiently defendable. Finally, we present polynomial-size decision trees as a natural example for which defense is strictly easier than learning. Thus, we identify efficient defendability as a notable intermediate concept between efficient learnability and obfuscation.

Cite as

Paul Christiano, Jacob Hilton, Victor Lecomte, and Mark Xu. Backdoor Defense, Learnability and Obfuscation. In 16th Innovations in Theoretical Computer Science Conference (ITCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 325, pp. 38:1-38:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{christiano_et_al:LIPIcs.ITCS.2025.38,
  author =	{Christiano, Paul and Hilton, Jacob and Lecomte, Victor and Xu, Mark},
  title =	{{Backdoor Defense, Learnability and Obfuscation}},
  booktitle =	{16th Innovations in Theoretical Computer Science Conference (ITCS 2025)},
  pages =	{38:1--38:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-361-4},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{325},
  editor =	{Meka, Raghu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2025.38},
  URN =		{urn:nbn:de:0030-drops-226662},
  doi =		{10.4230/LIPIcs.ITCS.2025.38},
  annote =	{Keywords: backdoors, machine learning, PAC learning, indistinguishability obfuscation}
}
Simultaneous Haar Indistinguishability with Applications to Unclonable Cryptography

Authors: Prabhanjan Ananth, Fatih Kaleoglu, and Henry Yuen

Published in: LIPIcs, Volume 325, 16th Innovations in Theoretical Computer Science Conference (ITCS 2025)


Abstract
We study a novel question about nonlocal quantum state discrimination: how well can non-communicating (but entangled) players distinguish between different distributions over quantum states? We call this task simultaneous state indistinguishability. Our main technical result shows that the players cannot distinguish between each player receiving an independently chosen Haar-random state versus all players receiving the same Haar-random state. We show that this question has implications for unclonable cryptography, which leverages the no-cloning principle to build cryptographic primitives that are classically impossible to achieve. Understanding the feasibility of unclonable encryption, one of the key unclonable primitives, satisfying indistinguishability security in the plain model has been a major open question in the area. So far, the existing constructions of unclonable encryption are either in the quantum random oracle model or are based on new conjectures. We leverage our main result to present the first construction of unclonable encryption satisfying indistinguishability security, with quantum decryption keys, in the plain model. We also show implications for single-decryptor encryption and leakage-resilient secret sharing. These applications present evidence that simultaneous Haar indistinguishability could be useful in quantum cryptography.
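For background, a Haar-random pure state can be sampled by normalizing a vector of i.i.d. complex Gaussians, by unitary invariance of the Gaussian measure (a standard fact, not the paper's construction):

```python
import math
import random

# Minimal Haar-random pure-state sampler: normalizing a vector of
# i.i.d. complex Gaussians gives a state distributed uniformly (Haar)
# on the unit sphere in C^d, since the complex Gaussian measure is
# invariant under every unitary.

def haar_state(d: int, rng: random.Random) -> list[complex]:
    v = [complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(d)]
    norm = math.sqrt(sum(abs(z) ** 2 for z in v))
    return [z / norm for z in v]

rng = random.Random(0)
psi = haar_state(8, rng)  # a Haar-random state on 3 qubits
assert abs(sum(abs(z) ** 2 for z in psi) - 1.0) < 1e-12  # unit norm
```

The paper's question is then whether entangled but non-communicating players can tell "everyone got the same psi" apart from "everyone got an independent psi".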

Cite as

Prabhanjan Ananth, Fatih Kaleoglu, and Henry Yuen. Simultaneous Haar Indistinguishability with Applications to Unclonable Cryptography. In 16th Innovations in Theoretical Computer Science Conference (ITCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 325, pp. 7:1-7:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{ananth_et_al:LIPIcs.ITCS.2025.7,
  author =	{Ananth, Prabhanjan and Kaleoglu, Fatih and Yuen, Henry},
  title =	{{Simultaneous Haar Indistinguishability with Applications to Unclonable Cryptography}},
  booktitle =	{16th Innovations in Theoretical Computer Science Conference (ITCS 2025)},
  pages =	{7:1--7:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-361-4},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{325},
  editor =	{Meka, Raghu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2025.7},
  URN =		{urn:nbn:de:0030-drops-226352},
  doi =		{10.4230/LIPIcs.ITCS.2025.7},
  annote =	{Keywords: Quantum, Haar, unclonable encryption}
}
Toward the Impossibility of Perfect Complete Quantum PKE from OWFs

Authors: Longcheng Li, Qian Li, Xingjian Li, and Qipeng Liu

Published in: LIPIcs, Volume 325, 16th Innovations in Theoretical Computer Science Conference (ITCS 2025)


Abstract
In this paper, we study the impossibility of constructing perfectly complete quantum public-key encryption (QPKE) from quantumly secure one-way functions (OWFs) in a black-box manner. We show that this problem is connected to a fundamental conjecture about the roots of low-degree polynomials on the Boolean hypercube. Informally, the conjecture asserts that for every nonconstant low-degree polynomial, there exists a universal (randomized) way to modify a small number of input bits such that, for every input string, the polynomial evaluated on the modified input string avoids 0 with sufficiently large probability (over the choice of how the input string is modified). Assuming this conjecture, we demonstrate the impossibility of constructing QPKE from quantumly secure one-way functions in a black-box manner, by employing the information-theoretic approach recently developed by Li, Li, Li, and Liu (CRYPTO'24). Toward resolving this conjecture, we provide various pieces of supporting evidence and prove some special cases. In particular, we fully rule out perfect QPKE from OWFs when the key-generation algorithm makes only a logarithmic number of quantum queries, improving on previous work, which could only handle classical queries.
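For degree-1 (affine) polynomials the conjecture can be verified directly: fixing a coordinate i with a nonzero coefficient and flipping that bit with probability 1/2 changes the value by a_i, so at least one of the two outcomes is nonzero for every input. A brute-force check of this special case (the toy polynomial is ours; this is an illustration, not the paper's proof):

```python
import itertools

# Degree-1 case of the conjecture: for nonconstant affine
# p(x) = sum_i a_i x_i + b on {0,1}^n, the universal randomized
# modification "flip bit i with probability 1/2" (for any i with
# a_i != 0) avoids 0 with probability >= 1/2 on EVERY input, since
# p(x) and p(x with bit i flipped) differ by a_i and so cannot both
# vanish.

def avoid_zero_prob(a, b, x, i):
    # Probability over the coin flip that p(modified x) != 0.
    vals = []
    for flip in (False, True):
        y = list(x)
        if flip:
            y[i] ^= 1
        vals.append(sum(ai * yi for ai, yi in zip(a, y)) + b)
    return sum(v != 0 for v in vals) / 2

a, b = [2, -1, 0], 1   # toy polynomial p(x) = 2*x0 - x1 + 1, nonconstant
i = 0                  # a coordinate with a_i != 0
for x in itertools.product((0, 1), repeat=3):
    assert avoid_zero_prob(a, b, list(x), i) >= 0.5
```

The difficulty addressed by the paper is extending such a universal strategy to higher degrees.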

Cite as

Longcheng Li, Qian Li, Xingjian Li, and Qipeng Liu. Toward the Impossibility of Perfect Complete Quantum PKE from OWFs. In 16th Innovations in Theoretical Computer Science Conference (ITCS 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 325, pp. 71:1-71:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{li_et_al:LIPIcs.ITCS.2025.71,
  author =	{Li, Longcheng and Li, Qian and Li, Xingjian and Liu, Qipeng},
  title =	{{Toward the Impossibility of Perfect Complete Quantum PKE from OWFs}},
  booktitle =	{16th Innovations in Theoretical Computer Science Conference (ITCS 2025)},
  pages =	{71:1--71:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-361-4},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{325},
  editor =	{Meka, Raghu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2025.71},
  URN =		{urn:nbn:de:0030-drops-226999},
  doi =		{10.4230/LIPIcs.ITCS.2025.71},
  annote =	{Keywords: Quantum public-key encryption, Boolean function analysis}
}
Online Mergers and Applications to Registration-Based Encryption and Accumulators

Authors: Mohammad Mahmoody and Wei Qi

Published in: LIPIcs, Volume 267, 4th Conference on Information-Theoretic Cryptography (ITC 2023)


Abstract
In this work we study a new information-theoretic problem, called online merging, which has direct applications to constructing public-state accumulators and registration-based encryption schemes. An online merger receives the sequence of sets {1}, {2}, … in an online way, and right after receiving {i} it can re-partition the elements 1,…,i into T₁,…,T_{m_i} by merging some of these sets. The goal of the merger is to balance the trade-off between the maximum number of sets wid = max_{i ∈ [n]} m_i that co-exist at any moment, called the width of the scheme, and its depth dep = max_{i ∈ [n]} d_i, where d_i is the number of times that the sets containing i get merged. An online merger can be used to maintain a collection of Merkle trees that occasionally get merged, and hence directly yields public-state accumulators (using collision-resistant hashing) and registration-based encryption (relying on further assumptions). In these applications, the width of the online merger translates into the size of the public parameter of the constructed scheme, and its depth corresponds to the number of times that parties need to update their "witness" (for accumulators) or their decryption key (for RBE). In this work, we construct online mergers with poly(log n) width and O(log n / log log n) depth, which can be shown to be optimal for all schemes with poly(log n) width. More generally, we show how to achieve optimal depth for a given fixed width, and how to achieve a 2-approximately optimal width for a given depth d that can possibly grow as a function of n (e.g., d = 2 or d = log n / log log n).
As applications, we obtain accumulators with O(log n / log log n) witness updates per party (which can be shown to be optimal for accumulator digests of length poly(log n)), as well as registration-based encryption that again has an optimal O(log n / log log n) number of decryption updates, resolving the open question of Mahmoody, Rahimi, and Qi [TCC'22], who proved that Ω(log n / log log n) decryption updates are necessary for any RBE with public parameter of length poly(log n). More generally, for any given number of decryption updates d = d(n), our online merger implies (under believable computational assumptions) RBE schemes with public parameters of optimal length, up to a constant factor that depends on the security parameter. For example, for any constant number of updates d, we get RBE schemes with public parameters of length O(n^{1/(d+1)}).
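For intuition, a simple binomial-counter merger (an illustration of the model, not the paper's optimal construction) already achieves width and depth O(log n): keep the partition as sets whose sizes are distinct powers of two, and on each arrival merge equal-size sets as in binary addition.

```python
# Binomial-counter online merger: the partition always consists of sets
# whose sizes are distinct powers of two, so the width is at most
# log2(n) + 1, and each element's set doubles on every merge, so its
# depth is at most log2(n). The paper improves the depth to
# O(log n / log log n) while keeping poly(log n) width.

def online_merge(n: int):
    sets = []          # current partition; set sizes are distinct powers of 2
    depth = [0] * n    # number of merges each element's set has undergone
    max_width = 0
    for i in range(n):
        cur = {i}      # the new singleton {i} arrives
        while True:
            same = next((s for s in sets if len(s) == len(cur)), None)
            if same is None:
                break
            sets.remove(same)      # merge equal-size sets, like a carry
            cur = cur | same
            for e in cur:
                depth[e] += 1
        sets.append(cur)
        max_width = max(max_width, len(sets))
    return max_width, max(depth)

width, dep = online_merge(1024)  # both are log2-scale for n = 2^10
```

In the accumulator application, each merge corresponds to combining two Merkle trees, so `dep` counts witness updates and `width` counts digests in the public state.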

Cite as

Mohammad Mahmoody and Wei Qi. Online Mergers and Applications to Registration-Based Encryption and Accumulators. In 4th Conference on Information-Theoretic Cryptography (ITC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 267, pp. 15:1-15:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{mahmoody_et_al:LIPIcs.ITC.2023.15,
  author =	{Mahmoody, Mohammad and Qi, Wei},
  title =	{{Online Mergers and Applications to Registration-Based Encryption and Accumulators}},
  booktitle =	{4th Conference on Information-Theoretic Cryptography (ITC 2023)},
  pages =	{15:1--15:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-271-6},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{267},
  editor =	{Chung, Kai-Min},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITC.2023.15},
  URN =		{urn:nbn:de:0030-drops-183432},
  doi =		{10.4230/LIPIcs.ITC.2023.15},
  annote =	{Keywords: Registration-based encryption, Accumulators, Merkle Trees}
}
Black-Box Uselessness: Composing Separations in Cryptography

Authors: Geoffroy Couteau, Pooya Farshim, and Mohammad Mahmoody

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
Black-box separations have been successfully used to identify the limits of a powerful set of tools in cryptography, namely those of black-box reductions. They allow proving that a large set of techniques is not capable of basing one primitive 𝒫 on another 𝒬. Such separations, however, say nothing about the power of the combination of primitives 𝒬₁,𝒬₂ for constructing 𝒫, even if 𝒫 cannot be based on 𝒬₁ or 𝒬₂ alone. By introducing and formalizing the notion of black-box uselessness, we develop a framework that allows us to draw such conclusions. At an informal level, we call a primitive 𝒬 black-box useless (BBU) for 𝒫 if 𝒬 cannot help construct 𝒫 in a black-box way, even in the presence of another primitive 𝒵. This is formalized by saying that 𝒬 is BBU for 𝒫 if, for any auxiliary primitive 𝒵, whenever there exists a black-box construction of 𝒫 from (𝒬,𝒵), there must already exist a black-box construction of 𝒫 from 𝒵 alone. We also formalize various other notions of black-box uselessness, and consider in particular the setting of efficient black-box constructions in which the number of queries to 𝒬 is below a threshold. Impagliazzo and Rudich (STOC'89) initiated the study of black-box separations by separating key agreement from one-way functions. We prove a number of initial results in this direction, indicating that one-way functions are perhaps also black-box useless for key agreement. In particular, we show that OWFs are black-box useless in any construction of key agreement in either of the following settings: (1) the key agreement has perfect correctness and one of the parties calls the OWF a constant number of times; (2) the key agreement consists of a single round of interaction (as in Merkle-type protocols). We conjecture that OWFs are indeed black-box useless for general key agreement. We also show that certain techniques for proving black-box separations can be lifted to the uselessness regime.
In particular, we show that the lower bounds of Canetti, Kalai, and Paneth (TCC'15) as well as Garg, Mahmoody, and Mohammed (Crypto'17 & TCC'17) for assumptions behind indistinguishability obfuscation (IO) can be extended to derive black-box uselessness of a variety of primitives for obtaining (approximately correct) IO. These results follow the so-called "compiling out" technique, which we prove to imply black-box uselessness. Finally, we study the complementary landscape of black-box uselessness, namely black-box helpfulness. We put forth the conjecture that one-way functions are black-box helpful for building collision-resistant hash functions. We define two natural relaxations of this conjecture, and prove that both are implied by a natural conjecture regarding random permutations equipped with a collision-finder oracle, as defined by Simon (Eurocrypt'98). This conjecture may also be of interest in other contexts, such as amplification of hardness.

Cite as

Geoffroy Couteau, Pooya Farshim, and Mohammad Mahmoody. Black-Box Uselessness: Composing Separations in Cryptography. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 47:1-47:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{couteau_et_al:LIPIcs.ITCS.2021.47,
  author =	{Couteau, Geoffroy and Farshim, Pooya and Mahmoody, Mohammad},
  title =	{{Black-Box Uselessness: Composing Separations in Cryptography}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{47:1--47:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.47},
  URN =		{urn:nbn:de:0030-drops-135869},
  doi =		{10.4230/LIPIcs.ITCS.2021.47},
  annote =	{Keywords: Black-Box Reductions, Separations, One-Way Functions, Key Agreement}
}
Track A: Algorithms, Complexity and Games
Can Verifiable Delay Functions Be Based on Random Oracles?

Authors: Mohammad Mahmoody, Caleb Smith, and David J. Wu

Published in: LIPIcs, Volume 168, 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)


Abstract
Boneh, Bonneau, Bünz, and Fisch (CRYPTO 2018) recently introduced the notion of a verifiable delay function (VDF). VDFs are functions that take a long sequential time T to compute, but whose outputs y := Eval(x) can be efficiently verified (possibly given a proof π) in time t ≪ T (e.g., t = poly(λ, log T) where λ is the security parameter). The first security requirement on a VDF, called uniqueness, is that no polynomial-time algorithm can find a convincing proof π' that verifies for an input x and a different output y' ≠ y. The second security requirement, called sequentiality, is that no polynomial-time algorithm running in time σ < T for some parameter σ (e.g., σ = T^{1/10}) can compute y, even with poly(T,λ) many parallel processors. Starting from the work of Boneh et al., there are now multiple constructions of VDFs from various algebraic assumptions. In this work, we study whether VDFs can be constructed from ideal hash functions in a black-box way, as modeled in the random oracle model (ROM). In the ROM, we measure the running time by the number of oracle queries and the sequentiality by the number of rounds of oracle queries. We rule out two classes of constructions of VDFs in the ROM:
- We show that VDFs satisfying perfect uniqueness (i.e., VDFs where no different convincing solution y' ≠ y exists) cannot be constructed in the ROM. More formally, we give an attacker that finds the solution y in ≈ t rounds of queries, asking only poly(T) queries in total.
- We also rule out tight verifiable delay functions in the ROM. Tight verifiable delay functions, recently studied by Döttling, Garg, Malavolta, and Vasudevan (ePrint Report 2019), require sequentiality for σ ≈ T - T^ρ for some constant 0 < ρ < 1. More generally, our lower bound also applies to proofs of sequential work (i.e., VDFs without the uniqueness property), even in the private verification setting, and sequentiality σ > T - T/(2t) for a concrete verification time t.
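For contrast, the obvious ROM candidate, an iterated hash chain, makes the difficulty concrete: evaluation is inherently sequential, but the only evident way to verify is to recompute the whole chain, so verification time t is not ≪ T. An illustrative sketch (SHA-256 as the oracle; not a VDF construction):

```python
import hashlib

# Hash-chain "delay function": y = H^T(x) takes T sequential oracle
# calls to compute. Verification below simply recomputes the chain,
# so it also costs T calls -- this candidate is sequential but not
# efficiently verifiable, which is the property the ruled-out ROM
# constructions would need.

def eval_chain(x: bytes, T: int) -> bytes:
    y = x
    for _ in range(T):  # T inherently sequential steps
        y = hashlib.sha256(y).digest()
    return y

def verify_chain(x: bytes, y: bytes, T: int) -> bool:
    return eval_chain(x, T) == y  # verification also costs T calls

y = eval_chain(b"seed", 1000)
assert verify_chain(b"seed", y, 1000)
assert not verify_chain(b"seed", y, 999)
```

The lower bounds above show that, in the ROM, no amount of cleverness turns such a candidate into a perfectly unique or tight VDF.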

Cite as

Mohammad Mahmoody, Caleb Smith, and David J. Wu. Can Verifiable Delay Functions Be Based on Random Oracles?. In 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 168, pp. 83:1-83:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{mahmoody_et_al:LIPIcs.ICALP.2020.83,
  author =	{Mahmoody, Mohammad and Smith, Caleb and Wu, David J.},
  title =	{{Can Verifiable Delay Functions Be Based on Random Oracles?}},
  booktitle =	{47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)},
  pages =	{83:1--83:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-138-2},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{168},
  editor =	{Czumaj, Artur and Dawar, Anuj and Merelli, Emanuela},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2020.83},
  URN =		{urn:nbn:de:0030-drops-124907},
  doi =		{10.4230/LIPIcs.ICALP.2020.83},
  annote =	{Keywords: verifiable delay function, lower bound, random oracle model}
}
Separating Two-Round Secure Computation From Oblivious Transfer

Authors: Benny Applebaum, Zvika Brakerski, Sanjam Garg, Yuval Ishai, and Akshayaram Srinivasan

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
We consider the question of minimizing the round complexity of protocols for secure multiparty computation (MPC) with security against an arbitrary number of semi-honest parties. Very recently, Garg and Srinivasan (Eurocrypt 2018) and Benhamouda and Lin (Eurocrypt 2018) constructed such 2-round MPC protocols from minimal assumptions. This was done by showing a round-preserving reduction to the task of secure 2-party computation of the oblivious transfer functionality (OT). These constructions made a novel non-black-box use of the underlying OT protocol. The question remained whether this can be done by making only black-box use of 2-round OT. This is of theoretical and potentially also practical value, as black-box use of primitives tends to lead to more efficient constructions. Our main result proves that such a black-box construction is impossible, namely that non-black-box use of OT is necessary. As a corollary, a similar separation holds when starting with any 2-party functionality other than OT. As a secondary contribution, we prove several additional results that further clarify the landscape of black-box MPC with minimal interaction. In particular, we complement the separation from 2-party functionalities by presenting a complete 4-party functionality, give evidence for the difficulty of ruling out a complete 3-party functionality and for the difficulty of ruling out black-box constructions of 3-round MPC from 2-round OT, and separate a relaxed "non-compact" variant of 2-party homomorphic secret sharing from 2-round OT.

Cite as

Benny Applebaum, Zvika Brakerski, Sanjam Garg, Yuval Ishai, and Akshayaram Srinivasan. Separating Two-Round Secure Computation From Oblivious Transfer. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 71:1-71:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{applebaum_et_al:LIPIcs.ITCS.2020.71,
  author =	{Applebaum, Benny and Brakerski, Zvika and Garg, Sanjam and Ishai, Yuval and Srinivasan, Akshayaram},
  title =	{{Separating Two-Round Secure Computation From Oblivious Transfer}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{71:1--71:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.71},
  URN =		{urn:nbn:de:0030-drops-117560},
  doi =		{10.4230/LIPIcs.ITCS.2020.71},
  annote =	{Keywords: Oracle Separation, Oblivious Transfer, Secure Multiparty Computation}
}
