6 Search Results for "Peikert, Chris"


Document
Hardness of the (Approximate) Shortest Vector Problem: A Simple Proof via Reed-Solomon Codes

Authors: Huck Bennett and Chris Peikert

Published in: LIPIcs, Volume 275, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)


Abstract
We give a simple proof that the (approximate, decisional) Shortest Vector Problem is NP-hard under a randomized reduction. Specifically, we show that for any p ≥ 1 and any constant γ < 2^{1/p}, the γ-approximate problem in the 𝓁_p norm (γ-GapSVP_p) is not in RP unless NP ⊆ RP. Our proof follows an approach pioneered by Ajtai (STOC 1998), and strengthened by Micciancio (FOCS 1998 and SICOMP 2000), for showing hardness of γ-GapSVP_p using locally dense lattices. We construct such lattices simply by applying "Construction A" to Reed-Solomon codes with suitable parameters, and prove their local density via an elementary argument originally used in the context of Craig lattices.

As in all known NP-hardness results for GapSVP_p with p < ∞, our reduction uses randomness. Indeed, it is a notorious open problem to prove NP-hardness via a deterministic reduction. To this end, we additionally discuss potential directions and associated challenges for derandomizing our reduction. In particular, we show that a close deterministic analogue of our local density construction would improve on the state-of-the-art explicit Reed-Solomon list-decoding lower bounds of Guruswami and Rudra (STOC 2005 and IEEE Transactions on Information Theory 2006).

As a related contribution of independent interest, we also give a polynomial-time algorithm for decoding n-dimensional "Construction A Reed-Solomon lattices" (with different parameters than those used in our hardness proof) to a distance within an O(√log n) factor of Minkowski's bound. This asymptotically matches the best known distance for decoding near Minkowski's bound, due to Mook and Peikert (IEEE Transactions on Information Theory 2022), whose work we build on with a somewhat simpler construction and analysis.
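
Example

To make the "Construction A" step concrete, here is a toy Python sketch of the lattice L = {x ∈ ℤ^n : x mod q ∈ C} obtained from an [n, k] Reed-Solomon code C. The parameters q, n, k and the evaluation points are purely illustrative and far smaller than the paper's regime; this is a sketch of the general construction, not the paper's specific instantiation.

import itertools
import numpy as np

q, n, k = 7, 5, 2                      # toy prime modulus, code length, dimension
alphas = list(range(n))                # n distinct evaluation points in F_q

# Generator matrix of the [n, k] Reed-Solomon code: row i evaluates x^i,
# so codewords are evaluations of polynomials of degree < k.
G = np.array([[pow(a, i, q) for a in alphas] for i in range(k)])

# The rows of B generate the Construction A lattice over Z: the lifted
# code rows together with q*e_j for each coordinate. (A square basis can
# be extracted from these generators, e.g., via the Hermite normal form.)
B = np.vstack([G, q * np.eye(n, dtype=int)])

def is_codeword(v):
    # Brute force over all q^k messages -- fine at toy size.
    return any(((np.array(m) @ G - v) % q == 0).all()
               for m in itertools.product(range(q), repeat=k))

x = np.array([1, -2, 0, 3, 1, 0, 0]) @ B   # an arbitrary lattice vector
print(is_codeword(x % q))                  # True: x reduces to a codeword mod q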

Cite as

Huck Bennett and Chris Peikert. Hardness of the (Approximate) Shortest Vector Problem: A Simple Proof via Reed-Solomon Codes. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 275, pp. 37:1-37:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bennett_et_al:LIPIcs.APPROX/RANDOM.2023.37,
  author =	{Bennett, Huck and Peikert, Chris},
  title =	{{Hardness of the (Approximate) Shortest Vector Problem: A Simple Proof via Reed-Solomon Codes}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)},
  pages =	{37:1--37:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-296-9},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{275},
  editor =	{Megow, Nicole and Smith, Adam},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2023.37},
  URN =		{urn:nbn:de:0030-drops-188622},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2023.37},
  annote =	{Keywords: Lattices, Shortest Vector Problem, Reed-Solomon codes, NP-hardness, derandomization}
}
Document
Improved Hardness of BDD and SVP Under Gap-(S)ETH

Authors: Huck Bennett, Chris Peikert, and Yi Tang

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
We show improved fine-grained hardness of two key lattice problems in the 𝓁_p norm: Bounded Distance Decoding to within an α factor of the minimum distance (BDD_{p, α}) and the (decisional) γ-approximate Shortest Vector Problem (GapSVP_{p,γ}), assuming variants of the Gap (Strong) Exponential Time Hypothesis (Gap-(S)ETH). Specifically, we show:

1) For all p ∈ [1, ∞), there is no 2^{o(n)}-time algorithm for BDD_{p, α} for any constant α > α_kn, where α_kn = 2^{-c_kn} < 0.98491 and c_kn is the 𝓁₂ kissing-number constant, unless non-uniform Gap-ETH is false.

2) For all p ∈ [1, ∞), there is no 2^{o(n)}-time algorithm for BDD_{p, α} for any constant α > α^‡_p, where α^‡_p is explicit and satisfies α^‡_p = 1 for 1 ≤ p ≤ 2, α^‡_p < 1 for all p > 2, and α^‡_p → 1/2 as p → ∞, unless randomized Gap-ETH is false.

3) For all p ∈ [1, ∞) ⧵ 2ℤ and all C > 1, there is no 2^{n/C}-time algorithm for BDD_{p, α} for any constant α > α^†_{p, C}, where α^†_{p, C} is explicit and satisfies α^†_{p, C} → 1 as C → ∞ for any fixed p ∈ [1, ∞), unless non-uniform Gap-SETH is false.

4) For all p > p₀ ≈ 2.1397, p ∉ 2ℤ, and all C > C_p, there is no 2^{n/C}-time algorithm for GapSVP_{p, γ} for some constant γ > 1, where C_p > 1 is explicit and satisfies C_p → 1 as p → ∞, unless randomized Gap-SETH is false.

Our results for BDD_{p, α} improve and extend work by Aggarwal and Stephens-Davidowitz (STOC, 2018) and Bennett and Peikert (CCC, 2020). Specifically, the quantities α_kn and α^‡_p (respectively, α^†_{p,C}) significantly improve upon the corresponding quantity α_p^* (respectively, α_{p,C}^*) of Bennett and Peikert for small p (but arise from somewhat stronger assumptions). In particular, Item 1 improves the smallest value of α for which BDD_{p, α} is known to be exponentially hard in the Euclidean norm (p = 2) to an explicit constant α < 1 for the first time under a general-purpose complexity assumption. Items 1 and 3 crucially use the recent breakthrough result of Vlăduţ (Moscow Journal of Combinatorics and Number Theory, 2019), which showed an explicit exponential lower bound on the lattice kissing number. Finally, Item 4 answers a natural question left open by Aggarwal, Bennett, Golovnev, and Stephens-Davidowitz (SODA, 2021), which showed an analogous result for the Closest Vector Problem.
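
Example

The decisional problem GapSVP_{p,γ} in these statements asks: given a basis B and a threshold d, decide whether λ₁(L(B)) ≤ d or λ₁(L(B)) > γ·d in the 𝓁_p norm. The toy Python sketch below decides tiny instances by exhaustive search; the instance and the coefficient window are illustrative, and real hard instances are exponentially out of reach of this approach.

import itertools
import numpy as np

def shortest_nonzero_p_norm(B, p, bound=3):
    # Enumerate small integer combinations of the basis rows; this
    # window suffices for the toy basis below, but not in general.
    best = float("inf")
    for c in itertools.product(range(-bound, bound + 1), repeat=B.shape[0]):
        if any(c):
            best = min(best, np.linalg.norm(np.array(c) @ B, ord=p))
    return best

def gap_svp(B, d, gamma, p):
    lam1 = shortest_nonzero_p_norm(B, p)
    if lam1 <= d:
        return "YES"                  # lambda_1 <= d
    if lam1 > gamma * d:
        return "NO"                   # lambda_1 > gamma * d
    return "promise violated"         # instance falls inside the gap

B = np.array([[3, 1], [1, 2]])
print(gap_svp(B, d=2.3, gamma=1.5, p=2))   # YES: lambda_1 = sqrt(5) <= 2.3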

Cite as

Huck Bennett, Chris Peikert, and Yi Tang. Improved Hardness of BDD and SVP Under Gap-(S)ETH. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 19:1-19:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{bennett_et_al:LIPIcs.ITCS.2022.19,
  author =	{Bennett, Huck and Peikert, Chris and Tang, Yi},
  title =	{{Improved Hardness of BDD and SVP Under Gap-(S)ETH}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{19:1--19:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.19},
  URN =		{urn:nbn:de:0030-drops-156151},
  doi =		{10.4230/LIPIcs.ITCS.2022.19},
  annote =	{Keywords: lattices, lattice-based cryptography, fine-grained complexity, Bounded Distance Decoding, Shortest Vector Problem}
}
Document
Correlation-Intractable Hash Functions via Shift-Hiding

Authors: Alex Lombardi and Vinod Vaikuntanathan

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
A hash function family ℋ is correlation intractable for a t-input relation ℛ if, given a random function h chosen from ℋ, it is hard to find x_1, …, x_t such that ℛ(x_1, …, x_t, h(x_1), …, h(x_t)) is true. Among other applications, such hash functions are a crucial tool for instantiating the Fiat-Shamir heuristic in the plain model, including the only known NIZK for NP based on the learning with errors (LWE) problem (Peikert and Shiehian, CRYPTO 2019).

We give a conceptually simple and generic construction of single-input CI hash functions from shift-hiding shiftable functions (Peikert and Shiehian, PKC 2018) satisfying an additional one-wayness property. This results in a clean abstract framework for instantiating CI, and also shows that a previously existing function family (PKC 2018) was already CI under the LWE assumption. In addition, our framework transparently generalizes to other settings, yielding new results:

- We show how to instantiate certain forms of multi-input CI under the LWE assumption. Prior constructions either relied on a very strong "brute-force-is-best" type of hardness assumption (Holmgren and Lombardi, FOCS 2018) or were restricted to "output-only" relations (Zhandry, CRYPTO 2016).

- We construct single-input CI hash functions from indistinguishability obfuscation (iO) and one-way permutations. Prior constructions relied essentially on variants of fully homomorphic encryption that are impossible to construct from such primitives. This result also generalizes to more expressive variants of multi-input CI under iO and additional standard assumptions.
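
Example

The following toy Python sketch spells out the single-input (t = 1) correlation-intractability game from the first sentence of the abstract. The hash family here is a table of truly random functions, an information-theoretic stand-in chosen for illustration, not the paper's LWE-based construction, and the relation R is an arbitrary example of a sparse relation.

import secrets

IN_BITS, OUT_BITS = 8, 8

def sample_h():
    # h: {0,1}^8 -> {0,1}^8, drawn at random from the family.
    table = [secrets.randbelow(1 << OUT_BITS) for _ in range(1 << IN_BITS)]
    return lambda x: table[x]

def R(x, y):
    # Example sparse relation: R(x, y) holds iff y = x + 1 mod 2^8.
    # CI for R means: given (a description of) h, it is hard to find
    # any x with R(x, h(x)).
    return y == (x + 1) % (1 << OUT_BITS)

h = sample_h()
wins = [x for x in range(1 << IN_BITS) if R(x, h(x))]
# For a truly random h, each x satisfies R(x, h(x)) with probability
# 2^{-8}, so a "break" happens only by luck; a CI family demands the
# same even though the adversary sees a description of h.
print(f"inputs x with R(x, h(x)): {wins}")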

Cite as

Alex Lombardi and Vinod Vaikuntanathan. Correlation-Intractable Hash Functions via Shift-Hiding. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 102:1-102:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{lombardi_et_al:LIPIcs.ITCS.2022.102,
  author =	{Lombardi, Alex and Vaikuntanathan, Vinod},
  title =	{{Correlation-Intractable Hash Functions via Shift-Hiding}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{102:1--102:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.102},
  URN =		{urn:nbn:de:0030-drops-156981},
  doi =		{10.4230/LIPIcs.ITCS.2022.102},
  annote =	{Keywords: Cryptographic hash functions, correlation intractability}
}
Document
Hardness of Bounded Distance Decoding on Lattices in 𝓁_p Norms

Authors: Huck Bennett and Chris Peikert

Published in: LIPIcs, Volume 169, 35th Computational Complexity Conference (CCC 2020)


Abstract
Bounded Distance Decoding BDD_{p,α} is the problem of decoding a lattice when the target point is promised to be within an α factor of the minimum distance of the lattice, in the 𝓁_p norm. We prove that BDD_{p, α} is NP-hard under randomized reductions where α → 1/2 as p → ∞ (and for α = 1/2 when p = ∞), thereby showing the hardness of decoding for distances approaching the unique-decoding radius for large p.

We also show fine-grained hardness for BDD_{p,α}. For example, we prove that for all p ∈ [1,∞) ⧵ 2ℤ and constants C > 1, ε > 0, there is no 2^((1-ε)n/C)-time algorithm for BDD_{p,α} for some constant α (which approaches 1/2 as p → ∞), assuming the randomized Strong Exponential Time Hypothesis (SETH). Moreover, essentially all of our results also hold (under analogous non-uniform assumptions) for BDD with preprocessing, in which unbounded precomputation can be applied to the lattice before the target is available.

Compared to prior work on the hardness of BDD_{p,α} by Liu, Lyubashevsky, and Micciancio (APPROX-RANDOM 2008), our results improve the values of α for which the problem is known to be NP-hard for all p > p₁ ≈ 4.2773, and give the very first fine-grained hardness for BDD (in any norm). Our reductions rely on a special family of "locally dense" lattices in 𝓁_p norms, which we construct by modifying the integer-lattice sparsification technique of Aggarwal and Stephens-Davidowitz (STOC 2018).
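
Example

To fix ideas, the toy Python sketch below sets up a tiny, illustrative BDD_{p,α} instance and solves it by exhaustive search: the target t is promised to lie within α·λ₁(L) of the lattice in the 𝓁_p norm, and for α < 1/2 the closest lattice vector is unique. All parameters are illustrative, and exhaustive search is of course exponential in general.

import itertools
import numpy as np

def closest_vector(B, t, p, bound=4):
    # Exhaustive search over small coefficients (exponential in general).
    cands = (np.array(c) @ B
             for c in itertools.product(range(-bound, bound + 1),
                                        repeat=B.shape[0]))
    return min(cands, key=lambda v: np.linalg.norm(v - t, ord=p))

B = np.array([[4, 1], [1, 5]])        # basis rows; lambda_1 = sqrt(17)
p, alpha = 2, 0.4
v = np.array([1, 1]) @ B              # a lattice point, here (5, 6)
t = v + np.array([0.9, -0.5])         # offset norm ~1.03 < alpha * lambda_1
print(closest_vector(B, t, p))        # recovers [5 6]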

Cite as

Huck Bennett and Chris Peikert. Hardness of Bounded Distance Decoding on Lattices in 𝓁_p Norms. In 35th Computational Complexity Conference (CCC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 169, pp. 36:1-36:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{bennett_et_al:LIPIcs.CCC.2020.36,
  author =	{Bennett, Huck and Peikert, Chris},
  title =	{{Hardness of Bounded Distance Decoding on Lattices in 𝓁\_p Norms}},
  booktitle =	{35th Computational Complexity Conference (CCC 2020)},
  pages =	{36:1--36:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-156-6},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{169},
  editor =	{Saraf, Shubhangi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2020.36},
  URN =		{urn:nbn:de:0030-drops-125881},
  doi =		{10.4230/LIPIcs.CCC.2020.36},
  annote =	{Keywords: Lattices, Bounded Distance Decoding, NP-hardness, Fine-Grained Complexity}
}
Document
Public-Key Cryptosystems from the Worst-Case Shortest Vector Problem

Authors: Chris Peikert

Published in: Dagstuhl Seminar Proceedings, Volume 8491, Theoretical Foundations of Practical Information Security (2009)


Abstract
We construct public-key cryptosystems that are secure assuming the *worst-case* hardness of approximating the shortest vector problem on lattices. Prior cryptosystems with worst-case connections (e.g., the Ajtai-Dwork system) were based either on a *special case* of the shortest vector problem, or on the conjectured hardness of lattice problems for *quantum* algorithms. Our main technical innovation is a reduction from certain variants of the shortest vector problem to corresponding versions of the "learning with errors" (LWE) problem; previously, only a quantum reduction of this kind was known. In addition, we construct new cryptosystems based on LWE, including a very natural chosen ciphertext-secure system that has a much simpler description and tighter underlying worst-case approximation factor than prior constructions.
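
Example

For orientation, here is a minimal toy sketch in Python of Regev-style public-key encryption from LWE, with insecure illustrative parameters. It shows only the basic LWE structure that the abstract refers to; it is not the paper's classical reduction or its chosen ciphertext-secure system.

import numpy as np

rng = np.random.default_rng(0)
n, m, q = 16, 64, 257                 # toy dimensions and modulus

def keygen():
    s = rng.integers(0, q, n)         # secret vector
    A = rng.integers(0, q, (m, n))    # public random matrix
    e = rng.integers(-1, 2, m)        # small noise
    b = (A @ s + e) % q               # m LWE samples
    return (A, b), s

def encrypt(pk, bit):
    A, b = pk
    r = rng.integers(0, 2, m)         # random 0/1 combination of samples
    u = (r @ A) % q
    c = (r @ b + bit * (q // 2)) % q
    return u, c

def decrypt(sk, ct):
    u, c = ct
    d = (c - u @ sk) % q              # = bit*(q//2) + accumulated noise
    return int(min(d, q - d) > q // 4)

pk, sk = keygen()
for bit in (0, 1):
    assert decrypt(sk, encrypt(pk, bit)) == bit
print("toy LWE round-trip OK")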

Cite as

Chris Peikert. Public-Key Cryptosystems from the Worst-Case Shortest Vector Problem. In Theoretical Foundations of Practical Information Security. Dagstuhl Seminar Proceedings, Volume 8491, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)


BibTeX

@InProceedings{peikert:DagSemProc.08491.4,
  author =	{Peikert, Chris},
  title =	{{Public-Key Cryptosystems from the Worst-Case Shortest Vector Problem}},
  booktitle =	{Theoretical Foundations of Practical Information Security},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2009},
  volume =	{8491},
  editor =	{Ran Canetti and Shafi Goldwasser and G\"{u}nter M\"{u}ller and Rainer Steinwandt},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08491.4},
  URN =		{urn:nbn:de:0030-drops-18922},
  doi =		{10.4230/DagSemProc.08491.4},
  annote =	{Keywords: Lattice-based cryptography, learning with errors, quantum computation}
}
Document
Generating Shorter Bases for Hard Random Lattices

Authors: Joel Alwen and Chris Peikert

Published in: LIPIcs, Volume 3, 26th International Symposium on Theoretical Aspects of Computer Science (2009)


Abstract
We revisit the problem of generating a "hard" random lattice together with a basis of relatively short vectors. This problem has gained in importance lately due to new cryptographic schemes that use such a procedure for generating public/secret key pairs. In these applications, a shorter basis directly corresponds to milder underlying complexity assumptions and smaller key sizes. The contributions of this work are twofold. First, using the *Hermite normal form* as an organizing principle, we simplify and generalize an approach due to Ajtai (ICALP 1999). Second, we improve the construction and its analysis in several ways, most notably by tightening the length of the output basis essentially to the optimum value.
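
Example

As a small illustration of the Hermite normal form as an organizing principle, the Python sketch below shows that two different bases of the same integer lattice share one canonical HNF. It assumes a recent SymPy, whose hermite_normal_form operates on the column lattice of the matrix; the matrices are arbitrary toy choices.

from sympy import Matrix
from sympy.matrices.normalforms import hermite_normal_form

B = Matrix([[4, 1], [1, 5]])          # columns generate the lattice
U = Matrix([[2, 3], [1, 2]])          # unimodular: det(U) = 1
B2 = B * U                            # another basis of the same lattice

# Different bases, one canonical form:
print(hermite_normal_form(B))
print(hermite_normal_form(B2))
assert hermite_normal_form(B) == hermite_normal_form(B2)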

Cite as

Joel Alwen and Chris Peikert. Generating Shorter Bases for Hard Random Lattices. In 26th International Symposium on Theoretical Aspects of Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 3, pp. 75-86, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)


BibTeX

@InProceedings{alwen_et_al:LIPIcs.STACS.2009.1832,
  author =	{Alwen, Joel and Peikert, Chris},
  title =	{{Generating Shorter Bases for Hard Random Lattices}},
  booktitle =	{26th International Symposium on Theoretical Aspects of Computer Science},
  pages =	{75--86},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-09-5},
  ISSN =	{1868-8969},
  year =	{2009},
  volume =	{3},
  editor =	{Albers, Susanne and Marion, Jean-Yves},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2009.1832},
  URN =		{urn:nbn:de:0030-drops-18327},
  doi =		{10.4230/LIPIcs.STACS.2009.1832},
  annote =	{Keywords: Lattices, Random, Short basis, Average-case hardness, Hermite normal form, Cryptography}
}
  • Refine by Author
  • 5 Peikert, Chris
  • 3 Bennett, Huck
  • 1 Alwen, Joel
  • 1 Lombardi, Alex
  • 1 Tang, Yi

  • Refine by Classification
  • 2 Theory of computation → Problems, reductions and completeness
  • 1 Theory of computation → Computational complexity and cryptography
  • 1 Theory of computation → Computational geometry
  • 1 Theory of computation → Cryptographic primitives
  • 1 Theory of computation → Error-correcting codes

  • Refine by Keyword
  • 3 Lattices
  • 2 Bounded Distance Decoding
  • 2 NP-hardness
  • 2 Shortest Vector Problem
  • 1 Average-case hardness

  • Refine by Type
  • 6 document

  • Refine by Publication Year
  • 2 2009
  • 2 2022
  • 1 2020
  • 1 2023
