5 Search Results for "Bennett, Huck"


Document
RANDOM
Hardness of the (Approximate) Shortest Vector Problem: A Simple Proof via Reed-Solomon Codes

Authors: Huck Bennett and Chris Peikert

Published in: LIPIcs, Volume 275, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)


Abstract
We give a simple proof that the (approximate, decisional) Shortest Vector Problem is NP-hard under a randomized reduction. Specifically, we show that for any p ≥ 1 and any constant γ < 2^{1/p}, the γ-approximate problem in the 𝓁_p norm (γ-GapSVP_p) is not in RP unless NP ⊆ RP. Our proof follows an approach pioneered by Ajtai (STOC 1998), and strengthened by Micciancio (FOCS 1998 and SICOMP 2000), for showing hardness of γ-GapSVP_p using locally dense lattices. We construct such lattices simply by applying "Construction A" to Reed-Solomon codes with suitable parameters, and prove their local density via an elementary argument originally used in the context of Craig lattices. As in all known NP-hardness results for GapSVP_p with p < ∞, our reduction uses randomness. Indeed, it is a notorious open problem to prove NP-hardness via a deterministic reduction. To this end, we additionally discuss potential directions and associated challenges for derandomizing our reduction. In particular, we show that a close deterministic analogue of our local density construction would improve on the state-of-the-art explicit Reed-Solomon list-decoding lower bounds of Guruswami and Rudra (STOC 2005 and IEEE Transactions on Information Theory 2006). As a related contribution of independent interest, we also give a polynomial-time algorithm for decoding n-dimensional "Construction A Reed-Solomon lattices" (with different parameters than those used in our hardness proof) to a distance within an O(√log n) factor of Minkowski’s bound. This asymptotically matches the best known distance for decoding near Minkowski’s bound, due to Mook and Peikert (IEEE Transactions on Information Theory 2022), whose work we build on with a somewhat simpler construction and analysis.
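
To make the key object concrete, here is a small self-contained Python sketch of "Construction A" applied to a Reed-Solomon code. All parameters (q = 7, n = 5, k = 2) and helper names are illustrative choices and are not the parameters of the paper's hardness proof or its local-density analysis.

import itertools, random

q = 7                    # prime field size (toy choice)
n, k = 5, 2              # Reed-Solomon code length and dimension, with n <= q
alphas = list(range(n))  # n distinct evaluation points in F_q

def lagrange_row(j):
    """Codeword of the degree-<k polynomial that is 1 at alphas[j] and 0 at alphas[i] for i < k, i != j."""
    def ev(x):
        num, den = 1, 1
        for i in range(k):
            if i != j:
                num = num * (x - alphas[i]) % q
                den = den * (alphas[j] - alphas[i]) % q
        return num * pow(den, q - 2, q) % q   # division in F_q via Fermat's little theorem
    return [ev(a) for a in alphas]

# Systematic generator matrix of the Reed-Solomon code (identity on the first k coordinates).
G = [lagrange_row(j) for j in range(k)]

# Construction A basis (rows): the code generators lifted to Z, plus q*e_i on the remaining coordinates.
basis = G + [[q if i == j else 0 for j in range(n)] for i in range(k, n)]

# Sanity check: every integer combination of the basis rows reduces mod q to a codeword,
# i.e., the generated lattice sits inside L = C + q*Z^n.
codewords = {tuple(sum(m[j] * G[j][i] for j in range(k)) % q for i in range(n))
             for m in itertools.product(range(q), repeat=k)}
for _ in range(200):
    c = [random.randint(-3, 3) for _ in basis]
    v = [sum(ci * row[i] for ci, row in zip(c, basis)) for i in range(n)]
    assert tuple(x % q for x in v) in codewords
print("Construction A basis:", basis)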

Cite as

Huck Bennett and Chris Peikert. Hardness of the (Approximate) Shortest Vector Problem: A Simple Proof via Reed-Solomon Codes. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 275, pp. 37:1-37:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bennett_et_al:LIPIcs.APPROX/RANDOM.2023.37,
  author =	{Bennett, Huck and Peikert, Chris},
  title =	{{Hardness of the (Approximate) Shortest Vector Problem: A Simple Proof via Reed-Solomon Codes}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)},
  pages =	{37:1--37:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-296-9},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{275},
  editor =	{Megow, Nicole and Smith, Adam},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2023.37},
  URN =		{urn:nbn:de:0030-drops-188622},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2023.37},
  annote =	{Keywords: Lattices, Shortest Vector Problem, Reed-Solomon codes, NP-hardness, derandomization}
}
Document
Improved Hardness of BDD and SVP Under Gap-(S)ETH

Authors: Huck Bennett, Chris Peikert, and Yi Tang

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
We show improved fine-grained hardness of two key lattice problems in the 𝓁_p norm: Bounded Distance Decoding to within an α factor of the minimum distance (BDD_{p, α}) and the (decisional) γ-approximate Shortest Vector Problem (GapSVP_{p,γ}), assuming variants of the Gap (Strong) Exponential Time Hypothesis (Gap-(S)ETH). Specifically, we show:
1) For all p ∈ [1, ∞), there is no 2^{o(n)}-time algorithm for BDD_{p, α} for any constant α > α_kn, where α_kn = 2^{-c_kn} < 0.98491 and c_kn is the 𝓁₂ kissing-number constant, unless non-uniform Gap-ETH is false.
2) For all p ∈ [1, ∞), there is no 2^{o(n)}-time algorithm for BDD_{p, α} for any constant α > α^‡_p, where α^‡_p is explicit and satisfies α^‡_p = 1 for 1 ≤ p ≤ 2, α^‡_p < 1 for all p > 2, and α^‡_p → 1/2 as p → ∞, unless randomized Gap-ETH is false.
3) For all p ∈ [1, ∞) ⧵ 2ℤ and all C > 1, there is no 2^{n/C}-time algorithm for BDD_{p, α} for any constant α > α^†_{p, C}, where α^†_{p, C} is explicit and satisfies α^†_{p, C} → 1 as C → ∞ for any fixed p ∈ [1, ∞), unless non-uniform Gap-SETH is false.
4) For all p > p₀ ≈ 2.1397, p ∉ 2ℤ, and all C > C_p, there is no 2^{n/C}-time algorithm for GapSVP_{p, γ} for some constant γ > 1, where C_p > 1 is explicit and satisfies C_p → 1 as p → ∞, unless randomized Gap-SETH is false.
Our results for BDD_{p, α} improve and extend work by Aggarwal and Stephens-Davidowitz (STOC, 2018) and Bennett and Peikert (CCC, 2020). Specifically, the quantities α_kn and α^‡_p (respectively, α^†_{p,C}) significantly improve upon the corresponding quantity α_p^* (respectively, α_{p,C}^*) of Bennett and Peikert for small p (but arise from somewhat stronger assumptions). In particular, Item 1 improves the smallest value of α for which BDD_{p, α} is known to be exponentially hard in the Euclidean norm (p = 2) to an explicit constant α < 1 for the first time under a general-purpose complexity assumption. Items 1 and 3 crucially use the recent breakthrough result of Vlăduţ (Moscow Journal of Combinatorics and Number Theory, 2019), which showed an explicit exponential lower bound on the lattice kissing number. Finally, Item 4 answers a natural question left open by Aggarwal, Bennett, Golovnev, and Stephens-Davidowitz (SODA, 2021), which showed an analogous result for the Closest Vector Problem.
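
For context, the following toy Python sketch spells out the promise problem GapSVP_{p,γ} referenced above, via exhaustive search over a tiny basis. The basis, norm, gap, and threshold are arbitrary illustrative choices and are unrelated to the reductions in the paper.

import itertools

def lp_norm(v, p):
    return sum(abs(x) ** p for x in v) ** (1.0 / p)

def shortest_nonzero_lp(basis, p, coeff_bound=4):
    """Brute force over small coefficient vectors; adequate only for tiny, well-conditioned toy bases."""
    best = float("inf")
    dim = len(basis[0])
    for coeffs in itertools.product(range(-coeff_bound, coeff_bound + 1), repeat=len(basis)):
        if any(coeffs):
            v = [sum(c * row[i] for c, row in zip(coeffs, basis)) for i in range(dim)]
            best = min(best, lp_norm(v, p))
    return best

def gap_svp(basis, p, gamma, d):
    """Answer the promise problem: YES if lambda_1 <= d, NO if lambda_1 > gamma * d."""
    lam1 = shortest_nonzero_lp(basis, p)
    if lam1 <= d:
        return "YES"
    if lam1 > gamma * d:
        return "NO"
    return "promise violated (either answer is acceptable)"

B = [[3, 1], [1, 3]]                       # a small 2-dimensional toy lattice
print(gap_svp(B, p=1, gamma=1.5, d=2.5))   # lambda_1 = 4 in the l_1 norm and 4 > 1.5 * 2.5, so "NO"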

Cite as

Huck Bennett, Chris Peikert, and Yi Tang. Improved Hardness of BDD and SVP Under Gap-(S)ETH. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 19:1-19:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{bennett_et_al:LIPIcs.ITCS.2022.19,
  author =	{Bennett, Huck and Peikert, Chris and Tang, Yi},
  title =	{{Improved Hardness of BDD and SVP Under Gap-(S)ETH}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{19:1--19:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.19},
  URN =		{urn:nbn:de:0030-drops-156151},
  doi =		{10.4230/LIPIcs.ITCS.2022.19},
  annote =	{Keywords: lattices, lattice-based cryptography, fine-grained complexity, Bounded Distance Decoding, Shortest Vector Problem}
}
Document
Hardness of Bounded Distance Decoding on Lattices in 𝓁_p Norms

Authors: Huck Bennett and Chris Peikert

Published in: LIPIcs, Volume 169, 35th Computational Complexity Conference (CCC 2020)


Abstract
Bounded Distance Decoding BDD_{p,α} is the problem of decoding a lattice when the target point is promised to be within an α factor of the minimum distance of the lattice, in the 𝓁_p norm. We prove that BDD_{p, α} is NP-hard under randomized reductions where α → 1/2 as p → ∞ (and for α = 1/2 when p = ∞), thereby showing the hardness of decoding for distances approaching the unique-decoding radius for large p. We also show fine-grained hardness for BDD_{p,α}. For example, we prove that for all p ∈ [1,∞) ⧵ 2ℤ and constants C > 1, ε > 0, there is no 2^((1-ε)n/C)-time algorithm for BDD_{p,α} for some constant α (which approaches 1/2 as p → ∞), assuming the randomized Strong Exponential Time Hypothesis (SETH). Moreover, essentially all of our results also hold (under analogous non-uniform assumptions) for BDD with preprocessing, in which unbounded precomputation can be applied to the lattice before the target is available. Compared to prior work on the hardness of BDD_{p,α} by Liu, Lyubashevsky, and Micciancio (APPROX-RANDOM 2008), our results improve the values of α for which the problem is known to be NP-hard for all p > p₁ ≈ 4.2773, and give the very first fine-grained hardness for BDD (in any norm). Our reductions rely on a special family of "locally dense" lattices in 𝓁_p norms, which we construct by modifying the integer-lattice sparsification technique of Aggarwal and Stephens-Davidowitz (STOC 2018).
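
The following toy Python sketch illustrates the BDD_{p,α} promise on a tiny lattice: the target is guaranteed to lie within α·λ₁ of the lattice, and for α < 1/2 the closest lattice vector is unique. The basis, α, and target below are arbitrary illustrative choices, not part of the paper's reduction.

import itertools

def lp_dist(u, v, p):
    return sum(abs(a - b) ** p for a, b in zip(u, v)) ** (1.0 / p)

def lattice_points(basis, coeff_bound=4):
    """All integer combinations with small coefficients; enough for this toy instance."""
    dim = len(basis[0])
    for coeffs in itertools.product(range(-coeff_bound, coeff_bound + 1), repeat=len(basis)):
        yield [sum(c * row[i] for c, row in zip(coeffs, basis)) for i in range(dim)]

def bdd_decode(basis, target, p):
    """Brute-force closest lattice vector; the BDD promise guarantees it is unique when alpha < 1/2."""
    return min(lattice_points(basis), key=lambda v: lp_dist(v, target, p))

B = [[2, 0], [0, 2]]                # the lattice 2*Z^2, whose minimum distance is 2 in every l_p norm
alpha = 0.4                         # inside the unique-decoding radius (alpha < 1/2)
target = [2.5, 0.3]                 # within distance 0.8 = alpha * 2 of the lattice point [2, 0] for every p >= 1
print(bdd_decode(B, target, p=2))   # -> [2, 0]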

Cite as

Huck Bennett and Chris Peikert. Hardness of Bounded Distance Decoding on Lattices in 𝓁_p Norms. In 35th Computational Complexity Conference (CCC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 169, pp. 36:1-36:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{bennett_et_al:LIPIcs.CCC.2020.36,
  author =	{Bennett, Huck and Peikert, Chris},
  title =	{{Hardness of Bounded Distance Decoding on Lattices in 𝓁\_p Norms}},
  booktitle =	{35th Computational Complexity Conference (CCC 2020)},
  pages =	{36:1--36:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-156-6},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{169},
  editor =	{Saraf, Shubhangi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2020.36},
  URN =		{urn:nbn:de:0030-drops-125881},
  doi =		{10.4230/LIPIcs.CCC.2020.36},
  annote =	{Keywords: Lattices, Bounded Distance Decoding, NP-hardness, Fine-Grained Complexity}
}
Document
On Percolation and NP-Hardness

Authors: Huck Bennett, Daniel Reichman, and Igor Shinkar

Published in: LIPIcs, Volume 55, 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)


Abstract
The edge-percolation and vertex-percolation random graph models start with an arbitrary graph G, and randomly delete edges or vertices of G with some fixed probability. We study the computational hardness of problems whose inputs are obtained by applying percolation to worst-case instances. Specifically, we show that a number of classical NP-hard graph problems remain essentially as hard on percolated instances as they are in the worst case (assuming NP ⊈ BPP). We also prove hardness results for other NP-hard problems such as Constraint Satisfaction Problems, where random deletions are applied to clauses or variables. We focus on proving the hardness of the Maximum Independent Set problem and the Graph Coloring problem on percolated instances. To show this we establish the robustness of the corresponding parameters α(⋅) and χ(⋅) to percolation, which may be of independent interest. Given a graph G, let G' be the graph obtained by randomly deleting edges of G. We show that if α(G) is small, then α(G') remains small with probability at least 0.99. Similarly, we show that if χ(G) is large, then χ(G') remains large with probability at least 0.99.
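
The percolation model itself is simple to simulate. The toy Python sketch below applies edge percolation to a small graph and records the independence number before and after; the graph, deletion probability, and trial count are arbitrary illustrative choices, not the paper's worst-case reduction.

import itertools, random

def independence_number(n, edges):
    """Brute force over all vertex subsets; fine for tiny n."""
    for r in range(n, 0, -1):
        for S in itertools.combinations(range(n), r):
            S = set(S)
            if all(not (u in S and v in S) for u, v in edges):
                return r
    return 0

n = 8
G = [(u, v) for u in range(n) for v in range(u + 1, n)]   # the complete graph K_8, so alpha(G) = 1

random.seed(0)
# Edge percolation: keep each edge independently with probability 1/2 (i.e., delete it with probability 1/2).
trials = [independence_number(n, [e for e in G if random.random() < 0.5]) for _ in range(20)]
print("alpha(K_8) =", independence_number(n, G), "; alpha after edge percolation:", trials)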

Cite as

Huck Bennett, Daniel Reichman, and Igor Shinkar. On Percolation and NP-Hardness. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 55, pp. 80:1-80:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{bennett_et_al:LIPIcs.ICALP.2016.80,
  author =	{Bennett, Huck and Reichman, Daniel and Shinkar, Igor},
  title =	{{On Percolation and NP-Hardness}},
  booktitle =	{43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)},
  pages =	{80:1--80:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-013-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{55},
  editor =	{Chatzigiannakis, Ioannis and Mitzenmacher, Michael and Rabani, Yuval and Sangiorgi, Davide},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2016.80},
  URN =		{urn:nbn:de:0030-drops-62056},
  doi =		{10.4230/LIPIcs.ICALP.2016.80},
  annote =	{Keywords: percolation, NP-hardness, random subgraphs, chromatic number}
}
Document
On the Lattice Distortion Problem

Authors: Huck Bennett, Daniel Dadush, and Noah Stephens-Davidowitz

Published in: LIPIcs, Volume 57, 24th Annual European Symposium on Algorithms (ESA 2016)


Abstract
We introduce and study the Lattice Distortion Problem (LDP). LDP asks how "similar" two lattices are. That is, what is the minimal distortion of a linear bijection between the two lattices? LDP generalizes the Lattice Isomorphism Problem (the lattice analogue of Graph Isomorphism), which simply asks whether the minimal distortion is one. As our first contribution, we show that the distortion between any two lattices is approximated up to an n^{O(log n)} factor by a simple function of their successive minima. Our methods are constructive, allowing us to compute low-distortion mappings that are within a 2^{O(n log log n / log n)} factor of optimal in polynomial time and within an n^{O(log n)} factor of optimal in singly exponential time. Our algorithms rely on a notion of basis reduction introduced by Seysen (Combinatorica 1993), which we show is intimately related to lattice distortion. Lastly, we show that LDP is NP-hard to approximate to within any constant factor (under randomized reductions), by a reduction from the Shortest Vector Problem.
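
The quantity that LDP minimizes can be illustrated numerically. The toy Python sketch below (an assumed example, not taken from the paper) computes the distortion of one particular linear bijection between two 2-dimensional lattices; LDP asks for the minimum of this quantity over all bijections, i.e., over all unimodular changes of basis.

import numpy as np

def distortion(T):
    """||T|| * ||T^{-1}|| in the l_2 operator norm, i.e., the condition number of T."""
    s = np.linalg.svd(T, compute_uv=False)   # singular values in descending order
    return s[0] / s[-1]

B1 = np.array([[1.0, 0.0], [0.0, 1.0]])      # basis of Z^2 (columns)
B2 = np.array([[1.0, 0.0], [0.0, 4.0]])      # basis of a "stretched" copy of Z^2

# One candidate bijection between the two lattices: map B1's basis vectors to B2's (i.e., U = I).
T = B2 @ np.linalg.inv(B1)
print("distortion of this bijection:", distortion(T))
# Output: 4.0. Since lambda_i(L2) <= ||T|| * lambda_i(L1) and lambda_i(L1) <= ||T^{-1}|| * lambda_i(L2),
# every bijection has distortion at least (lambda_2(L2)/lambda_2(L1)) * (lambda_1(L1)/lambda_1(L2)) = 4,
# so this simple choice is already optimal for this pair, consistent with the successive-minima bound above.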

Cite as

Huck Bennett, Daniel Dadush, and Noah Stephens-Davidowitz. On the Lattice Distortion Problem. In 24th Annual European Symposium on Algorithms (ESA 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 57, pp. 9:1-9:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{bennett_et_al:LIPIcs.ESA.2016.9,
  author =	{Bennett, Huck and Dadush, Daniel and Stephens-Davidowitz, Noah},
  title =	{{On the Lattice Distortion Problem}},
  booktitle =	{24th Annual European Symposium on Algorithms (ESA 2016)},
  pages =	{9:1--9:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-015-6},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{57},
  editor =	{Sankowski, Piotr and Zaroliagis, Christos},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2016.9},
  URN =		{urn:nbn:de:0030-drops-63519},
  doi =		{10.4230/LIPIcs.ESA.2016.9},
  annote =	{Keywords: lattices, lattice distortion, lattice isomorphism, geometry of numbers, basis reduction}
}
  • Refine by Author
  • 5 Bennett, Huck
  • 3 Peikert, Chris
  • 1 Dadush, Daniel
  • 1 Reichman, Daniel
  • 1 Shinkar, Igor

  • Refine by Classification
  • 2 Theory of computation → Problems, reductions and completeness
  • 1 Theory of computation → Computational complexity and cryptography
  • 1 Theory of computation → Computational geometry
  • 1 Theory of computation → Error-correcting codes
  • 1 Theory of computation → Pseudorandomness and derandomization

  • Refine by Keyword
  • 3 NP-hardness
  • 2 Bounded Distance Decoding
  • 2 Lattices
  • 2 Shortest Vector Problem
  • 2 lattices

  • Refine by Type
  • 5 document

  • Refine by Publication Year
  • 2 2016
  • 1 2020
  • 1 2022
  • 1 2023
