24 Search Results for "Dinur, Irit"


Document
RANDOM
Fine Grained Analysis of High Dimensional Random Walks

Authors: Roy Gotlib and Tali Kaufman

Published in: LIPIcs, Volume 275, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)


Abstract
One of the most important properties of high dimensional expanders is that high dimensional random walks converge rapidly. This property has proven to be extremely useful in a variety of fields in theoretical computer science, from agreement testing to sampling, coding theory and more. In this paper we present a state-of-the-art result in a line of works analyzing the convergence of high dimensional random walks [Tali Kaufman and David Mass, 2017; Irit Dinur and Tali Kaufman, 2017; Tali Kaufman and Izhar Oppenheim, 2018; Vedat Levi Alev and Lap Chi Lau, 2020], by presenting a structured version of the result of [Vedat Levi Alev and Lap Chi Lau, 2020]. While previous works examined the expansion from the viewpoint of the worst possible eigenvalue, in this work we relate the expansion of a function to the entire spectrum of the random walk operator using the structure of the function; we call such a theorem a Fine Grained High Order Random Walk Theorem. In sufficiently structured cases the fine grained result that we present here can be much better than the worst case, while in the worst case our result is equivalent to [Vedat Levi Alev and Lap Chi Lau, 2020]. In order to prove the Fine Grained High Order Random Walk Theorem we introduce a way to bootstrap the expansion of random walks on the vertices of a complex into a fine grained understanding of higher order random walks, provided that the expansion is good enough. In addition, our single bootstrapping theorem can simultaneously yield our Fine Grained High Order Random Walk Theorem as well as the well known Trickling Down Theorem. Prior to this work, high order random walk theorems and the Trickling Down Theorem were obtained via different proof methods.
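
For intuition only (an editorial sketch, not the paper's operators or constructions): the following Python snippet builds the up-down random walk on the edges of a tiny two-dimensional complex, assuming uniform weight on the triangles, and prints the walk operator's full spectrum, the object that the fine grained viewpoint above refers to. The choice of complex and all names are illustrative.

from itertools import combinations
import numpy as np

# Toy 2-dimensional complex: the boundary of a tetrahedron (4 triangles, 6 edges).
triangles = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
edges = sorted({tuple(sorted(e)) for t in triangles for e in combinations(t, 2)})
idx = {e: i for i, e in enumerate(edges)}

# Up-down walk on edges (uniform weights assumed): from an edge, move to a
# uniformly random triangle containing it, then to a uniformly random edge of
# that triangle.
P = np.zeros((len(edges), len(edges)))
for e in edges:
    containing = [t for t in triangles if set(e) <= set(t)]
    for t in containing:
        for e2 in combinations(t, 2):
            P[idx[e], idx[e2]] += 1.0 / (3 * len(containing))

# Full spectrum of the walk operator: the trivial eigenvalue 1 plus the
# eigenvalues that govern how fast the walk converges.
print(np.round(np.sort(np.linalg.eigvals(P).real)[::-1], 3))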

Cite as

Roy Gotlib and Tali Kaufman. Fine Grained Analysis of High Dimensional Random Walks. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 275, pp. 49:1-49:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)



@InProceedings{gotlib_et_al:LIPIcs.APPROX/RANDOM.2023.49,
  author =	{Gotlib, Roy and Kaufman, Tali},
  title =	{{Fine Grained Analysis of High Dimensional Random Walks}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)},
  pages =	{49:1--49:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-296-9},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{275},
  editor =	{Megow, Nicole and Smith, Adam},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2023.49},
  URN =		{urn:nbn:de:0030-drops-188740},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2023.49},
  annote =	{Keywords: High Dimensional Expanders, High Dimensional Random Walks, Local Spectral Expansion, Local to Global, Trickling Down}
}
Document
RANDOM
NP-Hardness of Almost Coloring Almost 3-Colorable Graphs

Authors: Yahli Hecht, Dor Minzer, and Muli Safra

Published in: LIPIcs, Volume 275, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)


Abstract
A graph G = (V,E) is said to be (k,δ) almost colorable if there is a subset of vertices V' ⊆ V of size at least (1-δ)|V| such that the induced subgraph of G on V' is k-colorable. We prove that for all k, there exists δ > 0 such that for all ε > 0, given a graph G it is NP-hard (under randomized reductions) to distinguish between: 1) Yes case: G is (3,ε) almost colorable. 2) No case: G is not (k,δ) almost colorable. This improves upon an earlier result of Khot et al. [Irit Dinur et al., 2018], who showed a weaker result wherein in the "yes case" the graph is (4,ε) almost colorable.
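
To make the definition concrete (an editorial brute-force sketch, not part of the paper; the function names are hypothetical), (k,δ) almost colorability can be checked directly on toy graphs:

from itertools import combinations, product
from fractions import Fraction
from math import ceil

def is_k_colorable(vertices, edges, k):
    # Brute force: try every assignment of k colors to the vertices.
    vs = list(vertices)
    for colors in product(range(k), repeat=len(vs)):
        color = dict(zip(vs, colors))
        if all(color[u] != color[v] for u, v in edges):
            return True
    return False

def is_almost_colorable(vertices, edges, k, delta):
    # (k, delta) almost colorable: some V' with |V'| >= (1-delta)|V| induces a
    # k-colorable subgraph.  Trying subsets of exactly that size suffices.
    vs = list(vertices)
    target = ceil((1 - delta) * len(vs))
    for vprime in combinations(vs, target):
        kept = set(vprime)
        induced = [(u, v) for u, v in edges if u in kept and v in kept]
        if is_k_colorable(kept, induced, k):
            return True
    return False

# A triangle needs 3 colors, but dropping any single vertex leaves it 2-colorable.
triangle = ({1, 2, 3}, [(1, 2), (2, 3), (1, 3)])
print(is_almost_colorable(*triangle, k=2, delta=0))               # False
print(is_almost_colorable(*triangle, k=2, delta=Fraction(1, 3)))  # True; Fraction keeps the (1-delta)|V| threshold exact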

Cite as

Yahli Hecht, Dor Minzer, and Muli Safra. NP-Hardness of Almost Coloring Almost 3-Colorable Graphs. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 275, pp. 51:1-51:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)



@InProceedings{hecht_et_al:LIPIcs.APPROX/RANDOM.2023.51,
  author =	{Hecht, Yahli and Minzer, Dor and Safra, Muli},
  title =	{{NP-Hardness of Almost Coloring Almost 3-Colorable Graphs}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2023)},
  pages =	{51:1--51:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-296-9},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{275},
  editor =	{Megow, Nicole and Smith, Adam},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2023.51},
  URN =		{urn:nbn:de:0030-drops-188761},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2023.51},
  annote =	{Keywords: PCP, Hardness of approximation}
}
Document
A New Conjecture on Hardness of 2-CSP’s with Implications to Hardness of Densest k-Subgraph and Other Problems

Authors: Julia Chuzhoy, Mina Dalirrooyfard, Vadim Grinberg, and Zihan Tan

Published in: LIPIcs, Volume 251, 14th Innovations in Theoretical Computer Science Conference (ITCS 2023)


Abstract
We propose a new conjecture on hardness of 2-CSP’s, and show that new hardness of approximation results for Densest k-Subgraph and several other problems, including a graph partitioning problem, and a variation of the Graph Crossing Number problem, follow from this conjecture. The conjecture can be viewed as occupying a middle ground between the d-to-1 conjecture, and hardness results for 2-CSP’s that can be obtained via standard techniques, such as Parallel Repetition combined with standard 2-prover protocols for the 3SAT problem. We hope that this work will motivate further exploration of hardness of 2-CSP’s in the regimes arising from the conjecture. We believe that a positive resolution of the conjecture will provide a good starting point for other hardness of approximation proofs. Another contribution of our work is proving that the problems that we consider are roughly equivalent from the approximation perspective. Some of these problems arose in previous work, from which it appeared that they may be related to each other. We formalize this relationship in this work.

Cite as

Julia Chuzhoy, Mina Dalirrooyfard, Vadim Grinberg, and Zihan Tan. A New Conjecture on Hardness of 2-CSP’s with Implications to Hardness of Densest k-Subgraph and Other Problems. In 14th Innovations in Theoretical Computer Science Conference (ITCS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 251, pp. 38:1-38:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)



@InProceedings{chuzhoy_et_al:LIPIcs.ITCS.2023.38,
  author =	{Chuzhoy, Julia and Dalirrooyfard, Mina and Grinberg, Vadim and Tan, Zihan},
  title =	{{A New Conjecture on Hardness of 2-CSP’s with Implications to Hardness of Densest k-Subgraph and Other Problems}},
  booktitle =	{14th Innovations in Theoretical Computer Science Conference (ITCS 2023)},
  pages =	{38:1--38:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-263-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{251},
  editor =	{Tauman Kalai, Yael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2023.38},
  URN =		{urn:nbn:de:0030-drops-175411},
  doi =		{10.4230/LIPIcs.ITCS.2023.38},
  annote =	{Keywords: Hardness of Approximation, Densest k-Subgraph}
}
Document
Invited Talk
Expanders in Higher Dimensions (Invited Talk)

Authors: Irit Dinur

Published in: LIPIcs, Volume 250, 42nd IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2022)


Abstract
Expander graphs have been studied in many areas of mathematics and in computer science with versatile applications, including coding theory, networking, computational complexity and geometry. High-dimensional expanders are a generalization that has been studied in recent years and their promise is beginning to bear fruit. In the talk, I will survey some powerful local to global properties of high-dimensional expanders, and describe several interesting applications, ranging from convergence of random walks to construction of locally testable codes that prove the c³ conjecture (namely, codes with constant rate, constant distance, and constant locality).

Cite as

Irit Dinur. Expanders in Higher Dimensions (Invited Talk). In 42nd IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 250, p. 4:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)



@InProceedings{dinur:LIPIcs.FSTTCS.2022.4,
  author =	{Dinur, Irit},
  title =	{{Expanders in Higher Dimensions}},
  booktitle =	{42nd IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2022)},
  pages =	{4:1--4:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-261-7},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{250},
  editor =	{Dawar, Anuj and Guruswami, Venkatesan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2022.4},
  URN =		{urn:nbn:de:0030-drops-173967},
  doi =		{10.4230/LIPIcs.FSTTCS.2022.4},
  annote =	{Keywords: Expanders}
}
Document
RANDOM
Improved Bounds for Coloring Locally Sparse Hypergraphs

Authors: Fotis Iliopoulos

Published in: LIPIcs, Volume 207, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2021)


Abstract
We show that, for every k ≥ 2, every k-uniform hypergraph of degree Δ and girth at least 5 is efficiently (1+o(1))(k-1)(Δ / ln Δ)^{1/(k-1)}-list colorable. As an application we obtain the currently best deterministic algorithm for list-coloring random hypergraphs of bounded average degree.
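
Purely for a sense of scale (editorial, not from the paper), the leading term (k-1)(Δ/ln Δ)^{1/(k-1)} of the bound, with the (1+o(1)) factor dropped, can be tabulated; for k = 2 it reduces to Δ/ln Δ.

from math import log

def leading_term(k, Delta):
    # (k-1) * (Delta / ln Delta)^(1/(k-1)), ignoring the (1+o(1)) factor.
    return (k - 1) * (Delta / log(Delta)) ** (1.0 / (k - 1))

for k in (2, 3, 4):
    row = [round(leading_term(k, Delta), 1) for Delta in (10**2, 10**4, 10**6)]
    print(f"k={k}:", row)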

Cite as

Fotis Iliopoulos. Improved Bounds for Coloring Locally Sparse Hypergraphs. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 207, pp. 39:1-39:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)



@InProceedings{iliopoulos:LIPIcs.APPROX/RANDOM.2021.39,
  author =	{Iliopoulos, Fotis},
  title =	{{Improved Bounds for Coloring Locally Sparse Hypergraphs}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2021)},
  pages =	{39:1--39:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-207-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{207},
  editor =	{Wootters, Mary and Sanit\`{a}, Laura},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2021.39},
  URN =		{urn:nbn:de:0030-drops-147328},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2021.39},
  annote =	{Keywords: hypergraph coloring, semi-random method, locally sparse, random hypergraphs}
}
Document
On Rich 2-to-1 Games

Authors: Mark Braverman, Subhash Khot, and Dor Minzer

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
We propose a variant of the 2-to-1 Games Conjecture that we call the Rich 2-to-1 Games Conjecture and show that it is equivalent to the Unique Games Conjecture. We are motivated by two considerations. Firstly, in light of the recent proof of the 2-to-1 Games Conjecture [Subhash Khot et al., 2017; Irit Dinur et al., 2018; Irit Dinur et al., 2018; Subhash Khot et al., 2018], we hope to understand how one might make further progress towards a proof of the Unique Games Conjecture. Secondly, the new variant, when additionally equipped with perfect completeness, might imply hardness of approximation results that necessarily require perfect completeness and are therefore not implied by the Unique Games Conjecture.

Cite as

Mark Braverman, Subhash Khot, and Dor Minzer. On Rich 2-to-1 Games. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 27:1-27:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)



@InProceedings{braverman_et_al:LIPIcs.ITCS.2021.27,
  author =	{Braverman, Mark and Khot, Subhash and Minzer, Dor},
  title =	{{On Rich 2-to-1 Games}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{27:1--27:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.27},
  URN =		{urn:nbn:de:0030-drops-135666},
  doi =		{10.4230/LIPIcs.ITCS.2021.27},
  annote =	{Keywords: PCP, Unique-Games, Perfect Completeness}
}
Document
Explicit SoS Lower Bounds from High-Dimensional Expanders

Authors: Irit Dinur, Yuval Filmus, Prahladh Harsha, and Madhur Tulsiani

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
We construct an explicit and structured family of 3XOR instances which is hard for O(√{log n}) levels of the Sum-of-Squares hierarchy. In contrast to earlier constructions, which involve a random component, our systems are highly structured and can be constructed explicitly in deterministic polynomial time. Our construction is based on the high-dimensional expanders devised by Lubotzky, Samuels and Vishne, known as LSV complexes or Ramanujan complexes, and our analysis is based on two notions of expansion for these complexes: cosystolic expansion, and a local isoperimetric inequality due to Gromov. Our construction offers an interesting contrast to the recent work of Alev, Jeronimo and the last author (FOCS 2019). They showed that 3XOR instances in which the variables correspond to vertices in a high-dimensional expander are easy to solve. In contrast, in our instances the variables correspond to the edges of the complex.

Cite as

Irit Dinur, Yuval Filmus, Prahladh Harsha, and Madhur Tulsiani. Explicit SoS Lower Bounds from High-Dimensional Expanders. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 38:1-38:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)



@InProceedings{dinur_et_al:LIPIcs.ITCS.2021.38,
  author =	{Dinur, Irit and Filmus, Yuval and Harsha, Prahladh and Tulsiani, Madhur},
  title =	{{Explicit SoS Lower Bounds from High-Dimensional Expanders}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{38:1--38:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.38},
  URN =		{urn:nbn:de:0030-drops-135774},
  doi =		{10.4230/LIPIcs.ITCS.2021.38},
  annote =	{Keywords: High-dimensional expanders, sum-of-squares, integrality gaps}
}
Document
APPROX
Revisiting Alphabet Reduction in Dinur’s PCP

Authors: Venkatesan Guruswami, Jakub Opršal, and Sai Sandeep

Published in: LIPIcs, Volume 176, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2020)


Abstract
Dinur’s celebrated proof of the PCP theorem alternates two main steps in several iterations: gap amplification to increase the soundness gap by a large constant factor (at the expense of much larger alphabet size), and a composition step that brings back the alphabet size to an absolute constant (at the expense of a fixed constant factor loss in the soundness gap). We note that the gap amplification can produce a Label Cover CSP. This allows us to reduce the alphabet size via a direct long-code based reduction from Label Cover to a Boolean CSP. Our composition step thus bypasses the concept of Assignment Testers from Dinur’s proof, and we believe it is more intuitive - it is just a gadget reduction. The analysis also uses only elementary facts (Parseval’s identity) about Fourier Transforms over the hypercube.

Cite as

Venkatesan Guruswami, Jakub Opršal, and Sai Sandeep. Revisiting Alphabet Reduction in Dinur’s PCP. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 176, pp. 34:1-34:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)



@InProceedings{guruswami_et_al:LIPIcs.APPROX/RANDOM.2020.34,
  author =	{Guruswami, Venkatesan and Opr\v{s}al, Jakub and Sandeep, Sai},
  title =	{{Revisiting Alphabet Reduction in Dinur’s PCP}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2020)},
  pages =	{34:1--34:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-164-1},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{176},
  editor =	{Byrka, Jaros{\l}aw and Meka, Raghu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX/RANDOM.2020.34},
  URN =		{urn:nbn:de:0030-drops-126372},
  doi =		{10.4230/LIPIcs.APPROX/RANDOM.2020.34},
  annote =	{Keywords: PCP theorem, CSP, discrete Fourier analysis, label cover, long code}
}
Document
Hardness Amplification of Optimization Problems

Authors: Elazar Goldenberg and Karthik C. S.

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
In this paper, we prove a general hardness amplification scheme for optimization problems based on the technique of direct products. We say that an optimization problem Π is direct product feasible if it is possible to efficiently aggregate any k instances of Π and form one large instance of Π such that given an optimal feasible solution to the larger instance, we can efficiently find optimal feasible solutions to all the k smaller instances. Given a direct product feasible optimization problem Π, our hardness amplification theorem may be informally stated as follows: If there is a distribution D over instances of Π of size n such that every randomized algorithm running in time t(n) fails to solve Π on a 1/α(n) fraction of inputs sampled from D, then, assuming certain relations between α(n) and t(n), there is a distribution D' over instances of Π of size O(n⋅α(n)) such that every randomized algorithm running in time t(n)/poly(α(n)) fails to solve Π on a 99/100 fraction of inputs sampled from D'. As a consequence of the above theorem, we show hardness amplification of problems in various classes such as NP-hard problems like Max-Clique, Knapsack, and Max-SAT, problems in P such as Longest Common Subsequence, Edit Distance, Matrix Multiplication, and even problems in TFNP such as Factoring and computing Nash equilibrium.
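
As a toy illustration of the "direct product feasible" notion (an editorial sketch, not the paper's construction; all names are hypothetical): Max-SAT instances placed on disjoint variable blocks can be aggregated into one instance, and an optimal assignment of the aggregate restricts to an optimal assignment of every block, which is exactly the efficient-splitting property the definition asks for.

def aggregate(instances):
    # Merge Max-SAT instances, each given as (num_vars, clauses) with clauses as
    # lists of signed variable indices, onto disjoint variable blocks.
    merged_clauses, offsets, total = [], [], 0
    for num_vars, clauses in instances:
        offsets.append(total)
        for clause in clauses:
            merged_clauses.append([lit + total if lit > 0 else lit - total
                                   for lit in clause])
        total += num_vars
    return (total, merged_clauses), offsets

def split_assignment(assignment, instances, offsets):
    # Restrict a merged assignment back to each block; since the blocks share no
    # variables, the merged objective is the sum of the block objectives, so an
    # optimal merged assignment is optimal on every block.
    return [assignment[off:off + num_vars]
            for (num_vars, _), off in zip(instances, offsets)]

inst1 = (2, [[1, 2], [-1]])   # clauses (x1 or x2) and (not x1)
inst2 = (1, [[1]])            # clause (y1)
merged, offsets = aggregate([inst1, inst2])
print(merged)                                                # (3, [[1, 2], [-1], [3]])
print(split_assignment([0, 1, 1], [inst1, inst2], offsets))  # [[0, 1], [1]]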

Cite as

Elazar Goldenberg and Karthik C. S.. Hardness Amplification of Optimization Problems. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 1:1-1:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)



@InProceedings{goldenberg_et_al:LIPIcs.ITCS.2020.1,
  author =	{Goldenberg, Elazar and Karthik C. S.},
  title =	{{Hardness Amplification of Optimization Problems}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{1:1--1:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.1},
  URN =		{urn:nbn:de:0030-drops-116863},
  doi =		{10.4230/LIPIcs.ITCS.2020.1},
  annote =	{Keywords: hardness amplification, average case complexity, direct product, optimization problems, fine-grained complexity, TFNP}
}
Document
Smooth and Strong PCPs

Authors: Orr Paradise

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
Probabilistically checkable proofs (PCPs) can be verified based only on a constant amount of random queries, such that any correct claim has a proof that is always accepted, and incorrect claims are rejected with high probability (regardless of the given alleged proof). We consider two possible features of PCPs: - A PCP is strong if it rejects an alleged proof of a correct claim with probability proportional to its distance from some correct proof of that claim. - A PCP is smooth if each location in a proof is queried with equal probability. We prove that all sets in NP have PCPs that are both smooth and strong, are of polynomial length, and can be verified based on a constant number of queries. This is achieved by following the proof of the PCP theorem of Arora, Lund, Motwani, Sudan and Szegedy (JACM, 1998), providing a stronger analysis of the Hadamard and Reed-Muller based PCPs and a refined PCP composition theorem. In fact, we show that any set in NP has a smooth strong canonical PCP of Proximity (PCPP), meaning that there is an efficiently computable bijection of NP witnesses to correct proofs. This improves on the recent construction of Dinur, Gur and Goldreich (ITCS, 2019) of PCPPs that are strong canonical but inherently non-smooth. Our result implies the hardness of approximating the satisfiability of "stable" 3CNF formulae with bounded variable occurrence, where stable means that the number of clauses violated by an assignment is proportional to its distance from a satisfying assignment (in the relative Hamming metric). This proves a hypothesis used in the work of Friggstad, Khodamoradi and Salavatipour (SODA, 2019), suggesting a connection between the hardness of these instances and other stable optimization problems.

Cite as

Orr Paradise. Smooth and Strong PCPs. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 2:1-2:41, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)



@InProceedings{paradise:LIPIcs.ITCS.2020.2,
  author =	{Paradise, Orr},
  title =	{{Smooth and Strong PCPs}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{2:1--2:41},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.2},
  URN =		{urn:nbn:de:0030-drops-116875},
  doi =		{10.4230/LIPIcs.ITCS.2020.2},
  annote =	{Keywords: Interactive and probabilistic proof systems, Probabilistically checkable proofs, Hardness of approximation}
}
Document
RANDOM
Direct Sum Testing: The General Case

Authors: Irit Dinur and Konstantin Golubev

Published in: LIPIcs, Volume 145, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2019)


Abstract
A function f:[n_1] × ... × [n_d] → F_2 is a direct sum if it is of the form f(a_1,...,a_d) = f_1(a_1) ⊕ ... ⊕ f_d(a_d), for some d functions f_i:[n_i] → F_2 for all i = 1,...,d, and where n_1,...,n_d ∈ ℕ. We present a 4-query test which distinguishes between direct sums and functions that are far from them. The test relies on the BLR linearity test (Blum, Luby, Rubinfeld, 1993) and on the direct product test constructed by Dinur & Steurer (2014). We also present a different test, which queries the function (d+1) times, but is easier to analyze. In multiplicative ±1 notation, this reads as follows. A d-dimensional tensor with ±1 entries is called a tensor product if it is a tensor product of d vectors with ±1 entries, or equivalently, if it is of rank 1. The presented tests can be read as tests for distinguishing between tensor products and tensors that are far from being tensor products.
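
For concreteness (an editorial exhaustive check of the definition, not the paper's 4-query or (d+1)-query test; names are illustrative): a function is a direct sum exactly when each of its values is determined, over F_2, by its values along the axes through a fixed reference point, here taken to be (0,...,0).

from itertools import product

def is_direct_sum(f, dims):
    # f maps tuples in [n_1] x ... x [n_d] to {0, 1}; dims = (n_1, ..., n_d).
    # f is a direct sum iff for every a:
    #   f(a) = f(a_1,0,...,0) xor ... xor f(0,...,0,a_d) xor (d-1)*f(0,...,0).
    d = len(dims)
    zero = (0,) * d
    base = f(zero)
    for a in product(*(range(n) for n in dims)):
        axis_values = [f(zero[:i] + (a[i],) + zero[i + 1:]) for i in range(d)]
        if f(a) != (sum(axis_values) + (d - 1) * base) % 2:
            return False
    return True

# A genuine direct sum on [3] x [2] x [2] ...
f1, f2, f3 = [0, 1, 1], [1, 0], [0, 0]
g = lambda a: (f1[a[0]] + f2[a[1]] + f3[a[2]]) % 2
print(is_direct_sum(g, (3, 2, 2)))                   # True
# ... and a function that is not one: x*y has no additive decomposition.
print(is_direct_sum(lambda a: a[0] * a[1], (2, 2)))  # False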

Cite as

Irit Dinur and Konstantin Golubev. Direct Sum Testing: The General Case. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 145, pp. 40:1-40:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)



@InProceedings{dinur_et_al:LIPIcs.APPROX-RANDOM.2019.40,
  author =	{Dinur, Irit and Golubev, Konstantin},
  title =	{{Direct Sum Testing: The General Case}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2019)},
  pages =	{40:1--40:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-125-2},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{145},
  editor =	{Achlioptas, Dimitris and V\'{e}gh, L\'{a}szl\'{o} A.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX-RANDOM.2019.40},
  URN =		{urn:nbn:de:0030-drops-112554},
  doi =		{10.4230/LIPIcs.APPROX-RANDOM.2019.40},
  annote =	{Keywords: property testing, direct sum, tensor product}
}
Document
UG-Hardness to NP-Hardness by Losing Half

Authors: Amey Bhangale and Subhash Khot

Published in: LIPIcs, Volume 137, 34th Computational Complexity Conference (CCC 2019)


Abstract
The 2-to-2 Games Theorem of [Subhash Khot et al., 2017; Dinur et al., 2018; Dinur et al., 2018; Dinur et al., 2018] implies that it is NP-hard to distinguish between Unique Games instances with an assignment satisfying at least a (1/2-ε) fraction of the constraints vs. no assignment satisfying more than an ε fraction of the constraints, for every constant ε > 0. We show that the reduction can be transformed in a non-trivial way to give a stronger guarantee in the completeness case: for at least a (1/2-ε) fraction of the vertices on one side, all the constraints associated with them in the Unique Games instance can be satisfied. We use this guarantee to convert the known UG-hardness results to NP-hardness. We show: 1) Tight NP-hardness of approximating independent sets in degree-d graphs within a factor of Ω(d/log² d), where d is a constant. 2) NP-hardness of approximating the Maximum Acyclic Subgraph problem within a factor of 2/3+ε, improving on the previous ratio of 14/15+ε by Austrin et al. [Austrin et al., 2015]. 3) For any predicate P^{-1}(1) ⊆ [q]^k supporting a balanced pairwise independent distribution, given a P-CSP instance with value at least 1/2-ε, it is NP-hard to satisfy more than a |P^{-1}(1)|/q^k + ε fraction of the constraints.

Cite as

Amey Bhangale and Subhash Khot. UG-Hardness to NP-Hardness by Losing Half. In 34th Computational Complexity Conference (CCC 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 137, pp. 3:1-3:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)



@InProceedings{bhangale_et_al:LIPIcs.CCC.2019.3,
  author =	{Bhangale, Amey and Khot, Subhash},
  title =	{{UG-Hardness to NP-Hardness by Losing Half}},
  booktitle =	{34th Computational Complexity Conference (CCC 2019)},
  pages =	{3:1--3:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-116-0},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{137},
  editor =	{Shpilka, Amir},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2019.3},
  URN =		{urn:nbn:de:0030-drops-108258},
  doi =		{10.4230/LIPIcs.CCC.2019.3},
  annote =	{Keywords: NP-hardness, Inapproximability, Unique Games Conjecture}
}
Document
From Local to Robust Testing via Agreement Testing

Authors: Irit Dinur, Prahladh Harsha, Tali Kaufman, and Noga Ron-Zewi

Published in: LIPIcs, Volume 124, 10th Innovations in Theoretical Computer Science Conference (ITCS 2019)


Abstract
A local tester for an error-correcting code is a probabilistic procedure that queries a small subset of coordinates, accepts codewords with probability one, and rejects non-codewords with probability proportional to their distance from the code. The local tester is robust if for non-codewords it satisfies the stronger property that the average distance of local views from accepting views is proportional to the distance from the code. Robust testing is an important component in constructions of locally testable codes and probabilistically checkable proofs as it allows for composition of local tests. In this work we show that for certain codes, any (natural) local tester can be converted to a robust tester with roughly the same number of queries. Our result holds for the class of affine-invariant lifted codes, which is a broad class of codes that includes Reed-Muller codes, as well as recent constructions of high-rate locally testable codes (Guo, Kopparty, and Sudan, ITCS 2013). Instantiating this with known local testing results for lifted codes gives a more direct proof that improves some of the parameters of the main result of Guo, Haramaty, and Sudan (FOCS 2015), showing robustness of lifted codes. To obtain the above transformation we relate the notions of local testing and robust testing to the notion of agreement testing that attempts to find out whether valid partial assignments can be stitched together into a global codeword. We first show that agreement testing implies robust testing, and then show that local testing implies agreement testing. Our proof is combinatorial, and is based on expansion/sampling properties of the collection of local views of local testers. Thus, it immediately applies to local testers of lifted codes that query random affine subspaces in F_q^m, and moreover seems amenable to extension to other families of locally testable codes with expanding families of local views.

Cite as

Irit Dinur, Prahladh Harsha, Tali Kaufman, and Noga Ron-Zewi. From Local to Robust Testing via Agreement Testing. In 10th Innovations in Theoretical Computer Science Conference (ITCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 124, pp. 29:1-29:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)



@InProceedings{dinur_et_al:LIPIcs.ITCS.2019.29,
  author =	{Dinur, Irit and Harsha, Prahladh and Kaufman, Tali and Ron-Zewi, Noga},
  title =	{{From Local to Robust Testing via Agreement Testing}},
  booktitle =	{10th Innovations in Theoretical Computer Science Conference (ITCS 2019)},
  pages =	{29:1--29:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-095-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{124},
  editor =	{Blum, Avrim},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2019.29},
  URN =		{urn:nbn:de:0030-drops-101221},
  doi =		{10.4230/LIPIcs.ITCS.2019.29},
  annote =	{Keywords: Local testing, Robust testing, Agreement testing, Affine-invariant codes, Lifted codes}
}
Document
Every Set in P Is Strongly Testable Under a Suitable Encoding

Authors: Irit Dinur, Oded Goldreich, and Tom Gur

Published in: LIPIcs, Volume 124, 10th Innovations in Theoretical Computer Science Conference (ITCS 2019)


Abstract
We show that every set in P is strongly testable under a suitable encoding. By "strongly testable" we mean having a (proximity oblivious) tester that makes a constant number of queries and rejects with probability that is proportional to the distance of the tested object from the property. By a "suitable encoding" we mean one that is polynomial-time computable and invertible. This result stands in contrast to the known fact that some sets in P are extremely hard to test, providing another demonstration of the crucial role of representation in the context of property testing. The testing result is proved by showing that any set in P has a strong canonical PCP, where canonical means that (for yes-instances) there exists a single proof that is accepted with probability 1 by the system, whereas all other potential proofs are rejected with probability proportional to their distance from this proof. In fact, we show that UP equals the class of sets having strong canonical PCPs (of logarithmic randomness), whereas the class of sets having strong canonical PCPs with polynomial proof length equals "unambiguous-MA". Actually, for the testing result, we use a PCP-of-Proximity version of the foregoing notion and an analogous positive result (i.e., strong canonical PCPPs of logarithmic randomness for any set in UP).

Cite as

Irit Dinur, Oded Goldreich, and Tom Gur. Every Set in P Is Strongly Testable Under a Suitable Encoding. In 10th Innovations in Theoretical Computer Science Conference (ITCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 124, pp. 30:1-30:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)



@InProceedings{dinur_et_al:LIPIcs.ITCS.2019.30,
  author =	{Dinur, Irit and Goldreich, Oded and Gur, Tom},
  title =	{{Every Set in P Is Strongly Testable Under a Suitable Encoding}},
  booktitle =	{10th Innovations in Theoretical Computer Science Conference (ITCS 2019)},
  pages =	{30:1--30:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-095-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{124},
  editor =	{Blum, Avrim},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2019.30},
  URN =		{urn:nbn:de:0030-drops-101234},
  doi =		{10.4230/LIPIcs.ITCS.2019.30},
  annote =	{Keywords: Probabilistically checkable proofs, property testing}
}
Document
On Polynomial Time Constructions of Minimum Height Decision Tree

Authors: Nader H. Bshouty and Waseem Makhoul

Published in: LIPIcs, Volume 123, 29th International Symposium on Algorithms and Computation (ISAAC 2018)


Abstract
A decision tree T in B_m := {0,1}^m is a binary tree in which each internal node is labeled with an integer in [m] = {1,2,...,m}, each leaf is labeled with an assignment a ∈ B_m, and each internal node has two outgoing edges labeled with 0 and 1, respectively. Let A ⊆ {0,1}^m. We say that T is a decision tree for A if (1) for every a ∈ A there is one leaf of T that is labeled with a, and (2) for every path from the root to a leaf with internal nodes labeled i_1, i_2, ..., i_k ∈ [m], a leaf labeled with a ∈ A and edges labeled ξ_{i_1}, ..., ξ_{i_k} ∈ {0,1}, the assignment a is the only element in A that satisfies a_{i_j} = ξ_{i_j} for all j = 1, ..., k. Our goal is to give a polynomial time (in n := |A| and m) algorithm that, for an input A ⊆ B_m, outputs a decision tree for A of minimum depth. This problem has many applications, including computer vision, group testing, exact learning from membership queries, and game theory. Arkin et al. and Moshkov [Esther M. Arkin et al., 1998; Mikhail Ju. Moshkov, 2004] gave a polynomial time (ln |A|)-approximation algorithm (for the depth). The result of Dinur and Steurer [Irit Dinur and David Steurer, 2014] for set cover implies that this problem cannot be approximated with ratio (1-o(1))·ln |A|, unless P=NP. Moshkov studied in [Mikhail Ju. Moshkov, 2004; Mikhail Ju. Moshkov, 1982; Mikhail Ju. Moshkov, 1982] the combinatorial measure of extended teaching dimension of A, ETD(A). He showed that ETD(A) is a lower bound for the depth of a decision tree for A, and then gave an exponential time ETD(A)/log(ETD(A))-approximation algorithm and a polynomial time 2(ln 2)ETD(A)-approximation algorithm. In this paper we further study the ETD(A) measure and a new combinatorial measure, DEN(A), which we call the density of the set A. We show that DEN(A) ≤ ETD(A)+1. We then give two results. The first result is that the lower bound ETD(A) of Moshkov for the depth of the decision tree for A is greater than the bounds obtained by the classical technique used in the literature. The second result is a polynomial time (ln 2)DEN(A)-approximation (and therefore (ln 2)ETD(A)-approximation) algorithm for the depth of the decision tree of A. We then apply the above results to learning the class of disjunctions of predicates from membership queries [Nader H. Bshouty et al., 2017]. We show that the ETD of this class is bounded from above by the degree d of its Hasse diagram. We then show that Moshkov's algorithm can be run in polynomial time and is a (d/log d)-approximation algorithm. This gives optimal algorithms when the degree is constant, for example for learning axis-parallel rays over a constant-dimensional space.
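
To fix ideas (an editorial greedy sketch with no approximation guarantee; it is neither Moshkov's algorithm nor the paper's DEN-based one), a decision tree for a set A of distinct binary strings can be built by repeatedly querying the coordinate that splits the surviving assignments most evenly:

def build_tree(A, m):
    # A: collection of distinct m-bit tuples.
    # Returns ('leaf', a) or ('node', i, subtree_for_0, subtree_for_1).
    A = list(A)
    if len(A) == 1:
        return ("leaf", A[0])
    # Most balanced coordinate; distinctness guarantees it is non-constant on A.
    best = min(range(m), key=lambda i: abs(2 * sum(a[i] for a in A) - len(A)))
    zeros = [a for a in A if a[best] == 0]
    ones = [a for a in A if a[best] == 1]
    return ("node", best, build_tree(zeros, m), build_tree(ones, m))

def depth(tree):
    return 0 if tree[0] == "leaf" else 1 + max(depth(tree[2]), depth(tree[3]))

A = {(0, 0, 1), (0, 1, 0), (1, 0, 0), (1, 1, 1)}
print(depth(build_tree(A, 3)))   # 2, matching the ceil(log2 |A|) lower bound on depth for this A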

Cite as

Nader H. Bshouty and Waseem Makhoul. On Polynomial Time Constructions of Minimum Height Decision Tree. In 29th International Symposium on Algorithms and Computation (ISAAC 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 123, pp. 34:1-34:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)



@InProceedings{h.bshouty_et_al:LIPIcs.ISAAC.2018.34,
  author =	{H. Bshouty, Nader and Makhoul, Waseem},
  title =	{{On Polynomial Time Constructions of Minimum Height Decision Tree}},
  booktitle =	{29th International Symposium on Algorithms and Computation (ISAAC 2018)},
  pages =	{34:1--34:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-094-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{123},
  editor =	{Hsu, Wen-Lian and Lee, Der-Tsai and Liao, Chung-Shou},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2018.34},
  URN =		{urn:nbn:de:0030-drops-99824},
  doi =		{10.4230/LIPIcs.ISAAC.2018.34},
  annote =	{Keywords: Decision Tree, Minimal Depth, Approximation algorithms}
}
  • Refine by Author
  • 12 Dinur, Irit
  • 5 Harsha, Prahladh
  • 2 Bhangale, Amey
  • 2 C. S., Karthik
  • 2 Filmus, Yuval

  • Refine by Classification
  • 3 Theory of computation → Expander graphs and randomness extractors
  • 3 Theory of computation → Problems, reductions and completeness
  • 2 Theory of computation → Approximation algorithms analysis
  • 2 Theory of computation → Computational complexity and cryptography
  • 2 Theory of computation → Probabilistic computation

  • Refine by Keyword
  • 3 PCP
  • 2 Hardness of Approximation
  • 2 Hardness of approximation
  • 2 Inapproximability
  • 2 Parameterized Complexity

  • Refine by Type
  • 24 document

  • Refine by Publication Year
  • 5 2018
  • 4 2019
  • 3 2017
  • 3 2020
  • 3 2021
