12 Search Results for "Gur, Tom"


Document
Track A: Algorithms, Complexity and Games
Relaxed Locally Correctable Codes with Improved Parameters

Authors: Vahid R. Asadi and Igor Shinkar

Published in: LIPIcs, Volume 198, 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)


Abstract
Locally decodable codes (LDCs) are error-correcting codes C: Σ^k → Σ^n that admit a local decoding algorithm that recovers each individual bit of the message by querying only a few bits from a noisy codeword. An important question in this line of research is to understand the optimal trade-off between the query complexity of LDCs and their block length. Despite the importance of these objects, the best known constructions of constant-query LDCs have super-polynomial length, and there is a significant gap between the best constructions and the known lower bounds in terms of the block length. For many applications it suffices to consider the weaker notion of relaxed LDCs (RLDCs), which allows the local decoding algorithm to abort if by querying a few bits it detects that the input is not a codeword. This relaxation turned out to allow decoding algorithms with constant query complexity for codes with almost linear length. Specifically, [Ben-Sasson et al., 2006] constructed a q-query RLDC that encodes a message of length k using a codeword of block length n = O_q(k^{1+O(1/√q)}) for any sufficiently large q, where O_q(⋅) hides a constant that depends only on q. In this work we improve the parameters of [Ben-Sasson et al., 2006] by constructing a q-query RLDC that encodes a message of length k using a codeword of block length O_q(k^{1+O(1/q)}) for any sufficiently large q. This construction matches (up to a multiplicative constant factor) the lower bounds of [Katz and Trevisan, 2000; Woodruff, 2007] for constant-query LDCs, thus making progress toward understanding the gap between LDCs and RLDCs in the constant-query regime. In fact, our construction extends to the stronger notion of relaxed locally correctable codes (RLCCs), introduced in [Gur et al., 2018], where given a noisy codeword the correcting algorithm either recovers each individual bit of the codeword by reading only a small part of the input, or aborts if the input is detected to be corrupt.
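
Illustrative sketch (not part of the paper): to make the relaxed decoding model concrete, the following minimal Python example shows a relaxed local decoder for a toy 3-fold repetition code. All names are illustrative; the decoder reads only the three copies of the requested bit and either returns it or aborts on inconsistency, mirroring the three-valued behaviour of an RLDC decoder.

# Toy illustration of a relaxed local decoder (not the paper's construction).
# Code: the i-th message bit is repeated in positions 3i, 3i+1, 3i+2.

def encode(message):
    """Encode a list of bits by repeating each bit three times."""
    return [b for bit in message for b in (bit, bit, bit)]

def relaxed_decode(word, i):
    """Recover message bit i from a (possibly corrupted) codeword.

    Returns 0 or 1 if the three copies agree, and "bot" (abort) otherwise.
    On an uncorrupted codeword the answer is always correct; under light
    corruption, most indices still decode correctly and the rest abort.
    """
    copies = word[3 * i : 3 * i + 3]          # only 3 queries
    return copies[0] if len(set(copies)) == 1 else "bot"

if __name__ == "__main__":
    msg = [1, 0, 1, 1]
    cw = encode(msg)
    cw[1] ^= 1                                # corrupt one codeword symbol
    print([relaxed_decode(cw, i) for i in range(len(msg))])
    # ['bot', 0, 1, 1]: index 0 aborts, the remaining indices decode correctly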

Cite as

Vahid R. Asadi and Igor Shinkar. Relaxed Locally Correctable Codes with Improved Parameters. In 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 198, pp. 18:1-18:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{asadi_et_al:LIPIcs.ICALP.2021.18,
  author =	{Asadi, Vahid R. and Shinkar, Igor},
  title =	{{Relaxed Locally Correctable Codes with Improved Parameters}},
  booktitle =	{48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)},
  pages =	{18:1--18:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-195-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{198},
  editor =	{Bansal, Nikhil and Merelli, Emanuela and Worrell, James},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2021.18},
  URN =		{urn:nbn:de:0030-drops-140878},
  doi =		{10.4230/LIPIcs.ICALP.2021.18},
  annote =	{Keywords: Algorithmic coding theory, consistency test using random walk, Reed-Muller code, relaxed locally decodable codes, relaxed locally correctable codes}
}
Document
Smooth and Strong PCPs

Authors: Orr Paradise

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
Probabilistically checkable proofs (PCPs) can be verified based only on a constant number of random queries, such that any correct claim has a proof that is always accepted, and incorrect claims are rejected with high probability (regardless of the given alleged proof). We consider two possible features of PCPs: a PCP is strong if it rejects an alleged proof of a correct claim with probability proportional to its distance from some correct proof of that claim, and a PCP is smooth if each location in a proof is queried with equal probability. We prove that all sets in NP have PCPs that are both smooth and strong, are of polynomial length, and can be verified based on a constant number of queries. This is achieved by following the proof of the PCP theorem of Arora, Lund, Motwani, Sudan and Szegedy (JACM, 1998), providing a stronger analysis of the Hadamard- and Reed-Muller-based PCPs and a refined PCP composition theorem. In fact, we show that any set in NP has a smooth strong canonical PCP of Proximity (PCPP), meaning that there is an efficiently computable bijection of NP witnesses to correct proofs. This improves on the recent construction of Dinur, Goldreich and Gur (ITCS, 2019) of PCPPs that are strong canonical but inherently non-smooth. Our result implies the hardness of approximating the satisfiability of "stable" 3CNF formulae with bounded variable occurrence, where stable means that the number of clauses violated by an assignment is proportional to its distance from a satisfying assignment (in the relative Hamming metric). This proves a hypothesis used in the work of Friggstad, Khodamoradi and Salavatipour (SODA, 2019), suggesting a connection between the hardness of these instances and other stable optimization problems.
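
Illustrative sketch (not the PCPs constructed in the paper): the classical BLR linearity test underlying Hadamard-based PCPs already displays both features informally. Each of its three query points is uniformly distributed (smoothness), and its rejection probability grows with the distance of the queried table from the nearest linear function (a strong-style guarantee). A minimal Python sketch with illustrative names:

import random

def blr_linearity_test(f, n, trials=200):
    """BLR test: pick x, y uniformly in {0,1}^n and check f(x) XOR f(y) == f(x XOR y).

    Each individual query point (x, y, and x XOR y) is uniformly distributed,
    and the fraction of rejecting trials grows with the distance of f from the
    nearest GF(2)-linear function.
    """
    rejects = 0
    for _ in range(trials):
        x = random.getrandbits(n)
        y = random.getrandbits(n)
        if f(x) ^ f(y) != f(x ^ y):
            rejects += 1
    return rejects / trials

if __name__ == "__main__":
    n = 10
    linear = lambda x: bin(x & 0b1011).count("1") % 2        # parity of a fixed subset: linear
    noisy = lambda x: linear(x) ^ (1 if x % 7 == 0 else 0)   # corrupted on some inputs
    print(blr_linearity_test(linear, n))   # 0.0
    print(blr_linearity_test(noisy, n))    # bounded away from 0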

Cite as

Orr Paradise. Smooth and Strong PCPs. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 2:1-2:41, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{paradise:LIPIcs.ITCS.2020.2,
  author =	{Paradise, Orr},
  title =	{{Smooth and Strong PCPs}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{2:1--2:41},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.2},
  URN =		{urn:nbn:de:0030-drops-116875},
  doi =		{10.4230/LIPIcs.ITCS.2020.2},
  annote =	{Keywords: Interactive and probabilistic proof systems, Probabilistically checkable proofs, Hardness of approximation}
}
Document
High-Dimensional Expanders from Expanders

Authors: Siqi Liu, Sidhanth Mohanty, and Elizabeth Yang

Published in: LIPIcs, Volume 151, 11th Innovations in Theoretical Computer Science Conference (ITCS 2020)


Abstract
We present an elementary way to transform an expander graph into a simplicial complex where all high-order random walks have a constant spectral gap, i.e., they converge rapidly to the stationary distribution. As an upshot, we obtain new constructions, as well as a natural probabilistic model to sample constant-degree high-dimensional expanders. In particular, we show that given an expander graph G, adding self-loops to G and taking the tensor product of the modified graph with a high-dimensional expander produces a new high-dimensional expander. Our proof of rapid mixing of high-order random walks is based on the decomposable Markov chains framework introduced by [Jerrum et al., 2004].
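
Illustrative sketch (a one-dimensional analogue only; the paper's analysis concerns high-order walks on simplicial complexes): for the graph tensor (Kronecker) product, the eigenvalues of the random-walk matrices multiply, so combining two graphs with constant spectral gap keeps a constant gap. A minimal numpy sketch, with toy graphs standing in for the expander and the high-dimensional expander:

import numpy as np

def walk_matrix(adj):
    """Row-normalize an adjacency matrix into a random-walk matrix."""
    return adj / adj.sum(axis=1, keepdims=True)

def second_eigenvalue(walk):
    """Magnitude of the second-largest eigenvalue (1 minus the spectral gap)."""
    return np.sort(np.abs(np.linalg.eigvals(walk)))[::-1][1]

# Toy stand-ins: the complete graph K5 as the "expander" ...
A = np.ones((5, 5)) - np.eye(5)
A_loops = A + np.eye(5)                       # ... with self-loops added,
# ... and K4 standing in for (the 1-skeleton of) a high-dimensional expander.
B = np.ones((4, 4)) - np.eye(4)

# Eigenvalues of a Kronecker product are products of the factors' eigenvalues,
# so the tensored walk keeps a constant spectral gap when both factors have one.
W = np.kron(walk_matrix(A_loops), walk_matrix(B))
print(second_eigenvalue(walk_matrix(A_loops)),
      second_eigenvalue(walk_matrix(B)),
      second_eigenvalue(W))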

Cite as

Siqi Liu, Sidhanth Mohanty, and Elizabeth Yang. High-Dimensional Expanders from Expanders. In 11th Innovations in Theoretical Computer Science Conference (ITCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 151, pp. 12:1-12:32, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{liu_et_al:LIPIcs.ITCS.2020.12,
  author =	{Liu, Siqi and Mohanty, Sidhanth and Yang, Elizabeth},
  title =	{{High-Dimensional Expanders from Expanders}},
  booktitle =	{11th Innovations in Theoretical Computer Science Conference (ITCS 2020)},
  pages =	{12:1--12:32},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-134-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{151},
  editor =	{Vidick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2020.12},
  URN =		{urn:nbn:de:0030-drops-116974},
  doi =		{10.4230/LIPIcs.ITCS.2020.12},
  annote =	{Keywords: High-Dimensional Expanders, Markov Chains}
}
Document
Non-Malleable Extractors and Non-Malleable Codes: Partially Optimal Constructions

Authors: Xin Li

Published in: LIPIcs, Volume 137, 34th Computational Complexity Conference (CCC 2019)


Abstract
The recent line of study on randomness extractors has been a great success, resulting in exciting new techniques, new connections, and breakthroughs on long-standing open problems in several seemingly different topics. These include seeded non-malleable extractors, privacy amplification protocols with an active adversary, independent source extractors (and explicit Ramsey graphs), and non-malleable codes in the split-state model. Previously, the best constructions were given in [Li, 2017]: seeded non-malleable extractors with seed length and entropy requirement O(log n + log(1/epsilon) log log(1/epsilon)) for error epsilon; two-round privacy amplification protocols with optimal entropy loss for security parameter up to Omega(k/log k), where k is the entropy of the shared weak source; two-source extractors for entropy O(log n log log n); and non-malleable codes in the 2-split-state model with rate Omega(1/log n). However, in all cases there is still a gap to the optimum and the motivation to close this gap remains strong. In this paper, we introduce a set of new techniques to further push the frontier in the above questions. Our techniques lead to improvements in all of the above questions, and in several cases partially optimal constructions. This is in contrast to all previous work, which only obtained close-to-optimal constructions. Specifically, we obtain: 1) a seeded non-malleable extractor with seed length O(log n) + log^{1+o(1)}(1/epsilon) and entropy requirement O(log log n + log(1/epsilon)), where the entropy requirement is asymptotically optimal by a recent result of Gur and Shinkar [2018]; 2) a two-round privacy amplification protocol with optimal entropy loss for security parameter up to Omega(k), which solves the privacy amplification problem completely; 3) a two-source extractor for entropy O((log n log log n)/(log log log n)), which also gives an explicit Ramsey graph on N vertices with no clique or independent set of size (log N)^{O((log log log N)/(log log log log N))}; and 4) the first explicit non-malleable code in the 2-split-state model with constant rate, which has been a major goal in the study of non-malleable codes for quite some time. One small caveat is that the error of this code is only (an arbitrarily small) constant, but we can also achieve negligible error with rate Omega(log log log n/log log n), which already improves the rate in [Li, 2017] exponentially. We believe our new techniques can help to eventually obtain completely optimal constructions in the above questions, and may have applications in other settings.
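
Illustrative sketch (a classical example of the two-source extractor notion only, not any construction from the paper): the inner-product extractor outputs the GF(2) inner product of two independent weak sources and is known to work when both sources have min-entropy rate above 1/2, far from the near-logarithmic entropy regime targeted in this line of work. A minimal Python sketch with illustrative toy sources:

import random

def inner_product_extractor(x, y):
    """GF(2) inner product of two n-bit samples: a classical two-source extractor.

    If X and Y are independent and each has min-entropy rate above 1/2, the
    output bit is statistically close to uniform.
    """
    return bin(x & y).count("1") % 2

if __name__ == "__main__":
    # Two toy 16-bit weak sources, each with 10 uniformly random bits.
    sample_x = lambda: (0b101010 << 10) | random.getrandbits(10)   # top 6 bits fixed
    sample_y = lambda: (random.getrandbits(10) << 6) | 0b110011    # bottom 6 bits fixed
    outs = [inner_product_extractor(sample_x(), sample_y()) for _ in range(20000)]
    print(sum(outs) / len(outs))   # empirically close to 1/2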

Cite as

Xin Li. Non-Malleable Extractors and Non-Malleable Codes: Partially Optimal Constructions. In 34th Computational Complexity Conference (CCC 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 137, pp. 28:1-28:49, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{li:LIPIcs.CCC.2019.28,
  author =	{Li, Xin},
  title =	{{Non-Malleable Extractors and Non-Malleable Codes: Partially Optimal Constructions}},
  booktitle =	{34th Computational Complexity Conference (CCC 2019)},
  pages =	{28:1--28:49},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-116-0},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{137},
  editor =	{Shpilka, Amir},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2019.28},
  URN =		{urn:nbn:de:0030-drops-108507},
  doi =		{10.4230/LIPIcs.CCC.2019.28},
  annote =	{Keywords: extractor, non-malleable, privacy, codes}
}
Document
Every Set in P Is Strongly Testable Under a Suitable Encoding

Authors: Irit Dinur, Oded Goldreich, and Tom Gur

Published in: LIPIcs, Volume 124, 10th Innovations in Theoretical Computer Science Conference (ITCS 2019)


Abstract
We show that every set in P is strongly testable under a suitable encoding. By "strongly testable" we mean having a (proximity-oblivious) tester that makes a constant number of queries and rejects with probability that is proportional to the distance of the tested object from the property. By a "suitable encoding" we mean one that is polynomial-time computable and invertible. This result stands in contrast to the known fact that some sets in P are extremely hard to test, providing another demonstration of the crucial role of representation in the context of property testing. The testing result is proved by showing that any set in P has a strong canonical PCP, where canonical means that (for yes-instances) there exists a single proof that is accepted with probability 1 by the system, whereas all other potential proofs are rejected with probability proportional to their distance from this proof. In fact, we show that UP equals the class of sets having strong canonical PCPs (of logarithmic randomness), whereas the class of sets having strong canonical PCPs with polynomial proof length equals "unambiguous-MA". Actually, for the testing result, we use a PCP-of-Proximity version of the foregoing notion and an analogous positive result (i.e., strong canonical PCPPs of logarithmic randomness for any set in UP).
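
Illustrative sketch (a toy property only; the encodings constructed in the paper are far more involved): the notion of strong testability is easiest to see for the property {0^n}. A tester that reads a single uniformly random position and rejects on a 1 is proximity-oblivious, and its rejection probability equals exactly the relative Hamming distance of the input from the property.

import random

def po_tester_all_zeros(x):
    """Proximity-oblivious tester for the toy property {0^n}.

    Makes a single query and accepts iff the queried position is 0.  The
    rejection probability equals the relative Hamming distance of x from 0^n,
    which is precisely the "strong" rejection behaviour described above.
    """
    return x[random.randrange(len(x))] == 0

if __name__ == "__main__":
    x = [0] * 90 + [1] * 10          # relative distance 0.1 from all-zeros
    trials = 50000
    rejections = sum(1 for _ in range(trials) if not po_tester_all_zeros(x))
    print(rejections / trials)       # approximately 0.1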

Cite as

Irit Dinur, Oded Goldreich, and Tom Gur. Every Set in P Is Strongly Testable Under a Suitable Encoding. In 10th Innovations in Theoretical Computer Science Conference (ITCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 124, pp. 30:1-30:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{dinur_et_al:LIPIcs.ITCS.2019.30,
  author =	{Dinur, Irit and Goldreich, Oded and Gur, Tom},
  title =	{{Every Set in P Is Strongly Testable Under a Suitable Encoding}},
  booktitle =	{10th Innovations in Theoretical Computer Science Conference (ITCS 2019)},
  pages =	{30:1--30:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-095-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{124},
  editor =	{Blum, Avrim},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2019.30},
  URN =		{urn:nbn:de:0030-drops-101234},
  doi =		{10.4230/LIPIcs.ITCS.2019.30},
  annote =	{Keywords: Probabilistically checkable proofs, property testing}
}
Document
An Exponential Separation Between MA and AM Proofs of Proximity

Authors: Tom Gur, Yang P. Liu, and Ron D. Rothblum

Published in: LIPIcs, Volume 107, 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)


Abstract
Interactive proofs of proximity allow a sublinear-time verifier to check that a given input is close to the language, using a small amount of communication with a powerful (but untrusted) prover. In this work we consider two natural minimally interactive variants of such proof systems, in which the prover only sends a single message, referred to as the proof. The first variant, known as MA-proofs of Proximity (MAP), is fully non-interactive, meaning that the proof is a function of the input only. The second variant, known as AM-proofs of Proximity (AMP), allows the proof to additionally depend on the verifier's (entire) random string. The complexity of both MAPs and AMPs is the total number of bits that the verifier observes, namely, the sum of the proof length and query complexity. Our main result is an exponential separation between the power of MAPs and AMPs. Specifically, we exhibit an explicit and natural property Pi that admits an AMP with complexity O(log n), whereas any MAP for Pi has complexity Omega~(n^{1/4}), where n denotes the length of the input in bits. Our MAP lower bound also yields an alternate proof, which is more general and arguably much simpler, for a recent result of Fischer et al. (ITCS, 2014). Lastly, we also consider the notion of oblivious proofs of proximity, in which the verifier's queries are oblivious to the proof. In this setting we show that AMPs can only be quadratically stronger than MAPs. As an application of this result, we show an exponential separation between the power of public and private coins for oblivious interactive proofs of proximity.
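
Illustrative sketch (a standard toy example, not the property Pi of the paper): in a MAP, the proof depends only on the input. For the property of n-bit strings of Hamming weight exactly one, the proof can simply name the index of the 1, and the verifier checks that position together with a few random others, so the total cost is roughly log n proof bits plus O(1/eps) queries. A minimal Python sketch with illustrative names:

import random

def map_verifier(word, proof_index, eps=0.1):
    """Toy MAP verifier for the property "Hamming weight exactly 1".

    The proof is the claimed index of the unique 1 (a function of the input
    only, as in the MAP model).  The verifier reads that position and O(1/eps)
    uniformly random positions, rejecting if it sees evidence against the claim.
    """
    n = len(word)
    if word[proof_index] != 1:                 # the claimed 1 must be present
        return False
    for _ in range(int(2 / eps)):              # spot-check for extra 1s
        j = random.randrange(n)
        if j != proof_index and word[j] != 0:
            return False
    return True

if __name__ == "__main__":
    good = [0] * 64
    good[17] = 1
    print(map_verifier(good, proof_index=17))            # always True
    bad = [1 if k % 4 == 0 else 0 for k in range(64)]    # far from weight 1
    accepts = sum(map_verifier(bad, proof_index=0) for _ in range(1000))
    print(accepts / 1000)                                # rejected with high probability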

Cite as

Tom Gur, Yang P. Liu, and Ron D. Rothblum. An Exponential Separation Between MA and AM Proofs of Proximity. In 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 107, pp. 73:1-73:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{gur_et_al:LIPIcs.ICALP.2018.73,
  author =	{Gur, Tom and Liu, Yang P. and Rothblum, Ron D.},
  title =	{{An Exponential Separation Between MA and AM Proofs of Proximity}},
  booktitle =	{45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)},
  pages =	{73:1--73:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-076-7},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{107},
  editor =	{Chatzigiannakis, Ioannis and Kaklamanis, Christos and Marx, D\'{a}niel and Sannella, Donald},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2018.73},
  URN =		{urn:nbn:de:0030-drops-90772},
  doi =		{10.4230/LIPIcs.ICALP.2018.73},
  annote =	{Keywords: Property testing, Probabilistic proof systems, Proofs of proximity}
}
Document
Relaxed Locally Correctable Codes

Authors: Tom Gur, Govind Ramnarayan, and Ron D. Rothblum

Published in: LIPIcs, Volume 94, 9th Innovations in Theoretical Computer Science Conference (ITCS 2018)


Abstract
Locally decodable codes (LDCs) and locally correctable codes (LCCs) are error-correcting codes in which individual bits of the message and codeword, respectively, can be recovered by querying only a few bits from a noisy codeword. These codes have found numerous applications both in theory and in practice. A natural relaxation of LDCs, introduced by Ben-Sasson et al. (SICOMP, 2006), allows the decoder to reject (i.e., refuse to answer) in case it detects that the codeword is corrupt. They call such a decoder a relaxed decoder and construct a constant-query relaxed LDC with almost-linear blocklength, which is sub-exponentially better than what is known for (full-fledged) LDCs in the constant-query regime. We consider an analogous relaxation for local correction. Thus, a relaxed local corrector reads only a few bits from a (possibly) corrupt codeword and either recovers the desired bit of the codeword, or rejects in case it detects a corruption. We give two constructions of relaxed LCCs in two regimes, where the first optimizes the query complexity and the second optimizes the rate: 1. Constant Query Complexity: A relaxed LCC with polynomial blocklength whose corrector only reads a constant number of bits of the codeword. This is a sub-exponential improvement over the best constant-query (full-fledged) LCCs that are known. 2. Constant Rate: A relaxed LCC with constant rate (i.e., linear blocklength) with quasi-polylogarithmic query complexity. This is a nearly sub-exponential improvement over the query complexity of a recent (full-fledged) constant-rate LCC of Kopparty et al. (STOC, 2016).

Cite as

Tom Gur, Govind Ramnarayan, and Ron D. Rothblum. Relaxed Locally Correctable Codes. In 9th Innovations in Theoretical Computer Science Conference (ITCS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 94, pp. 27:1-27:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{gur_et_al:LIPIcs.ITCS.2018.27,
  author =	{Gur, Tom and Ramnarayan, Govind and Rothblum, Ron D.},
  title =	{{Relaxed Locally Correctable Codes}},
  booktitle =	{9th Innovations in Theoretical Computer Science Conference (ITCS 2018)},
  pages =	{27:1--27:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-060-6},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{94},
  editor =	{Karlin, Anna R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2018.27},
  URN =		{urn:nbn:de:0030-drops-83154},
  doi =		{10.4230/LIPIcs.ITCS.2018.27},
  annote =	{Keywords: Coding Theory, Locally Correctable Codes, Probabilistically Checkable Proofs}
}
Document
Proofs of Proximity for Distribution Testing

Authors: Alessandro Chiesa and Tom Gur

Published in: LIPIcs, Volume 94, 9th Innovations in Theoretical Computer Science Conference (ITCS 2018)


Abstract
Distribution testing is an area of property testing that studies algorithms that receive few samples from a probability distribution D and decide whether D has a certain property or is far (in total variation distance) from all distributions with that property. Most natural properties of distributions, however, require a large number of samples to test, which motivates the question of whether there are natural settings wherein fewer samples suffice. We initiate a study of proofs of proximity for properties of distributions. In their basic form, these proof systems consist of a tester that not only has sample access to a distribution but also explicit access to a proof string that depends on the distribution. We refer to these as NP distribution testers, or MA distribution testers if the tester is a probabilistic algorithm. We also study the more general notion of IP distribution testers, in which the tester interacts with an all-powerful untrusted prover. We investigate the power and limitations of proofs of proximity for distributions and chart a landscape that, surprisingly, is significantly different from that of proofs of proximity for functions. Our main results include showing that MA distribution testers can be quadratically stronger than standard distribution testers, but no stronger than that; in contrast, IP distribution testers can be exponentially stronger than standard distribution testers, but when restricted to public coins they can be at best quadratically stronger.
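
Illustrative sketch (background only, not a proof system from the paper): the objects being tested are distributions under total variation distance, and the naive plug-in approach, comparing the empirical distribution of the samples to the reference distribution, needs many samples. The proof systems studied above aim to decide such questions with far fewer samples when a proof or a prover is available. A minimal Python sketch:

import random
from collections import Counter

def total_variation(p, q):
    """Total variation distance between two distributions given as dicts."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)

def naive_identity_tester(samples, p, eps):
    """Accept iff the empirical distribution of the samples is (eps/2)-close to p."""
    empirical = {x: c / len(samples) for x, c in Counter(samples).items()}
    return total_variation(empirical, p) <= eps / 2

if __name__ == "__main__":
    p = {0: 0.5, 1: 0.5}
    fair = [random.choice([0, 1]) for _ in range(5000)]
    biased = [0 if random.random() < 0.9 else 1 for _ in range(5000)]
    print(naive_identity_tester(fair, p, eps=0.2))    # True: close to p
    print(naive_identity_tester(biased, p, eps=0.2))  # False: far from p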

Cite as

Alessandro Chiesa and Tom Gur. Proofs of Proximity for Distribution Testing. In 9th Innovations in Theoretical Computer Science Conference (ITCS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 94, pp. 53:1-53:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{chiesa_et_al:LIPIcs.ITCS.2018.53,
  author =	{Chiesa, Alessandro and Gur, Tom},
  title =	{{Proofs of Proximity for Distribution Testing}},
  booktitle =	{9th Innovations in Theoretical Computer Science Conference (ITCS 2018)},
  pages =	{53:1--53:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-060-6},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{94},
  editor =	{Karlin, Anna R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2018.53},
  URN =		{urn:nbn:de:0030-drops-83114},
  doi =		{10.4230/LIPIcs.ITCS.2018.53},
  annote =	{Keywords: distribution testing, proofs of proximity, property testing}
}
Document
A Hierarchy Theorem for Interactive Proofs of Proximity

Authors: Tom Gur and Ron D. Rothblum

Published in: LIPIcs, Volume 67, 8th Innovations in Theoretical Computer Science Conference (ITCS 2017)


Abstract
The number of rounds, or round complexity, used in an interactive protocol is a fundamental resource. In this work we consider the significance of round complexity in the context of Interactive Proofs of Proximity (IPPs). Roughly speaking, IPPs are interactive proofs in which the verifier runs in sublinear time and is only required to reject inputs that are far from the language. Our main result is a round hierarchy theorem for IPPs, showing that the power of IPPs grows with the number of rounds. More specifically, we show that there exists a gap function g(r) = Theta(r^2) such that for every constant r ≥ 1 there exists a language that (1) has a g(r)-round IPP with verification time t = t(n,r) but (2) does not have an r-round IPP with verification time t (or even verification time t' = poly(t)). In fact, we prove a stronger result by exhibiting a single language L such that, for every constant r ≥ 1, there is an O(r^2)-round IPP for L with t = n^{O(1/r)} verification time, whereas the verifier in any r-round IPP for L must run in time at least t^{100}. Moreover, we show an IPP for L with a poly-logarithmic number of rounds and only poly-logarithmic verification time, yielding a sub-exponential separation between the power of constant-round IPPs versus general (unbounded-round) IPPs. From our hierarchy theorem we also derive implications to standard interactive proofs (in which the verifier can run in polynomial time). Specifically, we show that the round reduction technique of Babai and Moran (JCSS, 1988) is (almost) optimal among all black-box transformations, and we show a connection to the algebrization framework of Aaronson and Wigderson (TOCT, 2009).

Cite as

Tom Gur and Ron D. Rothblum. A Hierarchy Theorem for Interactive Proofs of Proximity. In 8th Innovations in Theoretical Computer Science Conference (ITCS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 67, pp. 39:1-39:43, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{gur_et_al:LIPIcs.ITCS.2017.39,
  author =	{Gur, Tom and Rothblum, Ron D.},
  title =	{{A Hierarchy Theorem for Interactive Proofs of Proximity}},
  booktitle =	{8th Innovations in Theoretical Computer Science Conference (ITCS 2017)},
  pages =	{39:1--39:43},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-029-3},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{67},
  editor =	{Papadimitriou, Christos H.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2017.39},
  URN =		{urn:nbn:de:0030-drops-81536},
  doi =		{10.4230/LIPIcs.ITCS.2017.39},
  annote =	{Keywords: Complexity Theory, Property Testing, Interactive Proofs}
}
Document
An Adaptivity Hierarchy Theorem for Property Testing

Authors: Clément L. Canonne and Tom Gur

Published in: LIPIcs, Volume 79, 32nd Computational Complexity Conference (CCC 2017)


Abstract
Adaptivity is known to play a crucial role in property testing. In particular, there exist properties for which there is an exponential gap between the power of adaptive testing algorithms, wherein each query may be determined by the answers received to prior queries, and their non-adaptive counterparts, in which all queries are independent of answers obtained from previous queries. In this work, we investigate the role of adaptivity in property testing at a finer level. We first quantify the degree of adaptivity of a testing algorithm by considering the number of "rounds of adaptivity" it uses. More accurately, we say that a tester is k-(round) adaptive if it makes queries in k+1 rounds, where the queries in the i'th round may depend on the answers obtained in the previous i-1 rounds. Then, we ask the following question: Does the power of testing algorithms smoothly grow with the number of rounds of adaptivity? We provide a positive answer to the foregoing question by proving an adaptivity hierarchy theorem for property testing. Specifically, our main result shows that for every n in N and 0 <= k <= n^{0.99} there exists a property Pi_{n,k} of functions for which (1) there exists a k-adaptive tester for Pi_{n,k} with query complexity tilde O(k), yet (2) any (k-1)-adaptive tester for Pi_{n,k} must make Omega(n) queries. In addition, we show that such a qualitative adaptivity hierarchy can be witnessed for testing natural properties of graphs.
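
Illustrative sketch (a toy property, not the property Pi_{n,k} constructed in the paper): one round of adaptivity already helps when a query answer is needed to decide where to look next. For arrays whose first cell stores an index and whose pointed-to cell must hold 1, a 1-adaptive tester needs only two queries, while two non-adaptive queries fixed in advance almost never see the relevant cell. A minimal Python sketch with illustrative names:

import random

def one_adaptive_tester(a):
    """1-adaptive tester for the toy property a[a[0]] == 1.

    Round 1 queries a[0]; round 2 chooses its query based on that answer,
    which is exactly what one round of adaptivity permits.
    """
    pointer = a[0]            # round-1 query
    return a[pointer] == 1    # round-2 query depends on the round-1 answer

def non_adaptive_tester(a, budget=2):
    """Non-adaptive tester with the same budget: all queries fixed up front."""
    positions = random.sample(range(len(a)), budget)
    values = {i: a[i] for i in positions}
    if 0 in values and values[0] in values:    # lucky: saw both a[0] and a[a[0]]
        return values[values[0]] == 1
    return random.random() < 0.5               # otherwise reduced to guessing

if __name__ == "__main__":
    n = 1000
    a = [0] * n
    a[0], a[437] = 437, 1
    print(one_adaptive_tester(a))                                   # True with 2 queries
    print(sum(non_adaptive_tester(a) for _ in range(1000)) / 1000)  # about 0.5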

Cite as

Clément L. Canonne and Tom Gur. An Adaptivity Hierarchy Theorem for Property Testing. In 32nd Computational Complexity Conference (CCC 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 79, pp. 27:1-27:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{canonne_et_al:LIPIcs.CCC.2017.27,
  author =	{Canonne, Cl\'{e}ment L. and Gur, Tom},
  title =	{{An Adaptivity Hierarchy Theorem for Property Testing}},
  booktitle =	{32nd Computational Complexity Conference (CCC 2017)},
  pages =	{27:1--27:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-040-8},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{79},
  editor =	{O'Donnell, Ryan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2017.27},
  URN =		{urn:nbn:de:0030-drops-75164},
  doi =		{10.4230/LIPIcs.CCC.2017.27},
  annote =	{Keywords: Property Testing, Coding Theory, Hierarchy Theorems}
}
Document
Distribution Testing Lower Bounds via Reductions from Communication Complexity

Authors: Eric Blais, Clément L. Canonne, and Tom Gur

Published in: LIPIcs, Volume 79, 32nd Computational Complexity Conference (CCC 2017)


Abstract
We present a new methodology for proving distribution testing lower bounds, establishing a connection between distribution testing and the simultaneous message passing (SMP) communication model. Extending the framework of Blais, Brody, and Matulef (Computational Complexity, 2012), we show a simple way to reduce (private-coin) SMP problems to distribution testing problems. This method allows us to prove new distribution testing lower bounds, as well as to provide simple proofs of known lower bounds. Our main result is concerned with testing identity to a specific distribution p, given as a parameter. In a recent and influential work, Valiant and Valiant (FOCS, 2014) showed that the sample complexity of the aforementioned problem is closely related to the 2/3-quasinorm of p. We obtain alternative bounds on the complexity of this problem in terms of an arguably more intuitive measure and using simpler proofs. More specifically, we prove that the sample complexity is essentially determined by a fundamental operator in the theory of interpolation of Banach spaces, known as Peetre's K-functional. We show that this quantity is closely related to the size of the effective support of p (loosely speaking, the number of supported elements that constitute the vast majority of the mass of p). This result, in turn, stems from an unexpected connection to functional analysis and refined concentration of measure inequalities, which arise naturally in our reduction.
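
Illustrative sketch (an informal proxy only; the paper phrases the precise quantity via Peetre's K-functional): the "effective support" of a distribution p is, loosely, the smallest number of elements carrying the vast majority of its mass. A minimal Python sketch with an illustrative mass threshold:

def effective_support_size(p, mass=0.9):
    """Smallest number of elements of p whose total probability is at least `mass`.

    p is a dict mapping elements to probabilities.  This is a rough proxy for
    the effective-support measure discussed above, not the exact quantity used
    in the paper.
    """
    remaining = mass
    count = 0
    for prob in sorted(p.values(), reverse=True):
        if remaining <= 0:
            break
        remaining -= prob
        count += 1
    return count

if __name__ == "__main__":
    # A heavy head plus a long light tail: a few elements carry most of the mass.
    p = {0: 0.4, 1: 0.3, 2: 0.2}
    p.update({i: 0.001 for i in range(3, 103)})
    print(effective_support_size(p))   # 3: the head already carries 0.9 of the mass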

Cite as

Eric Blais, Clément L. Canonne, and Tom Gur. Distribution Testing Lower Bounds via Reductions from Communication Complexity. In 32nd Computational Complexity Conference (CCC 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 79, pp. 28:1-28:40, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{blais_et_al:LIPIcs.CCC.2017.28,
  author =	{Blais, Eric and Canonne, Cl\'{e}ment L. and Gur, Tom},
  title =	{{Distribution Testing Lower Bounds via Reductions from Communication Complexity}},
  booktitle =	{32nd Computational Complexity Conference (CCC 2017)},
  pages =	{28:1--28:40},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-040-8},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{79},
  editor =	{O'Donnell, Ryan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2017.28},
  URN =		{urn:nbn:de:0030-drops-75366},
  doi =		{10.4230/LIPIcs.CCC.2017.28},
  annote =	{Keywords: Distribution testing, communication complexity, lower bounds, simultaneous message passing, functional analysis}
}
Document
Strong Locally Testable Codes with Relaxed Local Decoders

Authors: Oded Goldreich, Tom Gur, and Ilan Komargodski

Published in: LIPIcs, Volume 33, 30th Conference on Computational Complexity (CCC 2015)


Abstract
Locally testable codes (LTCs) are error-correcting codes that admit very efficient codeword tests. An LTC is said to be strong if it has a proximity-oblivious tester; that is, a tester that makes only a constant number of queries and rejects non-codewords with probability that depends solely on their distance from the code. Locally decodable codes (LDCs) are complementary to LTCs. While the latter allow for highly efficient rejection of strings that are far from being codewords, LDCs allow for highly efficient recovery of individual bits of the information that is encoded in strings that are close to being codewords. While there are known constructions of strong-LTCs with nearly-linear length, the existence of a constant-query LDC with polynomial length is a major open problem. In an attempt to bypass this barrier, Ben-Sasson et al. (SICOMP 2006) introduced a natural relaxation of local decodability, called relaxed-LDCs. This notion requires local recovery of nearly all individual information-bits, yet allows for recovery-failure (but not error) on the rest. Ben-Sasson et al. constructed a constant-query relaxed-LDC with nearly-linear length (i.e., length k^(1 + alpha) for an arbitrarily small constant alpha > 0, where k is the dimension of the code). This work focuses on obtaining strong testability and relaxed decodability simultaneously. We construct a family of binary linear codes of nearly-linear length that are both strong-LTCs (with one-sided error) and constant-query relaxed-LDCs. This improves upon the previously known constructions, which obtain either weak LTCs or require polynomial length. Our construction heavily relies on tensor codes and PCPs. In particular, we provide strong canonical PCPs of proximity for membership in any linear code with constant rate and relative distance. Loosely speaking, these are PCPs of proximity wherein the verifier is proximity oblivious (similarly to strong-LTCs), and every valid statement has a unique canonical proof. Furthermore, the verifier is required to reject non-canonical proofs (even for valid statements). As an application, we improve the best known separation result between the complexity of decision and verification in the setting of property testing.

Cite as

Oded Goldreich, Tom Gur, and Ilan Komargodski. Strong Locally Testable Codes with Relaxed Local Decoders. In 30th Conference on Computational Complexity (CCC 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 33, pp. 1-41, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{goldreich_et_al:LIPIcs.CCC.2015.1,
  author =	{Goldreich, Oded and Gur, Tom and Komargodski, Ilan},
  title =	{{Strong Locally Testable Codes with Relaxed Local Decoders}},
  booktitle =	{30th Conference on Computational Complexity (CCC 2015)},
  pages =	{1--41},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-81-1},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{33},
  editor =	{Zuckerman, David},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2015.1},
  URN =		{urn:nbn:de:0030-drops-50507},
  doi =		{10.4230/LIPIcs.CCC.2015.1},
  annote =	{Keywords: Locally Testable Codes, Locally Decodable Codes, PCPs of Proximity}
}
  • Refine by Author
  • 8 Gur, Tom
  • 3 Rothblum, Ron D.
  • 2 Canonne, Clément L.
  • 2 Goldreich, Oded
  • 1 Asadi, Vahid R.
  • Show More...

  • Refine by Classification
  • 2 Theory of computation → Expander graphs and randomness extractors
  • 1 Theory of computation → Error-correcting codes
  • 1 Theory of computation → Interactive computation
  • 1 Theory of computation → Interactive proof systems
  • 1 Theory of computation → Probabilistic computation

  • Refine by Keyword
  • 2 Probabilistically checkable proofs
  • 2 Property Testing
  • 2 property testing
  • 1 Algorithmic coding theory
  • 1 Coding Theory
  • Show More...

  • Refine by Type
  • 12 document

  • Refine by Publication Year
  • 3 2017
  • 3 2018
  • 2 2019
  • 2 2020
  • 1 2015
  • Show More...
