Search Results

Documents authored by Goldreich, Oded


Document
On Interactive Proofs of Proximity with Proof-Oblivious Queries

Authors: Oded Goldreich, Guy N. Rothblum, and Tal Skverer

Published in: LIPIcs, Volume 251, 14th Innovations in Theoretical Computer Science Conference (ITCS 2023)


Abstract
Interactive proofs of proximity (IPPs) offer ultra-fast approximate verification of assertions regarding their input, where ultra-fast means that only a small portion of the input is read and approximate verification is analogous to the notion of approximate decision that underlies property testing. Specifically, in an IPP, the prover can make the verifier accept each input in the property, but cannot fool the verifier into accepting an input that is far from the property (except with small probability). The verifier in an IPP system engages in two very different types of activities: interacting with an untrusted prover, and querying its input. The definition allows for arbitrary coordination between these two activities, but keeping them separate is both conceptually interesting and necessary for important applications such as addressing temporal considerations (i.e., at what time each of the services is available) and facilitating the construction of zero-knowledge schemes.

In this work we embark on a systematic study of IPPs with proof-oblivious queries, where the queries should not be affected by the interaction with the prover. We assign the query and interaction activities to separate modules, and consider different limitations on their coordination. The most strict limitation requires these activities to be totally isolated from one another; they just feed their views to a separate deciding module. We show that such systems can be efficiently emulated by standard testers.

Going to the other extreme, we only disallow information to flow from the interacting module to the querying module, but allow free information flow in the other direction. We show that extremely efficient one-round (i.e., two-message) systems of such type can be used to verify properties that are extremely hard to test (without the help of a prover). That is, the complexity of verifying can be polylogarithmic in the complexity of testing. This stands in contrast to MAPs (viewed as 1/2-round systems), in which proof-oblivious queries are as limited as in our isolated model.

Our focus is on an intermediate model that allows shared randomness between the querying and interacting modules but no information flow between them. In this case we show that 1-round systems are efficiently emulated by standard testers, but 3/2-round systems of extremely low complexity exist for properties that are extremely hard to test. One additional result about this model is that it can efficiently emulate any IPP for any property of low-degree polynomials.
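The following is a minimal Python sketch, for illustration only, of the module structure described above: a querying module that picks its queries without ever seeing the prover's messages, an interacting module that records the (untrusted) prover's claim, and a deciding module that combines the two views. The toy property (cyclic rotations of 0^{n/2}1^{n/2}), the prover's message format, and all parameters are hypothetical choices made for this sketch; they are not taken from the paper.

import random

# Toy property: x is a cyclic rotation of the pattern 0^{n/2} 1^{n/2}.
# The point of the sketch is only the information flow between the modules:
# the queries are chosen independently of the interaction (proof-oblivious).

def querying_module(n, eps, rng):
    """Chooses query positions without access to the prover's messages."""
    num_queries = max(1, int(2 / eps))
    return [rng.randrange(n) for _ in range(num_queries)]

def interacting_module(prover_message):
    """Interacts with the untrusted prover; here it just records the claimed
    rotation s, meaning x[i] = pattern[(i - s) mod n]."""
    return prover_message

def deciding_module(n, positions, answers, claimed_shift):
    """Sees both views and checks the queried bits against the prover's claim."""
    half = n // 2
    for pos, bit in zip(positions, answers):
        expected = 0 if (pos - claimed_shift) % n < half else 1
        if bit != expected:
            return False
    return True

def verify(x, prover_message, eps=0.1, rng=None):
    rng = rng or random.Random(0)
    n = len(x)
    positions = querying_module(n, eps, rng)     # proof-oblivious queries
    answers = [x[p] for p in positions]          # oracle access to the input
    claim = interacting_module(prover_message)   # interaction with the prover
    return deciding_module(n, positions, answers, claim)

# An honest prover announces the true rotation of the pattern.
n = 16
pattern = [0] * (n // 2) + [1] * (n // 2)
x = pattern[5:] + pattern[:5]                    # a left rotation by 5, i.e. s = 11
print(verify(x, prover_message=11))              # True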

Cite as

Oded Goldreich, Guy N. Rothblum, and Tal Skverer. On Interactive Proofs of Proximity with Proof-Oblivious Queries. In 14th Innovations in Theoretical Computer Science Conference (ITCS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 251, pp. 59:1-59:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{goldreich_et_al:LIPIcs.ITCS.2023.59,
  author =	{Goldreich, Oded and Rothblum, Guy N. and Skverer, Tal},
  title =	{{On Interactive Proofs of Proximity with Proof-Oblivious Queries}},
  booktitle =	{14th Innovations in Theoretical Computer Science Conference (ITCS 2023)},
  pages =	{59:1--59:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-263-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{251},
  editor =	{Tauman Kalai, Yael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2023.59},
  URN =		{urn:nbn:de:0030-drops-175625},
  doi =		{10.4230/LIPIcs.ITCS.2023.59},
  annote =	{Keywords: Complexity Theory, Property Testing, Interactive Proofs, Interactive Proofs of Proximity, Proof-Oblivious Queries}
}
Document
Randomness Extraction from Somewhat Dependent Sources

Authors: Marshall Ball, Oded Goldreich, and Tal Malkin

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
We initiate a comprehensive study of the question of randomness extraction from two somewhat dependent sources of defective randomness. Specifically, we present three natural models, which are based on different natural perspectives on the notion of bounded dependency between a pair of distributions. Going from the more restricted model to the less restricted one, our models and main results are as follows.

1) Bounded dependence as bounded coordination: Here we consider pairs of distributions that arise from independent random processes that are applied to the outcome of a single global random source, which may be viewed as a mechanism of coordination (which is adversarial from our perspective). We show that if the min-entropy of each of the two outcomes is larger than the length of the global source, then extraction is possible (and is, in fact, feasible). We stress that the extractor has no access to the global random source nor to the internal randomness that the two processes use, but rather gets only the two dependent outcomes. This model is equivalent to a setting in which the two outcomes are generated by two independent sources, but then each outcome is modified based on limited leakage (equiv., communication) between the two sources. (Here this leakage is measured in terms of the number of bits that were communicated, but in the next model we consider the actual influence of this leakage.)

2) Bounded dependence as bounded cross influence: Here we consider pairs of outcomes that are produced by a pair of sources such that each source has bounded (worst-case) influence on the outcome of the other source. We stress that the extractor has no access to the randomness that the two processes use, but rather gets only the two dependent outcomes. We show that, while (proper) randomness extraction is impossible in this case, randomness condensing is possible and feasible; specifically, the randomness deficiency of condensing is linear in our measure of cross influence, and this upper bound is tight. We also discuss various applications of such condensers, including for cryptography, standard randomized algorithms, and sublinear-time algorithms, while pointing out their benefit over using a seeded (single-source) extractor.

3) Bounded dependence as bounded mutual information: Due to the average-case nature of mutual information, here there is a trade-off between the error (or deviation) probability of the extracted output and its randomness deficiency. Loosely speaking, for joint distributions of mutual information t, we can condense with randomness deficiency O(t/ε) and error ε, and this trade-off is optimal.

All positive results are obtained by using a standard two-source extractor (or condenser) as a black-box.
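As a concrete point of reference for the black-box use of a standard two-source extractor mentioned above, here is a minimal Python sketch that applies the classic inner-product (Hadamard) two-source extractor to a pair of outcomes produced from a short shared global source. The source-pair construction and all parameters are toy choices for illustration; the sketch shows only the interface, not the quantitative guarantees established in the paper.

import random

def inner_product_extractor(x, y):
    """Classic two-source extractor: inner product over GF(2), one output bit."""
    assert len(x) == len(y)
    return sum(a & b for a, b in zip(x, y)) % 2

def coordinated_sources(n, global_len, rng):
    """A toy instance of the 'bounded coordination' model: two processes each
    copy a short shared global string and append fresh private random bits.
    Each outcome is an n-bit string whose min-entropy (here: n) exceeds the
    length of the global source, as required in the first model above."""
    shared = [rng.randrange(2) for _ in range(global_len)]
    x = shared + [rng.randrange(2) for _ in range(n - global_len)]
    y = shared + [rng.randrange(2) for _ in range(n - global_len)]
    return x, y

rng = random.Random(1)
x, y = coordinated_sources(n=64, global_len=8, rng=rng)
print(inner_product_extractor(x, y))   # the extracted bit; the extractor never sees the shared string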

Cite as

Marshall Ball, Oded Goldreich, and Tal Malkin. Randomness Extraction from Somewhat Dependent Sources. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 12:1-12:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{ball_et_al:LIPIcs.ITCS.2022.12,
  author =	{Ball, Marshall and Goldreich, Oded and Malkin, Tal},
  title =	{{Randomness Extraction from Somewhat Dependent Sources}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{12:1--12:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.12},
  URN =		{urn:nbn:de:0030-drops-156081},
  doi =		{10.4230/LIPIcs.ITCS.2022.12},
  annote =	{Keywords: Randomness Extraction, min-entropy, mutual information, two-source extractors, two-source condenser}
}
Document
Testing Distributions of Huge Objects

Authors: Oded Goldreich and Dana Ron

Published in: LIPIcs, Volume 215, 13th Innovations in Theoretical Computer Science Conference (ITCS 2022)


Abstract
We initiate a study of a new model of property testing that is a hybrid of testing properties of distributions and testing properties of strings. Specifically, the new model refers to testing properties of distributions, but these are distributions over huge objects (i.e., very long strings). Accordingly, the model accounts for the total number of local probes into these objects (resp., queries to the strings) as well as for the distance between objects (resp., strings). Specifically, the distance between distributions is defined as the earth mover's distance with respect to the relative Hamming distance between strings. We study the query complexity of testing in this new model, focusing on three directions. First, we try to relate the query complexity of testing properties in the new model to the sample complexity of testing these properties in the standard distribution testing model. Second, we consider the complexity of testing properties that arise naturally in the new model (e.g., distributions that capture random variations of fixed strings). Third, we consider the complexity of testing properties that were extensively studied in the standard distribution testing model. Two such cases are uniform distributions and pairs of identical distributions, where we obtain the following results.

- Testing whether a distribution over n-bit long strings is uniform on some set of size m can be done with query complexity Õ(m/ε³), where ε > (log₂ m)/n is the proximity parameter.
- Testing whether two distributions over n-bit long strings that have support size at most m are identical can be done with query complexity Õ(m^{2/3}/ε³).

Both upper bounds are quite tight; that is, for ε = Ω(1), the first task requires Ω(m^c) queries for any c < 1 and n = ω(log m), whereas the second task requires Ω(m^{2/3}) queries. Note that the query complexity of the first task is higher than the sample complexity of the corresponding task in the standard distribution testing model, whereas in the case of the second task the bounds almost match.
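To make the distance measure above concrete, the following Python sketch computes the earth mover's distance between two small distributions over n-bit strings, with the relative Hamming distance as the ground metric. For simplicity it handles only uniform distributions over equal-size multisets, in which case the optimal transport plan is a minimum-cost perfect matching; the strings are arbitrary toy data.

import numpy as np
from scipy.optimize import linear_sum_assignment

def relative_hamming(u, v):
    """Fraction of positions on which the two equal-length strings disagree."""
    return sum(a != b for a, b in zip(u, v)) / len(u)

def emd_uniform(support_p, support_q):
    """EMD between the uniform distributions on two equal-size multisets of strings."""
    m = len(support_p)
    assert m == len(support_q)
    cost = np.array([[relative_hamming(u, v) for v in support_q] for u in support_p])
    rows, cols = linear_sum_assignment(cost)   # min-cost perfect matching
    return cost[rows, cols].sum() / m

# Two uniform distributions, each over three 8-bit strings.
P = ["00000000", "11110000", "10101010"]
Q = ["00000001", "11111111", "10101010"]
print(emd_uniform(P, Q))   # (1/8 + 4/8 + 0) / 3 = 0.2083...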

Cite as

Oded Goldreich and Dana Ron. Testing Distributions of Huge Objects. In 13th Innovations in Theoretical Computer Science Conference (ITCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 215, pp. 78:1-78:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{goldreich_et_al:LIPIcs.ITCS.2022.78,
  author =	{Goldreich, Oded and Ron, Dana},
  title =	{{Testing Distributions of Huge Objects}},
  booktitle =	{13th Innovations in Theoretical Computer Science Conference (ITCS 2022)},
  pages =	{78:1--78:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-217-4},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{215},
  editor =	{Braverman, Mark},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2022.78},
  URN =		{urn:nbn:de:0030-drops-156747},
  doi =		{10.4230/LIPIcs.ITCS.2022.78},
  annote =	{Keywords: Property Testing, Distributions}
}
Document
Robustly Self-Ordered Graphs: Constructions and Applications to Property Testing

Authors: Oded Goldreich and Avi Wigderson

Published in: LIPIcs, Volume 200, 36th Computational Complexity Conference (CCC 2021)


Abstract
A graph G is called self-ordered (a.k.a. asymmetric) if the identity permutation is its only automorphism. Equivalently, there is a unique isomorphism from G to any graph that is isomorphic to G. We say that G = (V,E) is robustly self-ordered if the size of the symmetric difference between E and the edge-set of the graph obtained by permuting V using any permutation π:V → V is proportional to the number of non-fixed-points of π. In this work, we initiate the study of the structure, construction and utility of robustly self-ordered graphs.

We show that robustly self-ordered bounded-degree graphs exist (in abundance), and that they can be constructed efficiently, in a strong sense. Specifically, given the index of a vertex in such a graph, it is possible to find all its neighbors in polynomial-time (i.e., in time that is poly-logarithmic in the size of the graph). We provide two very different constructions, differing in both tools and structure. The first, a direct construction, is based on proving a sufficient condition for robust self-ordering, which requires that an auxiliary graph is expanding. The second construction is iterative, boosting the property of robust self-ordering from smaller to larger graphs. Structurally, the first construction always yields expanding graphs, while the second construction may produce graphs that have many tiny (sub-logarithmic) connected components.

We also consider graphs of unbounded degree, seeking correspondingly unbounded robustness parameters. We again demonstrate that such graphs (of linear degree) exist (in abundance), and that they can be constructed efficiently, in a strong sense. This turns out to require very different tools. Specifically, we show that the construction of such graphs reduces to the construction of non-malleable two-source extractors (with very weak parameters but with some additional natural features).

We demonstrate that robustly self-ordered bounded-degree graphs are useful towards obtaining lower bounds on the query complexity of testing graph properties both in the bounded-degree and the dense graph models. Indeed, their robustness offers efficient, local and distance preserving reductions from testing problems on ordered structures (like sequences) to the unordered (effectively unlabeled) graphs. One of the results that we obtain, via such a reduction, is a subexponential separation between the query complexities of testing and tolerant testing of graph properties in the bounded-degree graph model.
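The definition above can be checked by brute force on tiny graphs, which may help in parsing it. The Python sketch below computes, for a small graph, the largest constant gamma such that every non-identity permutation pi satisfies |E Δ pi(E)| >= gamma * |{v : pi(v) != v}|; a positive value certifies (gamma-)robust self-ordering, while zero means the graph has a non-trivial automorphism. The 6-vertex example graph is an arbitrary choice.

from itertools import permutations

def permuted_edges(edges, pi):
    """The edge set obtained by relabeling the vertices according to pi."""
    return {frozenset((pi[u], pi[v])) for u, v in edges}

def robustness(n, edges):
    """Largest gamma for which ([n], edges) is gamma-robustly self-ordered
    (0 if some non-identity permutation is an automorphism)."""
    E = {frozenset(e) for e in edges}
    best = float("inf")
    for pi in permutations(range(n)):
        moved = sum(1 for v in range(n) if pi[v] != v)
        if moved == 0:
            continue                      # skip the identity permutation
        diff = len(E ^ permuted_edges(E, pi))
        best = min(best, diff / moved)
    return best

# A path 0-1-2-3-4-5 with the extra edge (2,4), which makes the graph asymmetric.
print(robustness(6, [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (2, 4)]))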

Cite as

Oded Goldreich and Avi Wigderson. Robustly Self-Ordered Graphs: Constructions and Applications to Property Testing. In 36th Computational Complexity Conference (CCC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 200, pp. 12:1-12:74, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{goldreich_et_al:LIPIcs.CCC.2021.12,
  author =	{Goldreich, Oded and Wigderson, Avi},
  title =	{{Robustly Self-Ordered Graphs: Constructions and Applications to Property Testing}},
  booktitle =	{36th Computational Complexity Conference (CCC 2021)},
  pages =	{12:1--12:74},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-193-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{200},
  editor =	{Kabanets, Valentine},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2021.12},
  URN =		{urn:nbn:de:0030-drops-142867},
  doi =		{10.4230/LIPIcs.CCC.2021.12},
  annote =	{Keywords: Asymmetric graphs, expanders, testing graph properties, two-source extractors, non-malleable extractors, coding theory, tolerant testing, random graphs}
}
Document
Communication Complexity with Defective Randomness

Authors: Marshall Ball, Oded Goldreich, and Tal Malkin

Published in: LIPIcs, Volume 200, 36th Computational Complexity Conference (CCC 2021)


Abstract
Starting with the two standard models of randomized communication complexity, we study the communication complexity of functions when the protocol has access to a defective source of randomness. Specifically, we consider both the public-randomness and private-randomness cases, while replacing the commonly postulated perfect randomness with distributions over 𝓁-bit strings that have min-entropy at least k ≤ 𝓁. We present general upper and lower bounds on the communication complexity in these cases, where the bounds are typically linear in 𝓁-k and also depend on the size of the fooling set for the function being computed and on its standard randomized complexity.

Cite as

Marshall Ball, Oded Goldreich, and Tal Malkin. Communication Complexity with Defective Randomness. In 36th Computational Complexity Conference (CCC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 200, pp. 14:1-14:10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{ball_et_al:LIPIcs.CCC.2021.14,
  author =	{Ball, Marshall and Goldreich, Oded and Malkin, Tal},
  title =	{{Communication Complexity with Defective Randomness}},
  booktitle =	{36th Computational Complexity Conference (CCC 2021)},
  pages =	{14:1--14:10},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-193-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{200},
  editor =	{Kabanets, Valentine},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2021.14},
  URN =		{urn:nbn:de:0030-drops-142886},
  doi =		{10.4230/LIPIcs.CCC.2021.14},
  annote =	{Keywords: Randomized Communication Complexity, Randomness Extraction, Min-Entropy}
}
Document
Every Set in P Is Strongly Testable Under a Suitable Encoding

Authors: Irit Dinur, Oded Goldreich, and Tom Gur

Published in: LIPIcs, Volume 124, 10th Innovations in Theoretical Computer Science Conference (ITCS 2019)


Abstract
We show that every set in P is strongly testable under a suitable encoding. By "strongly testable" we mean having a (proximity oblivious) tester that makes a constant number of queries and rejects with probability that is proportional to the distance of the tested object from the property. By a "suitable encoding" we mean one that is polynomial-time computable and invertible. This result stands in contrast to the known fact that some sets in P are extremely hard to test, providing another demonstration of the crucial role of representation in the context of property testing. The testing result is proved by showing that any set in P has a strong canonical PCP, where canonical means that (for yes-instances) there exists a single proof that is accepted with probability 1 by the system, whereas all other potential proofs are rejected with probability proportional to their distance from this proof. In fact, we show that UP equals the class of sets having strong canonical PCPs (of logarithmic randomness), whereas the class of sets having strong canonical PCPs with polynomial proof length equals "unambiguous-MA". Actually, for the testing result, we use a PCP-of-Proximity version of the foregoing notion and an analogous positive result (i.e., strong canonical PCPPs of logarithmic randomness for any set in UP).

Cite as

Irit Dinur, Oded Goldreich, and Tom Gur. Every Set in P Is Strongly Testable Under a Suitable Encoding. In 10th Innovations in Theoretical Computer Science Conference (ITCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 124, pp. 30:1-30:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{dinur_et_al:LIPIcs.ITCS.2019.30,
  author =	{Dinur, Irit and Goldreich, Oded and Gur, Tom},
  title =	{{Every Set in P Is Strongly Testable Under a Suitable Encoding}},
  booktitle =	{10th Innovations in Theoretical Computer Science Conference (ITCS 2019)},
  pages =	{30:1--30:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-095-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{124},
  editor =	{Blum, Avrim},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2019.30},
  URN =		{urn:nbn:de:0030-drops-101234},
  doi =		{10.4230/LIPIcs.ITCS.2019.30},
  annote =	{Keywords: Probabilistically checkable proofs, property testing}
}
Document
The Subgraph Testing Model

Authors: Oded Goldreich and Dana Ron

Published in: LIPIcs, Volume 124, 10th Innovations in Theoretical Computer Science Conference (ITCS 2019)


Abstract
We initiate a study of testing properties of graphs that are presented as subgraphs of a fixed (or an explicitly given) graph. The tester is given free access to a base graph G=([n],E), and oracle access to a function f:E -> {0,1} that represents a subgraph of G. The tester is required to distinguish between subgraphs that possess a predetermined property and subgraphs that are far from possessing this property. We focus on bounded-degree base graphs and on the relation between testing graph properties in the subgraph model and testing the same properties in the bounded-degree graph model. We identify cases in which testing is significantly easier in one model than in the other as well as cases in which testing has approximately the same complexity in both models. Our proofs are based on the design and analysis of efficient testers and on the establishment of query-complexity lower bounds.
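A minimal Python sketch of the access pattern in this model may be helpful: the tester gets the base graph G = ([n], E) explicitly, but reaches the tested subgraph only through an oracle f : E -> {0,1}, and distances are measured as a fraction of |E|. The property tested below (the subgraph has no edges at all) is a toy choice used only to show the interface; it is not one of the properties studied in the paper.

import random

def test_empty_subgraph(base_edges, subgraph_oracle, eps, rng=None):
    """One-sided tester for the toy property 'f is identically 0': always accepts
    the empty subgraph, and rejects with probability at least 2/3 any subgraph
    that contains at least an eps fraction of the base edges."""
    rng = rng or random.Random(0)
    for _ in range(int(2 / eps) + 1):
        e = rng.choice(base_edges)         # the base graph is known for free
        if subgraph_oracle(e) == 1:        # one query to the subgraph oracle
            return False
    return True

# Toy base graph: a 10-cycle; the tested subgraph keeps every third edge.
base_edges = [(i, (i + 1) % 10) for i in range(10)]
present = {e for i, e in enumerate(base_edges) if i % 3 == 0}
print(test_empty_subgraph(base_edges, lambda e: int(e in present), eps=0.3))   # rejects w.h.p.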

Cite as

Oded Goldreich and Dana Ron. The Subgraph Testing Model. In 10th Innovations in Theoretical Computer Science Conference (ITCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 124, pp. 37:1-37:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{goldreich_et_al:LIPIcs.ITCS.2019.37,
  author =	{Goldreich, Oded and Ron, Dana},
  title =	{{The Subgraph Testing Model}},
  booktitle =	{10th Innovations in Theoretical Computer Science Conference (ITCS 2019)},
  pages =	{37:1--37:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-095-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{124},
  editor =	{Blum, Avrim},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2019.37},
  URN =		{urn:nbn:de:0030-drops-101308},
  doi =		{10.4230/LIPIcs.ITCS.2019.37},
  annote =	{Keywords: Property Testing, Graph Properties}
}
Document
Simple Doubly-Efficient Interactive Proof Systems for Locally-Characterizable Sets

Authors: Oded Goldreich and Guy N. Rothblum

Published in: LIPIcs, Volume 94, 9th Innovations in Theoretical Computer Science Conference (ITCS 2018)


Abstract
A proof system is called doubly-efficient if the prescribed prover strategy can be implemented in polynomial-time and the verifier's strategy can be implemented in almost-linear-time. We present direct constructions of doubly-efficient interactive proof systems for problems in P that are believed to have relatively high complexity. Specifically, such constructions are presented for t-CLIQUE and t-SUM. In addition, we present a generic construction of such proof systems for a natural class that contains both problems and is in NC (and also in SC). Our proof systems are significantly simpler than those presented by Goldwasser, Kalai and Rothblum (JACM, 2015), let alone those presented by Reingold, Rothblum, and Rothblum (STOC, 2016), and can be implemented using a smaller number of rounds.

Cite as

Oded Goldreich and Guy N. Rothblum. Simple Doubly-Efficient Interactive Proof Systems for Locally-Characterizable Sets. In 9th Innovations in Theoretical Computer Science Conference (ITCS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 94, pp. 18:1-18:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{goldreich_et_al:LIPIcs.ITCS.2018.18,
  author =	{Goldreich, Oded and Rothblum, Guy N.},
  title =	{{Simple Doubly-Efficient Interactive Proof Systems for Locally-Characterizable Sets}},
  booktitle =	{9th Innovations in Theoretical Computer Science Conference (ITCS 2018)},
  pages =	{18:1--18:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-060-6},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{94},
  editor =	{Karlin, Anna R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2018.18},
  URN =		{urn:nbn:de:0030-drops-83279},
  doi =		{10.4230/LIPIcs.ITCS.2018.18},
  annote =	{Keywords: Interactive proofs}
}
Document
Strong Locally Testable Codes with Relaxed Local Decoders

Authors: Oded Goldreich, Tom Gur, and Ilan Komargodski

Published in: LIPIcs, Volume 33, 30th Conference on Computational Complexity (CCC 2015)


Abstract
Locally testable codes (LTCs) are error-correcting codes that admit very efficient codeword tests. An LTC is said to be strong if it has a proximity-oblivious tester; that is, a tester that makes only a constant number of queries and rejects non-codewords with probability that depends solely on their distance from the code. Locally decodable codes (LDCs) are complementary to LTCs. While the latter allow for highly efficient rejection of strings that are far from being codewords, LDCs allow for highly efficient recovery of individual bits of the information that is encoded in strings that are close to being codewords. While there are known constructions of strong-LTCs with nearly-linear length, the existence of a constant-query LDC with polynomial length is a major open problem. In an attempt to bypass this barrier, Ben-Sasson et al. (SICOMP 2006) introduced a natural relaxation of local decodability, called relaxed-LDCs. This notion requires local recovery of nearly all individual information-bits, yet allows for recovery-failure (but not error) on the rest. Ben-Sasson et al. constructed a constant-query relaxed-LDC with nearly-linear length (i.e., length k^(1 + alpha) for an arbitrarily small constant alpha > 0, where k is the dimension of the code). This work focuses on obtaining strong testability and relaxed decodability simultaneously. We construct a family of binary linear codes of nearly-linear length that are both strong-LTCs (with one-sided error) and constant-query relaxed-LDCs. This improves upon the previously known constructions, which obtain either weak LTCs or require polynomial length. Our construction heavily relies on tensor codes and PCPs. In particular, we provide strong canonical PCPs of proximity for membership in any linear code with constant rate and relative distance. Loosely speaking, these are PCPs of proximity wherein the verifier is proximity oblivious (similarly to strong-LTCs) and every valid statement has a unique canonical proof. Furthermore, the verifier is required to reject non-canonical proofs (even for valid statements). As an application, we improve the best known separation result between the complexity of decision and verification in the setting of property testing.
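To illustrate the notion of a proximity-oblivious tester that underlies the definition of strong LTCs above, here is a toy Python example for the simplest possible code, the repetition code {0^n, 1^n}: the tester reads two uniformly random positions and accepts iff they agree, which rejects a word at relative distance delta from the code with probability 2*delta*(1-delta) >= delta. This is only meant to unpack the definition; the codes constructed in the paper are, of course, far richer.

import random

def repetition_code_test(word, rng):
    """2-query proximity-oblivious test for the code {0^n, 1^n}."""
    i, j = rng.randrange(len(word)), rng.randrange(len(word))
    return word[i] == word[j]              # accept iff the sampled bits agree

def empirical_rejection_rate(word, trials=100_000, seed=0):
    rng = random.Random(seed)
    return sum(not repetition_code_test(word, rng) for _ in range(trials)) / trials

w = [1] * 90 + [0] * 10                    # relative distance 0.1 from the code
print(empirical_rejection_rate(w))         # close to 2 * 0.1 * 0.9 = 0.18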

Cite as

Oded Goldreich, Tom Gur, and Ilan Komargodski. Strong Locally Testable Codes with Relaxed Local Decoders. In 30th Conference on Computational Complexity (CCC 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 33, pp. 1-41, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{goldreich_et_al:LIPIcs.CCC.2015.1,
  author =	{Goldreich, Oded and Gur, Tom and Komargodski, Ilan},
  title =	{{Strong Locally Testable Codes with Relaxed Local Decoders}},
  booktitle =	{30th Conference on Computational Complexity (CCC 2015)},
  pages =	{1--41},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-81-1},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{33},
  editor =	{Zuckerman, David},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2015.1},
  URN =		{urn:nbn:de:0030-drops-50507},
  doi =		{10.4230/LIPIcs.CCC.2015.1},
  annote =	{Keywords: Locally Testable Codes, Locally Decodable Codes, PCPs of Proximity}
}
Document
On Randomness Extraction in AC0

Authors: Oded Goldreich, Emanuele Viola, and Avi Wigderson

Published in: LIPIcs, Volume 33, 30th Conference on Computational Complexity (CCC 2015)


Abstract
We consider randomness extraction by AC0 circuits. The main parameter, n, is the length of the source, and all other parameters are functions of it. The additional extraction parameters are the min-entropy bound k=k(n), the seed length r=r(n), the output length m=m(n), and the (output) deviation bound epsilon=epsilon(n). For k <= n/log^(omega(1))(n), we show that AC0-extraction is possible if and only if m/r <= 1 + poly(log(n)) * k/n; that is, the extraction rate m/r exceeds the trivial rate (of one) by an additive amount that is proportional to the min-entropy rate k/n. In particular, non-trivial AC0-extraction (i.e., m >= r+1) is possible if and only if k * r > n/poly(log(n)). For k >= n/log^(O(1))(n), we show that AC0-extraction of r+Omega(r) bits is possible when r=O(log(n)), but leave open the question of whether more bits can be extracted in this case. The impossibility result is for constant epsilon, and the possibility result supports epsilon=1/poly(n). The impossibility result is for (possibly) non-uniform AC0, whereas the possibility result holds for uniform AC0. All our impossibility results hold even for the model of bit-fixing sources, where k coincides with the number of non-fixed (i.e., random) bits. We also consider deterministic AC0 extraction from various classes of restricted sources. In particular, for any constant delta > 0, we give explicit AC0 extractors for poly(1/delta) independent sources that are each of min-entropy rate delta; and four sources suffice for delta=0.99. Also, we give non-explicit AC0 extractors for bit-fixing sources of entropy rate 1/poly(log(n)) (i.e., having n/poly(log(n)) unfixed bits). This shows that the known analysis of the "restriction method" (for making a circuit constant by fixing as few variables as possible) is tight for AC0 even if the restriction is picked deterministically depending on the circuit.

Cite as

Oded Goldreich, Emanuele Viola, and Avi Wigderson. On Randomness Extraction in AC0. In 30th Conference on Computational Complexity (CCC 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 33, pp. 601-668, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{goldreich_et_al:LIPIcs.CCC.2015.601,
  author =	{Goldreich, Oded and Viola, Emanuele and Wigderson, Avi},
  title =	{{On Randomness Extraction in AC0}},
  booktitle =	{30th Conference on Computational Complexity (CCC 2015)},
  pages =	{601--668},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-81-1},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{33},
  editor =	{Zuckerman, David},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2015.601},
  URN =		{urn:nbn:de:0030-drops-50515},
  doi =		{10.4230/LIPIcs.CCC.2015.601},
  annote =	{Keywords: AC0, randomness extractors, general min-entropy sources, block sources, bit-fixing sources, multiple-source extraction}
}
Document
On Multiple Input Problems in Property Testing

Authors: Oded Goldreich

Published in: LIPIcs, Volume 28, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2014)


Abstract
We consider three types of multiple input problems in the context of property testing. Specifically, for a property Pi (of n-bit long strings), a proximity parameter epsilon, and an integer m, we consider the following problems:

(1) Direct m-Sum Problem for Pi and epsilon: Given a sequence of m inputs, output a sequence of m bits such that for each i in [m] the i-th bit satisfies the requirements from an epsilon-tester for Pi regarding the i-th input; that is, for each i, the i-th output bit should be 1 (w.p. at least 2/3) if the i-th input is in Pi, and should be 0 (w.p. at least 2/3) if the i-th input is epsilon-far from Pi.

(2) Direct m-Product Problem for Pi and epsilon: Given a sequence of m inputs, output 1 (w.p. at least 2/3) if all inputs are in Pi, and output 0 (w.p. at least 2/3) if at least one of the inputs is epsilon-far from Pi.

(3) The m-Concatenation Problem for Pi and epsilon: Here one is required to epsilon-test the m-product of Pi; that is, the property that consists of the m-wise Cartesian product of Pi.

We show that the query complexity of the first two problems is Theta(m) times the query complexity of epsilon-testing Pi, whereas (except in pathological cases) the query complexity of the third problem is almost of the same order of magnitude as the query complexity of the problem of epsilon-testing Pi. All upper bounds are shown via efficient reductions. We also consider the nonadaptive and one-sided error versions of these problems. The only significant deviation from the picture in the general (adaptive and two-sided error) model is that the one-sided error query complexity of the Direct Product Problem equals Theta(m) times the (two-sided error) query complexity of epsilon-testing Pi plus Theta(1) times the one-sided error query complexity of epsilon-testing Pi.
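For intuition about the upper bounds, the straightforward reduction for the Direct m-Product Problem runs the given epsilon-tester O(log m) times on each input, takes a majority vote per input, and accepts iff all m votes accept; this costs O(m log m) times the base query complexity, so it falls short of the tight Theta(m) bound stated above and should be read only as the naive baseline, not the paper's reduction. The Python sketch below instantiates it with a toy base tester (for the all-zeros property) chosen purely for illustration.

import math
import random

def base_tester(x, eps, rng):
    """Toy one-sided eps-tester for the property {0^n}: samples O(1/eps) positions."""
    for _ in range(int(2 / eps) + 1):
        if x[rng.randrange(len(x))] == 1:
            return False
    return True

def amplified_tester(x, eps, error, rng):
    """Majority vote over enough independent runs to push the error below `error`."""
    runs = 2 * math.ceil(9 * math.log(1 / error)) + 1
    accepts = sum(base_tester(x, eps, rng) for _ in range(runs))
    return accepts > runs // 2

def direct_product_tester(inputs, eps, rng=None):
    """Outputs True iff every input passes its amplified test (error < 1/(3m) each)."""
    rng = rng or random.Random(0)
    m = len(inputs)
    return all(amplified_tester(x, eps, error=1 / (3 * m), rng=rng) for x in inputs)

inputs = [[0] * 50 for _ in range(4)]
inputs[2] = [1] * 50                       # one input is far from the all-zeros property
print(direct_product_tester(inputs, eps=0.2))   # False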

Cite as

Oded Goldreich. On Multiple Input Problems in Property Testing. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2014). Leibniz International Proceedings in Informatics (LIPIcs), Volume 28, pp. 704-720, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)


BibTeX

@InProceedings{goldreich:LIPIcs.APPROX-RANDOM.2014.704,
  author =	{Goldreich, Oded},
  title =	{{On Multiple Input Problems in Property Testing}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2014)},
  pages =	{704--720},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-74-3},
  ISSN =	{1868-8969},
  year =	{2014},
  volume =	{28},
  editor =	{Jansen, Klaus and Rolim, Jos\'{e} and Devanur, Nikhil R. and Moore, Cristopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX-RANDOM.2014.704},
  URN =		{urn:nbn:de:0030-drops-47336},
  doi =		{10.4230/LIPIcs.APPROX-RANDOM.2014.704},
  annote =	{Keywords: Property Testing, Direct Sum Theorems, Direct Product Theorems, Adaptive vs Nonadaptive queries, One-Sided Error vs Two-Sided Error}
}
Document
Approximating Average Parameters of Graphs

Authors: Oded Goldreich and Dana Ron

Published in: Dagstuhl Seminar Proceedings, Volume 5291, Sublinear Algorithms (2006)


Abstract
Inspired by Feige (36th STOC, 2004), we initiate a study of sublinear randomized algorithms for approximating average parameters of a graph. Specifically, we consider the average degree of a graph and the average distance between pairs of vertices in a graph. Since our focus is on sublinear algorithms, these algorithms access the input graph via queries to an adequate oracle. We consider two types of queries. The first type is standard neighborhood queries (i.e., what is the i'th neighbor of vertex v?), whereas the second type consists of queries regarding the quantities whose average we seek (i.e., what is the degree of vertex v?, and what is the distance between u and v, respectively). Loosely speaking, our results indicate a difference between the two problems: For approximating the average degree, the standard neighbor queries suffice and in fact are preferable to degree queries. In contrast, for approximating average distances, the standard neighbor queries are of little help whereas distance queries are crucial.

Cite as

Oded Goldreich and Dana Ron. Approximating Average Parameters of Graphs. In Sublinear Algorithms. Dagstuhl Seminar Proceedings, Volume 5291, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{goldreich_et_al:DagSemProc.05291.2,
  author =	{Goldreich, Oded and Ron, Dana},
  title =	{{Approximating Average Parameters of Graphs}},
  booktitle =	{Sublinear Algorithms},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5291},
  editor =	{Czumaj, Artur and Muthukrishnan, S. Muthu and Rubinfeld, Ronitt and Sohler, Christian},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.05291.2},
  URN =		{urn:nbn:de:0030-drops-5531},
  doi =		{10.4230/DagSemProc.05291.2},
  annote =	{Keywords: Graph parameters, degree, distance}
}
Document
Contemplations on Testing Graph Properties

Authors: Oded Goldreich

Published in: Dagstuhl Seminar Proceedings, Volume 5291, Sublinear Algorithms (2006)


Abstract
This note documents two programmatic comments regarding testing graph properties, which I made during the workshop. The first comment advocates paying more attention to the dependence of the tester's complexity on the proximity parameter. The second comment advocates paying more attention to the question of testing general graphs (rather than dense or bounded-degree ones). In addition, this note includes a suggestion to view property testing within the framework of promise problems.

Cite as

Oded Goldreich. Contemplations on Testing Graph Properties. In Sublinear Algorithms. Dagstuhl Seminar Proceedings, Volume 5291, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006)


BibTeX

@InProceedings{goldreich:DagSemProc.05291.3,
  author =	{Goldreich, Oded},
  title =	{{Contemplations on Testing Graph Properties}},
  booktitle =	{Sublinear Algorithms},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2006},
  volume =	{5291},
  editor =	{Czumaj, Artur and Muthukrishnan, S. Muthu and Rubinfeld, Ronitt and Sohler, Christian},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagSemProc.05291.3},
  URN =		{urn:nbn:de:0030-drops-5552},
  doi =		{10.4230/DagSemProc.05291.3},
  annote =	{Keywords: Property testing, graph properties}
}