8 Search Results for "Lachish, Oded"


Document
Asymptotically-Good RLCCs with (log n)^(2+o(1)) Queries

Authors: Gil Cohen and Tal Yankovitz

Published in: LIPIcs, Volume 300, 39th Computational Complexity Conference (CCC 2024)


Abstract
Recently, Kumar and Mon reached a significant milestone by constructing asymptotically good relaxed locally correctable codes (RLCCs) with poly-logarithmic query complexity. Specifically, they constructed n-bit RLCCs with O(log^{69} n) queries. Their construction relies on a clever reduction to locally testable codes (LTCs), capitalizing on recent breakthrough works in LTCs. As for lower bounds, Gur and Lachish (SICOMP 2021) proved that any asymptotically-good RLCC must make Ω̃(√{log n}) queries. Hence emerges the intriguing question regarding the identity of the least value 1/2 ≤ e ≤ 69 for which asymptotically-good RLCCs with query complexity (log n)^{e+o(1)} exist. In this work, we make substantial progress in narrowing the gap by devising asymptotically-good RLCCs with a query complexity of (log n)^{2+o(1)}. The key insight driving our work lies in recognizing that the strong guarantee of local testability overshoots the requirements for the Kumar-Mon reduction. In particular, we prove that we can replace the LTCs by "vanilla" expander codes which indeed have the necessary property: local testability in the code’s vicinity.
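
A minimal sketch of the access pattern a relaxed local corrector exposes: it reads only a few positions of a possibly corrupted word and outputs either the requested bit or a rejection symbol. The Python fragment below is an illustration of this interface only, built around a toy repetition-style code with hypothetical names; it is not the paper's construction, whose codes are asymptotically good with (log n)^{2+o(1)} queries.

import random

REJECT = None  # the "rejection" symbol a relaxed corrector may output

def relaxed_correct(oracle, k, i, queries=5):
    """Toy relaxed local corrector for a code that repeats each message bit k
    times. Recover codeword bit i by querying a few random copies of the same
    message bit; output the bit on a consistent view, otherwise reject."""
    block = i // k                                   # message bit encoded at position i
    positions = [block * k + random.randrange(k) for _ in range(queries)]
    votes = [oracle(p) for p in positions]
    if all(v == votes[0] for v in votes):
        return votes[0]                              # consistent view: output the bit
    return REJECT                                    # inconsistent view: safely reject

# Example: on an uncorrupted codeword (000 111 000) every bit is decoded correctly.
word = [0, 0, 0, 1, 1, 1, 0, 0, 0]
print(relaxed_correct(lambda p: word[p], k=3, i=4))  # 1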

Cite as

Gil Cohen and Tal Yankovitz. Asymptotically-Good RLCCs with (log n)^(2+o(1)) Queries. In 39th Computational Complexity Conference (CCC 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 300, pp. 8:1-8:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX


@InProceedings{cohen_et_al:LIPIcs.CCC.2024.8,
  author =	{Cohen, Gil and Yankovitz, Tal},
  title =	{{Asymptotically-Good RLCCs with (log n)^(2+o(1)) Queries}},
  booktitle =	{39th Computational Complexity Conference (CCC 2024)},
  pages =	{8:1--8:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-331-7},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{300},
  editor =	{Santhanam, Rahul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2024.8},
  URN =		{urn:nbn:de:0030-drops-204045},
  doi =		{10.4230/LIPIcs.CCC.2024.8},
  annote =	{Keywords: Relaxed locally decodable codes, Relaxed locally correctable codes, RLCC, RLDC}
}
Document
Distribution-Free Proofs of Proximity

Authors: Hugo Aaronson, Tom Gur, Ninad Rajgopal, and Ron D. Rothblum

Published in: LIPIcs, Volume 300, 39th Computational Complexity Conference (CCC 2024)


Abstract
Motivated by the fact that input distributions are often unknown in advance, distribution-free property testing considers a setting in which the algorithmic task is to accept functions f: [n] → {0,1} having a certain property Π and reject functions that are ε-far from Π, where the distance is measured according to an arbitrary and unknown input distribution 𝒟 over [n]. As usual in property testing, the tester is required to do so while making only a sublinear number of input queries, but as the distribution is unknown, we also allow a sublinear number of samples from the distribution 𝒟. In this work we initiate the study of distribution-free interactive proofs of proximity (df-IPPs), in which the distribution-free testing algorithm is assisted by an all-powerful but untrusted prover. Our main result is that for any problem Π ∈ NC, any proximity parameter ε > 0, and any (trade-off) parameter τ ≤ √n, we construct a df-IPP for Π with respect to ε that has query and sample complexities τ+O(1/ε), and communication complexity Õ(n/τ + 1/ε). For τ as above and sufficiently large ε (namely, when ε > τ/n), this result matches the parameters of the best-known general purpose IPPs in the standard uniform setting. Moreover, for such τ, its parameters are optimal up to poly-logarithmic factors under reasonable cryptographic assumptions for the same regime of ε as the uniform setting, i.e., when ε ≥ 1/τ. For smaller values of ε (i.e., when ε < τ/n), our protocol has communication complexity Ω(1/ε), which is worse than the Õ(n/τ) communication complexity of the uniform IPPs (with the same query complexity). With the aim of improving on this gap, we further show that for IPPs over specialised but large distribution families, such as sufficiently smooth distributions and product distributions, the communication complexity can be reduced to Õ(n/τ^{1-o(1)}). In addition, we show that for certain natural families of languages, such as symmetric and (relaxed) self-correctable languages, it is possible to further improve the efficiency of distribution-free IPPs.
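
One building block of this setting can be sketched directly: with sample access to the unknown distribution 𝒟 and query access to f, a verifier can estimate the distance under 𝒟 between f and any prover-supplied candidate g using O(1/ε) samples. The Python fragment below is only an illustration of this distribution-free distance check, with hypothetical names; it is not the paper's protocol.

import random

def estimate_distance(f_query, sample_D, g, eps):
    """Estimate Pr_{x ~ D}[f(x) != g(x)] with O(1/eps) samples from D and the
    same number of queries to f."""
    m = max(1, int(10 / eps))         # enough samples to resolve distance ~ eps
    disagreements = 0
    for _ in range(m):
        x = sample_D()                # one sample from the unknown distribution D
        if f_query(x) != g(x):        # one query to the input function f
            disagreements += 1
    return disagreements / m

# Example with a uniform distribution standing in for the unknown D.
n = 1000
f = lambda x: x % 2
g = lambda x: 0
print(estimate_distance(f, lambda: random.randrange(n), g, eps=0.1))  # roughly 0.5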

Cite as

Hugo Aaronson, Tom Gur, Ninad Rajgopal, and Ron D. Rothblum. Distribution-Free Proofs of Proximity. In 39th Computational Complexity Conference (CCC 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 300, pp. 24:1-24:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{aaronson_et_al:LIPIcs.CCC.2024.24,
  author =	{Aaronson, Hugo and Gur, Tom and Rajgopal, Ninad and Rothblum, Ron D.},
  title =	{{Distribution-Free Proofs of Proximity}},
  booktitle =	{39th Computational Complexity Conference (CCC 2024)},
  pages =	{24:1--24:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-331-7},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{300},
  editor =	{Santhanam, Rahul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2024.24},
  URN =		{urn:nbn:de:0030-drops-204204},
  doi =		{10.4230/LIPIcs.CCC.2024.24},
  annote =	{Keywords: Property Testing, Interactive Proofs, Distribution-Free Property Testing}
}
Document
Track A: Algorithms, Complexity and Games
Linear Relaxed Locally Decodable and Correctable Codes Do Not Need Adaptivity and Two-Sided Error

Authors: Guy Goldberg

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
Relaxed locally decodable codes (RLDCs) are error-correcting codes in which individual bits of the message can be recovered by querying only a few bits from a noisy codeword. For uncorrupted codewords, and for every bit, the decoder must decode the bit correctly with high probability. However, for a noisy codeword, a relaxed local decoder is allowed to output a "rejection" symbol, indicating that the decoding failed. We study the power of adaptivity and two-sided error for RLDCs. Our main result is that if the underlying code is linear, adaptivity and two-sided error do not give any power to relaxed local decoding. We construct a reduction from adaptive, two-sided error relaxed local decoders to non-adaptive, one-sided error ones. That is, the reduction produces a relaxed local decoder that never errs or rejects if its input is a valid codeword and makes queries based on its internal randomness (and the requested index to decode), independently of the input. The reduction essentially maintains the query complexity, requiring at most one additional query. For any input, the decoder’s error probability increases at most two-fold. Furthermore, assuming the underlying code is in systematic form, where the original message is embedded as the first bits of its encoding, the reduction also conserves both the code itself and its rate and distance properties. We base the reduction on our new notion of additive promise problems. A promise problem is additive if the sum of any two YES-instances is a YES-instance and the sum of any NO-instance and a YES-instance is a NO-instance. This novel framework captures both linear RLDCs and property testing (of linear properties), despite their significant differences. We prove that in general, algorithms for any additive promise problem do not gain power from adaptivity or two-sided error, and obtain the result for RLDCs as a special case. The result also holds for relaxed locally correctable codes (RLCCs), where a codeword bit should be recovered. As an application, we improve the best known lower bound for linear adaptive RLDCs. Specifically, we prove that such codes require block length of n ≥ k^{1+Ω(1/q²)}, where k denotes the message length and q denotes the number of queries.
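
The additivity notion at the heart of the reduction can be checked concretely for a toy binary linear code: taking the YES-instances to be the codewords of a parity-check matrix H over GF(2), the sum (XOR) of two YES-instances is again a YES-instance, and the sum of a YES-instance and a NO-instance is a NO-instance. The Python sketch below, with a hypothetical H, exhaustively verifies this on 4-bit words; it illustrates the definition only, not the paper's reduction.

import itertools

H = [[1, 1, 0, 1],     # hypothetical parity-check matrix over GF(2)
     [0, 1, 1, 1]]

def is_codeword(x):
    """YES-instance test: x is a codeword iff H x = 0 over GF(2)."""
    return all(sum(h * xi for h, xi in zip(row, x)) % 2 == 0 for row in H)

def xor(x, y):
    return tuple(a ^ b for a, b in zip(x, y))

words = list(itertools.product([0, 1], repeat=4))
for x, y in itertools.product(words, repeat=2):
    if is_codeword(x) and is_codeword(y):
        assert is_codeword(xor(x, y))        # YES + YES is a YES-instance
    if is_codeword(x) and not is_codeword(y):
        assert not is_codeword(xor(x, y))    # YES + NO is a NO-instance
print("additivity verified on all 4-bit words")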

Cite as

Guy Goldberg. Linear Relaxed Locally Decodable and Correctable Codes Do Not Need Adaptivity and Two-Sided Error. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 74:1-74:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{goldberg:LIPIcs.ICALP.2024.74,
  author =	{Goldberg, Guy},
  title =	{{Linear Relaxed Locally Decodable and Correctable Codes Do Not Need Adaptivity and Two-Sided Error}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{74:1--74:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.74},
  URN =		{urn:nbn:de:0030-drops-202174},
  doi =		{10.4230/LIPIcs.ICALP.2024.74},
  annote =	{Keywords: Locally decodable codes, Relaxed locally correctable codes, Relaxed locally decodable codes}
}
Document
Track B: Automata, Logic, Semantics, and Theory of Programming
On the Length of Strongly Monotone Descending Chains over ℕ^d

Authors: Sylvain Schmitz and Lia Schütze

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
A recent breakthrough by Künnemann, Mazowiecki, Schütze, Sinclair-Banks, and Węgrzycki (ICALP 2023) bounds the running time for the coverability problem in d-dimensional vector addition systems under unary encoding to n^{2^{O(d)}}, improving on Rackoff’s n^{2^{O(d lg d)}} upper bound (Theor. Comput. Sci. 1978), and provides conditional matching lower bounds. In this paper, we revisit Lazić and Schmitz' "ideal view" of the backward coverability algorithm (Inform. Comput. 2021) in the light of this breakthrough. We show that the controlled strongly monotone descending chains of downwards-closed sets over ℕ^d that arise from the dual backward coverability algorithm of Lazić and Schmitz on d-dimensional unary vector addition systems also enjoy this tight n^{2^{O(d)}} upper bound on their length, and that this also translates into the same bound on the running time of the backward coverability algorithm. Furthermore, our analysis takes place in a more general setting than that of Lazić and Schmitz, which allows us to show the same results and improve on the 2EXPSPACE upper bound derived by Benedikt, Duff, Sharad, and Worrell (LICS 2017) for the coverability problem in invertible affine nets.
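
For orientation, the object in question comes from the classical backward coverability procedure for plain vector addition systems: one iterates a predecessor step on bases of upward-closed sets of configurations until a fixpoint, and the complements of these sets form the descending chain of downwards-closed sets whose length is being bounded. The Python sketch below implements this textbook procedure on a toy instance; the names and the example are hypothetical, and nothing here reflects the paper's refined analysis.

def minimal(vectors):
    """Keep only the componentwise-minimal vectors."""
    return [b for b in vectors
            if not any(c != b and all(ci <= bi for ci, bi in zip(c, b))
                       for c in vectors)]

def backward_coverability(init, target, transitions):
    """Decide coverability of `target` from `init` in the VAS whose transitions
    are the integer vectors in `transitions`, via a backward fixpoint over
    bases of upward-closed sets of configurations."""
    basis = [tuple(target)]
    while True:
        candidates = set(basis)
        for m in basis:
            for t in transitions:
                # minimal configuration that can fire t and then dominate m
                candidates.add(tuple(max(0, mi - ti) for mi, ti in zip(m, t)))
        new_basis = minimal(list(candidates))
        if set(new_basis) == set(basis):
            return any(all(bi <= ii for bi, ii in zip(b, init)) for b in basis)
        basis = new_basis

# Toy 2-dimensional instance: one transition consumes a token in place 1
# and produces two tokens in place 2.
print(backward_coverability(init=(3, 0), target=(0, 4), transitions=[(-1, 2)]))  # True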

Cite as

Sylvain Schmitz and Lia Schütze. On the Length of Strongly Monotone Descending Chains over ℕ^d. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 153:1-153:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{schmitz_et_al:LIPIcs.ICALP.2024.153,
  author =	{Schmitz, Sylvain and Sch\"{u}tze, Lia},
  title =	{{On the Length of Strongly Monotone Descending Chains over \mathbb{N}^d}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{153:1--153:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.153},
  URN =		{urn:nbn:de:0030-drops-202964},
  doi =		{10.4230/LIPIcs.ICALP.2024.153},
  annote =	{Keywords: Vector addition system, coverability, well-quasi-order, order ideal, affine net}
}
Document
When You Come at the King You Best Not Miss

Authors: Oded Lachish, Felix Reidl, and Chhaya Trehan

Published in: LIPIcs, Volume 250, 42nd IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2022)


Abstract
A tournament is an orientation of a complete graph. We say that a vertex x in a tournament T controls another vertex y if there exists a directed path of length at most two from x to y. A vertex is called a king if it controls every vertex of the tournament. It is well known that every tournament has a king. We follow Shen, Sheng, and Wu [Jian Shen et al., 2003] in investigating the query complexity of finding a king, that is, the number of arcs in T one has to know in order to surely identify at least one vertex as a king. The aforementioned authors showed that one always has to query at least Ω(n^{4/3}) arcs and provided a strategy that queries at most O(n^{3/2}) arcs. While this upper bound has not yet been improved for the original problem, [Biswas et al., 2017] proved that with O(n^{4/3}) queries one can identify a semi-king, meaning a vertex which controls at least half of all vertices. Our contribution is a novel strategy which improves upon the number of controlled vertices: using O(n^{4/3} polylog n) queries, we can identify a (1/2+2/17)-king. To achieve this goal we use a novel structural result for tournaments.
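
For contrast with the query-efficient strategies studied here, the classical existence argument is already algorithmic: a vertex of maximum out-degree is a king, and it can be found by querying every arc. The Python sketch below (hypothetical names) makes this Θ(n²)-query baseline explicit; the point of the results above is to certify a (semi-)king with far fewer arc queries.

def find_king_naive(n, beats):
    """beats(x, y) -> True iff the arc between vertices x and y is oriented x -> y.
    Queries every arc and returns a vertex of maximum out-degree, which is a king."""
    out_degree = [0] * n
    for x in range(n):
        for y in range(x + 1, n):
            if beats(x, y):            # one arc query
                out_degree[x] += 1
            else:
                out_degree[y] += 1
    return max(range(n), key=lambda v: out_degree[v])

# Example: the transitive tournament on 4 vertices, where vertex 0 beats everyone.
print(find_king_naive(4, lambda x, y: x < y))  # 0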

Cite as

Oded Lachish, Felix Reidl, and Chhaya Trehan. When You Come at the King You Best Not Miss. In 42nd IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 250, pp. 25:1-25:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{lachish_et_al:LIPIcs.FSTTCS.2022.25,
  author =	{Lachish, Oded and Reidl, Felix and Trehan, Chhaya},
  title =	{{When You Come at the King You Best Not Miss}},
  booktitle =	{42nd IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2022)},
  pages =	{25:1--25:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-261-7},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{250},
  editor =	{Dawar, Anuj and Guruswami, Venkatesan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2022.25},
  URN =		{urn:nbn:de:0030-drops-174177},
  doi =		{10.4230/LIPIcs.FSTTCS.2022.25},
  annote =	{Keywords: Digraphs, tournaments, kings, query complexity}
}
Document
Improving and Extending the Testing of Distributions for Shape-Restricted Properties

Authors: Eldar Fischer, Oded Lachish, and Yadu Vasudev

Published in: LIPIcs, Volume 66, 34th Symposium on Theoretical Aspects of Computer Science (STACS 2017)


Abstract
Distribution testing deals with what information can be deduced about an unknown distribution over {1,...,n}, where the algorithm is only allowed to obtain a relatively small number of independent samples from the distribution. In the extended conditional sampling model, the algorithm is also allowed to obtain samples from the restriction of the original distribution on subsets of {1,...,n}. In 2015, Canonne, Diakonikolas, Gouleakis and Rubinfeld unified several previous results, and showed that for any property of distributions satisfying a "decomposability" criterion, there exists an algorithm (in the basic model) that can distinguish with high probability distributions satisfying the property from distributions that are far from it in variation distance. We present here a more efficient yet simpler algorithm for the basic model, as well as very efficient algorithms for the conditional model, which until now was not investigated under the umbrella of decomposable properties. Additionally, we provide an algorithm for the conditional model that handles a much larger class of properties. Our core mechanism is a way of efficiently producing an interval-partition of {1,...,n} that satisfies a "fine-grain" quality. We show that with such a partition at hand we can directly move forward with testing individual intervals, instead of first searching for the "correct" partition of {1,...,n}.
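
The flavour of such interval partitions can be conveyed by a simplified stand-in: given samples from the unknown distribution, cut {1,...,n} into consecutive intervals, starting a new interval whenever the accumulated empirical mass exceeds a threshold. The Python sketch below (hypothetical names, no sample-complexity guarantees) shows only this greedy version; the paper's "fine-grain" construction is more refined.

from collections import Counter

def interval_partition(samples, n, mass):
    """Cut {1,...,n} into consecutive intervals (lo, hi), closing the current
    interval as soon as adding the next point would push its empirical mass
    above `mass` (heavy points remain as heavy singleton intervals)."""
    counts = Counter(samples)
    total = len(samples)
    intervals, lo, acc = [], 1, 0.0
    for i in range(1, n + 1):
        acc += counts[i] / total
        if acc > mass and i > lo:
            intervals.append((lo, i - 1))
            lo, acc = i, counts[i] / total
    intervals.append((lo, n))
    return intervals

print(interval_partition([1, 1, 2, 5, 9], n=10, mass=0.3))  # [(1, 1), (2, 4), (5, 8), (9, 10)]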

Cite as

Eldar Fischer, Oded Lachish, and Yadu Vasudev. Improving and Extending the Testing of Distributions for Shape-Restricted Properties. In 34th Symposium on Theoretical Aspects of Computer Science (STACS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 66, pp. 31:1-31:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{fischer_et_al:LIPIcs.STACS.2017.31,
  author =	{Fischer, Eldar and Lachish, Oded and Vasudev, Yadu},
  title =	{{Improving and Extending the Testing of Distributions for Shape-Restricted Properties}},
  booktitle =	{34th Symposium on Theoretical Aspects of Computer Science (STACS 2017)},
  pages =	{31:1--31:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-028-6},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{66},
  editor =	{Vollmer, Heribert and Vall\'{e}e, Brigitte},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2017.31},
  URN =		{urn:nbn:de:0030-drops-70024},
  doi =		{10.4230/LIPIcs.STACS.2017.31},
  annote =	{Keywords: conditional sampling, distribution testing, property testing, statistics}
}
Document
Two-phase Algorithms for the Parametric Shortest Path Problem

Authors: Sourav Chakraborty, Eldar Fischer, Oded Lachish, and Raphael Yuster

Published in: LIPIcs, Volume 5, 27th International Symposium on Theoretical Aspects of Computer Science (2010)


Abstract
A parametric weighted graph is a graph whose edges are labeled with continuous real functions of a single common variable. For any instantiation of the variable, one obtains a standard edge-weighted graph. Parametric weighted graph problems are generalizations of weighted graph problems, and arise in various natural scenarios. Parametric weighted graph algorithms consist of two phases. A preprocessing phase whose input is a parametric weighted graph, and whose output is a data structure, the advice, that is later used by the instantiation phase, where a specific value for the variable is given. The instantiation phase outputs the solution to the (standard) weighted graph problem that arises from the instantiation. The goal is to have the running time of the instantiation phase supersede the running time of any algorithm that solves the weighted graph problem from scratch, by taking advantage of the advice. In this paper we construct several parametric algorithms for the shortest path problem. For the case of linear function weights we present an algorithm for the single source shortest path problem. Its preprocessing phase runs in $\tilde{O}(V^4)$ time, while its instantiation phase runs in only $O(E+V \log V)$ time. The fastest standard algorithm for single source shortest path runs in $O(VE)$ time. For the case of weight functions defined by degree $d$ polynomials, we present an algorithm with quasi-polynomial preprocessing time $O(V^{(1 + \log f(d))\log V})$ and instantiation time only $\tilde{O}(V)$. In fact, for any pair of vertices $u,v$, the instantiation phase computes the distance from $u$ to $v$ in only $O(\log^2 V)$ time. Finally, for linear function weights, we present a randomized algorithm whose preprocessing time is $\tilde{O}(V^{3.5})$ and so that for any pair of vertices $u,v$ and any instantiation variable, the instantiation phase computes, in $O(1)$ time, a length of a path from $u$ to $v$ that is at most (additively) $\epsilon$ larger than the length of a shortest path. In particular, an all-pairs shortest path solution, up to an additive constant error, can be computed in $O(V^2)$ time.
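
As a baseline for what the two-phase algorithms improve on, consider doing all of the work at instantiation time: plug the given value of the variable into every linear edge weight and run a standard shortest-path computation from scratch. The Python sketch below (hypothetical names; weights assumed non-negative at the chosen value) implements exactly this naive baseline, which is the cost the preprocessing phase is designed to beat.

import heapq

def instantiate_and_shortest_paths(graph, source, lam):
    """graph: {u: [(v, a, b), ...]} with edge weight a + b*lam (assumed >= 0 at lam).
    Instantiates the weights at lam and runs Dijkstra from `source` from scratch."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, a, b in graph.get(u, []):
            w = a + b * lam
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

# Example: two parallel routes whose relative order flips with the parameter.
g = {"s": [("u", 1.0, 2.0), ("v", 3.0, 0.0)], "u": [("t", 1.0, 0.0)], "v": [("t", 1.0, 0.0)]}
print(instantiate_and_shortest_paths(g, "s", lam=0.0)["t"])  # 2.0, via u
print(instantiate_and_shortest_paths(g, "s", lam=2.0)["t"])  # 4.0, via v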

Cite as

Sourav Chakraborty, Eldar Fischer, Oded Lachish, and Raphael Yuster. Two-phase Algorithms for the Parametric Shortest Path Problem. In 27th International Symposium on Theoretical Aspects of Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 5, pp. 167-178, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{chakraborty_et_al:LIPIcs.STACS.2010.2452,
  author =	{Chakraborty, Sourav and Fischer, Eldar and Lachish, Oded and Yuster, Raphael},
  title =	{{Two-phase Algorithms for the Parametric Shortest Path Problem}},
  booktitle =	{27th International Symposium on Theoretical Aspects of Computer Science},
  pages =	{167--178},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-16-3},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{5},
  editor =	{Marion, Jean-Yves and Schwentick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2010.2452},
  URN =		{urn:nbn:de:0030-drops-24523},
  doi =		{10.4230/LIPIcs.STACS.2010.2452},
  annote =	{Keywords: Parametric Algorithms, Shortest path problem}
}
Document
The Covering and Boundedness Problems for Branching Vector Addition Systems

Authors: Stéphane Demri, Marcin Jurdzinski, Oded Lachish, and Ranko Lazic

Published in: LIPIcs, Volume 4, IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (2009)


Abstract
The covering and boundedness problems for branching vector addition systems are shown complete for doubly-exponential time.

Cite as

Stéphane Demri, Marcin Jurdzinski, Oded Lachish, and Ranko Lazic. The Covering and Boundedness Problems for Branching Vector Addition Systems. In IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 4, pp. 181-192, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)


BibTeX

@InProceedings{demri_et_al:LIPIcs.FSTTCS.2009.2317,
  author =	{Demri, St\'{e}phane and Jurdzinski, Marcin and Lachish, Oded and Lazic, Ranko},
  title =	{{The Covering and Boundedness Problems for Branching Vector Addition Systems}},
  booktitle =	{IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science},
  pages =	{181--192},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-13-2},
  ISSN =	{1868-8969},
  year =	{2009},
  volume =	{4},
  editor =	{Kannan, Ravi and Narayan Kumar, K.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2009.2317},
  URN =		{urn:nbn:de:0030-drops-23173},
  doi =		{10.4230/LIPIcs.FSTTCS.2009.2317},
  annote =	{Keywords: Vector addition systems, Petri nets, covering, boundedness, computational complexity}
}