17 Search Results for "Block, Alexander R."


Document
Matching Algorithms in the Sparse Stochastic Block Model

Authors: Anna Brandenberger, Byron Chin, Nathan S. Sheffield, and Divya Shyamal

Published in: LIPIcs, Volume 302, 35th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2024)


Abstract
In sparse Erdős-Rényi graphs, it is known that a linear-time algorithm of Karp and Sipser achieves near-optimal matching sizes asymptotically almost surely, giving a law of large numbers for the matching numbers of such graphs in terms of solutions to an ODE [Jonathan Aronson et al., 1998]. We provide an extension of this analysis, identifying broad ranges of stochastic block model parameters for which the Karp-Sipser algorithm achieves near-optimal matching sizes, but demonstrating that it cannot perform optimally on general stochastic block model instances. We also consider the problem of constructing a matching online, in which the vertices of one half of a bipartite stochastic block model arrive one at a time and must be matched as they arrive. We show that, when the expected degrees in all communities are equal, the competitive ratio lower bound of 0.837 found by Mastin and Jaillet for the Erdős-Rényi case [Andrew Mastin and Patrick Jaillet, 2013] is achieved by a simple greedy algorithm, and that this competitive ratio is optimal. We then propose and analyze a linear-time online matching algorithm with better performance in general stochastic block models.
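
The Karp-Sipser heuristic referenced above is simple enough to state in a few lines. The following Python sketch is purely illustrative (it is not the paper's implementation and ignores the linear-time bookkeeping): repeatedly match a degree-1 vertex to its unique neighbor, and when no such vertex exists, match an arbitrary remaining edge.

import random

def karp_sipser_matching(adj):
    """Karp-Sipser heuristic: adj maps each vertex to a set of neighbors.
    Returns a matching as a list of (u, v) pairs. Illustrative sketch only."""
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # local mutable copy
    matching = []

    def remove_vertex(v):
        for u in adj.pop(v, set()):
            adj[u].discard(v)

    while any(adj.values()):
        # Prefer the safe step: match a degree-1 vertex to its unique neighbor.
        pendant = next((v for v, nbrs in adj.items() if len(nbrs) == 1), None)
        if pendant is not None:
            v = pendant
            u = next(iter(adj[pendant]))
        else:
            # No pendant vertex: match an arbitrary remaining edge (the random phase).
            v = random.choice([x for x, nbrs in adj.items() if nbrs])
            u = random.choice(list(adj[v]))
        matching.append((u, v))
        remove_vertex(u)
        remove_vertex(v)
    return matching

# Example: a path on 4 vertices has a perfect matching of size 2.
print(karp_sipser_matching({0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}))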

Cite as

Anna Brandenberger, Byron Chin, Nathan S. Sheffield, and Divya Shyamal. Matching Algorithms in the Sparse Stochastic Block Model. In 35th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 302, pp. 16:1-16:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{brandenberger_et_al:LIPIcs.AofA.2024.16,
  author =	{Brandenberger, Anna and Chin, Byron and Sheffield, Nathan S. and Shyamal, Divya},
  title =	{{Matching Algorithms in the Sparse Stochastic Block Model}},
  booktitle =	{35th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2024)},
  pages =	{16:1--16:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-329-4},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{302},
  editor =	{Mailler, C\'{e}cile and Wild, Sebastian},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.AofA.2024.16},
  URN =		{urn:nbn:de:0030-drops-204515},
  doi =		{10.4230/LIPIcs.AofA.2024.16},
  annote =	{Keywords: Matching Algorithms, Online Matching, Stochastic Block Model}
}
Document
Asymptotically-Good RLCCs with (log n)^(2+o(1)) Queries

Authors: Gil Cohen and Tal Yankovitz

Published in: LIPIcs, Volume 300, 39th Computational Complexity Conference (CCC 2024)


Abstract
Recently, Kumar and Mon reached a significant milestone by constructing asymptotically good relaxed locally correctable codes (RLCCs) with poly-logarithmic query complexity. Specifically, they constructed n-bit RLCCs with O(log^{69} n) queries. Their construction relies on a clever reduction to locally testable codes (LTCs), capitalizing on recent breakthrough works in LTCs. As for lower bounds, Gur and Lachish (SICOMP 2021) proved that any asymptotically-good RLCC must make Ω̃(√{log n}) queries. Hence emerges the intriguing question regarding the identity of the least value 1/2 ≤ e ≤ 69 for which asymptotically-good RLCCs with query complexity (log n)^{e+o(1)} exist. In this work, we make substantial progress in narrowing the gap by devising asymptotically-good RLCCs with a query complexity of (log n)^{2+o(1)}. The key insight driving our work lies in recognizing that the strong guarantee of local testability overshoots the requirements for the Kumar-Mon reduction. In particular, we prove that we can replace the LTCs by "vanilla" expander codes which indeed have the necessary property: local testability in the code’s vicinity.

Cite as

Gil Cohen and Tal Yankovitz. Asymptotically-Good RLCCs with (log n)^(2+o(1)) Queries. In 39th Computational Complexity Conference (CCC 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 300, pp. 8:1-8:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{cohen_et_al:LIPIcs.CCC.2024.8,
  author =	{Cohen, Gil and Yankovitz, Tal},
  title =	{{Asymptotically-Good RLCCs with (log n)^(2+o(1)) Queries}},
  booktitle =	{39th Computational Complexity Conference (CCC 2024)},
  pages =	{8:1--8:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-331-7},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{300},
  editor =	{Santhanam, Rahul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2024.8},
  URN =		{urn:nbn:de:0030-drops-204045},
  doi =		{10.4230/LIPIcs.CCC.2024.8},
  annote =	{Keywords: Relaxed locally decodable codes, Relaxed locally correctable codes, RLCC, RLDC}
}
Document
Lifting Dichotomies

Authors: Yaroslav Alekseev, Yuval Filmus, and Alexander Smal

Published in: LIPIcs, Volume 300, 39th Computational Complexity Conference (CCC 2024)


Abstract
Lifting theorems are used for transferring lower bounds between Boolean function complexity measures. Given a lower bound on a complexity measure A for some function f, we compose f with a carefully chosen gadget function g and get essentially the same lower bound on a complexity measure B for the lifted function f ⋄ g. Lifting theorems have a number of applications in many different areas such as circuit complexity, communication complexity, proof complexity, etc. One of the main questions in the context of lifting is how to choose a suitable gadget g. Generally, to get better results, i.e., to minimize the losses when transferring lower bounds, we need the gadget to be of constant size (number of inputs). Unfortunately, in many settings we know lifting results only for gadgets whose size grows with the size of f, and it is unclear whether this can be improved to a constant-size gadget. This motivates us to identify the properties of gadgets that make lifting possible. In this paper, we systematically study the question "For which gadgets does the lifting result hold?" in the following four settings: lifting from decision tree depth to decision tree size, lifting from conjunction DAG width to conjunction DAG size, lifting from decision tree depth to parity decision tree depth and size, and lifting from block sensitivity to deterministic and randomized communication complexities. In all cases, we prove a complete classification of gadgets by exposing the properties of gadgets that make lifting results hold. The structure of the results shows that there are no intermediate cases: for every gadget there is either a polynomial lifting or no lifting at all. As a byproduct of our study, we prove the log-rank conjecture for the class of functions that can be represented as f ⋄ OR ⋄ XOR for some function f. In this extended abstract, the proofs are omitted. Full proofs are given in the full version [Yaroslav Alekseev et al., 2024].
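
For readers unfamiliar with the composition f ⋄ g, the following Python sketch shows how a lifted function is evaluated: each input of f is replaced by a copy of the gadget g applied to its own block of inputs. The particular choices of f, g, and the gadget arity are illustrative, not taken from the paper.

from functools import reduce
from operator import xor

def lift(f, g, gadget_arity):
    """Return the lifted function f ⋄ g: the i-th input of f is computed by
    applying the gadget g to the i-th block of gadget_arity input bits."""
    def lifted(bits):
        assert len(bits) % gadget_arity == 0
        blocks = [bits[i:i + gadget_arity] for i in range(0, len(bits), gadget_arity)]
        return f([g(block) for block in blocks])
    return lifted

# Illustrative choices (not from the paper): f = OR on n bits, g = XOR on 2 bits.
f = lambda xs: int(any(xs))
g = lambda block: reduce(xor, block)

or_xor = lift(f, g, gadget_arity=2)
print(or_xor([0, 0, 1, 1]))  # blocks (0,0) -> 0 and (1,1) -> 0, so OR = 0
print(or_xor([0, 1, 1, 1]))  # blocks (0,1) -> 1 and (1,1) -> 0, so OR = 1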

Cite as

Yaroslav Alekseev, Yuval Filmus, and Alexander Smal. Lifting Dichotomies. In 39th Computational Complexity Conference (CCC 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 300, pp. 9:1-9:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{alekseev_et_al:LIPIcs.CCC.2024.9,
  author =	{Alekseev, Yaroslav and Filmus, Yuval and Smal, Alexander},
  title =	{{Lifting Dichotomies}},
  booktitle =	{39th Computational Complexity Conference (CCC 2024)},
  pages =	{9:1--9:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-331-7},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{300},
  editor =	{Santhanam, Rahul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2024.9},
  URN =		{urn:nbn:de:0030-drops-204051},
  doi =		{10.4230/LIPIcs.CCC.2024.9},
  annote =	{Keywords: decision trees, log-rank conjecture, lifting, parity decision trees}
}
Document
Two-Dimensional Kripke Semantics I: Presheaves

Authors: G. A. Kavvos

Published in: LIPIcs, Volume 299, 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)


Abstract
The study of modal logic has witnessed tremendous development following the introduction of Kripke semantics. However, recent developments in programming languages and type theory have led to a second way of studying modalities, namely through their categorical semantics. We show how the two correspond.

Cite as

G. A. Kavvos. Two-Dimensional Kripke Semantics I: Presheaves. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 14:1-14:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{kavvos:LIPIcs.FSCD.2024.14,
  author =	{Kavvos, G. A.},
  title =	{{Two-Dimensional Kripke Semantics I: Presheaves}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{14:1--14:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.14},
  URN =		{urn:nbn:de:0030-drops-203438},
  doi =		{10.4230/LIPIcs.FSCD.2024.14},
  annote =	{Keywords: modal logic, categorical semantics, Kripke semantics, duality, open maps}
}
Document
Track A: Algorithms, Complexity and Games
Learning Low-Degree Quantum Objects

Authors: Srinivasan Arunachalam, Arkopal Dutt, Francisco Escudero Gutiérrez, and Carlos Palazuelos

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
We consider the problem of learning low-degree quantum objects up to ε-error in 𝓁₂-distance. We show the following results: (i) unknown n-qubit degree-d (in the Pauli basis) quantum channels and unitaries can be learned using O(1/ε^d) queries (which is independent of n), (ii) polynomials p:{-1,1}ⁿ → [-1,1] arising from d-query quantum algorithms can be learned from O((1/ε)^d ⋅ log n) many random examples (x,p(x)) (which implies learnability even for d = O(log n)), and (iii) degree-d polynomials p:{-1,1}ⁿ → [-1,1] can be learned through O(1/ε^d) queries to a quantum unitary U_p that block-encodes p. Our main technical contributions are new Bohnenblust-Hille inequalities for quantum channels and completely bounded polynomials.

Cite as

Srinivasan Arunachalam, Arkopal Dutt, Francisco Escudero Gutiérrez, and Carlos Palazuelos. Learning Low-Degree Quantum Objects. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 13:1-13:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{arunachalam_et_al:LIPIcs.ICALP.2024.13,
  author =	{Arunachalam, Srinivasan and Dutt, Arkopal and Escudero Guti\'{e}rrez, Francisco and Palazuelos, Carlos},
  title =	{{Learning Low-Degree Quantum Objects}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{13:1--13:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.13},
  URN =		{urn:nbn:de:0030-drops-201563},
  doi =		{10.4230/LIPIcs.ICALP.2024.13},
  annote =	{Keywords: Tomography}
}
Document
Track B: Automata, Logic, Semantics, and Theory of Programming
An Efficient Quantifier Elimination Procedure for Presburger Arithmetic

Authors: Christoph Haase, Shankara Narayanan Krishna, Khushraj Madnani, Om Swostik Mishra, and Georg Zetzsche

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
All known quantifier elimination procedures for Presburger arithmetic require doubly exponential time for eliminating a single block of existentially quantified variables. It has even been claimed in the literature that this upper bound is tight. We observe that this claim is incorrect and develop, as the main result of this paper, a quantifier elimination procedure eliminating a block of existentially quantified variables in singly exponential time. As corollaries, we can establish the precise complexity of numerous problems. Examples include deciding (i) monadic decomposability for existential formulas, (ii) whether an existential formula defines a well-quasi ordering or, more generally, (iii) certain formulas of Presburger arithmetic with Ramsey quantifiers. Moreover, despite the exponential blowup, our procedure shows that under mild assumptions, even NP upper bounds for decision problems about quantifier-free formulas can be transferred to existential formulas. The technical basis of our results is a kind of small model property for parametric integer programming that generalizes the seminal results by von zur Gathen and Sieveking on small integer points in convex polytopes.
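
As a reminder of what quantifier elimination produces in Presburger arithmetic, here is a standard textbook-style example (not one of the paper's results): eliminating an existentially quantified variable may introduce divisibility constraints and case splits over the remaining variables.

% Simplest case: the eliminated formula becomes a divisibility constraint.
\exists x\,\bigl(y = 2x\bigr) \;\equiv\; 2 \mid y
% With inequalities, a case split over the remaining variables appears as well.
\exists x\,\bigl(y \le 2x \wedge 2x \le z\bigr) \;\equiv\;
  \bigl(y \le z \wedge 2 \mid z\bigr) \;\vee\; \bigl(y \le z - 1 \wedge 2 \mid (z-1)\bigr)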

Cite as

Christoph Haase, Shankara Narayanan Krishna, Khushraj Madnani, Om Swostik Mishra, and Georg Zetzsche. An Efficient Quantifier Elimination Procedure for Presburger Arithmetic. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 142:1-142:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{haase_et_al:LIPIcs.ICALP.2024.142,
  author =	{Haase, Christoph and Krishna, Shankara Narayanan and Madnani, Khushraj and Mishra, Om Swostik and Zetzsche, Georg},
  title =	{{An Efficient Quantifier Elimination Procedure for Presburger Arithmetic}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{142:1--142:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.142},
  URN =		{urn:nbn:de:0030-drops-202856},
  doi =		{10.4230/LIPIcs.ICALP.2024.142},
  annote =	{Keywords: Presburger arithmetic, quantifier elimination, parametric integer programming, convex geometry}
}
Document
Detecting Causality in the Presence of Byzantine Processes: The Synchronous Systems Case

Authors: Anshuman Misra and Ajay D. Kshemkalyani

Published in: LIPIcs, Volume 278, 30th International Symposium on Temporal Representation and Reasoning (TIME 2023)


Abstract
Detecting causality, or the happens-before relation, between events in a distributed system is a fundamental building block for distributed applications. It was recently proved that this problem cannot be solved in an asynchronous distributed system in the presence of Byzantine processes, irrespective of whether the communication mechanism is via unicasts, multicasts, or broadcasts. In light of this impossibility result, we turn our attention to synchronous systems and examine the possibility of solving the causality detection problem in such systems. In this paper, we prove that causality detection between events can be solved in the presence of Byzantine processes in a synchronous distributed system. The positive result holds for unicast, multicast, as well as broadcast modes of communication. We prove the result by providing an algorithm. Our solution uses the Replicated State Machine (RSM) approach and vector clocks.
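
The vector clocks mentioned above decide the happens-before relation by componentwise comparison; a short generic Python sketch follows (this is the standard fault-free rule, not the paper's Byzantine-tolerant algorithm).

def happens_before(vc_a, vc_b):
    """Return True iff event A (vector clock vc_a) happens before event B (vc_b),
    i.e., vc_a <= vc_b componentwise and vc_a != vc_b. Generic sketch only."""
    return all(a <= b for a, b in zip(vc_a, vc_b)) and vc_a != vc_b

def concurrent(vc_a, vc_b):
    """Events are concurrent iff neither happens before the other."""
    return not happens_before(vc_a, vc_b) and not happens_before(vc_b, vc_a)

# Example with three processes:
print(happens_before([1, 0, 0], [2, 1, 0]))  # True: A causally precedes B
print(concurrent([1, 0, 0], [0, 1, 0]))      # True: neither precedes the other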

Cite as

Anshuman Misra and Ajay D. Kshemkalyani. Detecting Causality in the Presence of Byzantine Processes: The Synchronous Systems Case. In 30th International Symposium on Temporal Representation and Reasoning (TIME 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 278, pp. 11:1-11:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{misra_et_al:LIPIcs.TIME.2023.11,
  author =	{Misra, Anshuman and Kshemkalyani, Ajay D.},
  title =	{{Detecting Causality in the Presence of Byzantine Processes: The Synchronous Systems Case}},
  booktitle =	{30th International Symposium on Temporal Representation and Reasoning (TIME 2023)},
  pages =	{11:1--11:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-298-3},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{278},
  editor =	{Artikis, Alexander and Bruse, Florian and Hunsberger, Luke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TIME.2023.11},
  URN =		{urn:nbn:de:0030-drops-191017},
  doi =		{10.4230/LIPIcs.TIME.2023.11},
  annote =	{Keywords: Byzantine fault-tolerance, causality, happens before, distributed system, message-passing, synchronous system}
}
Document
Extended Abstract
Answer Set Automata: A Learnable Pattern Specification Framework for Complex Event Recognition (Extended Abstract)

Authors: Nikos Katzouris and Georgios Paliouras

Published in: LIPIcs, Volume 278, 30th International Symposium on Temporal Representation and Reasoning (TIME 2023)


Abstract
Complex Event Recognition (CER) systems detect event occurrences in streaming input using predefined event patterns. Techniques that learn event patterns from data are highly desirable in CER. Since such patterns are typically represented by symbolic automata, we propose a family of such automata where the transition-enabling conditions are defined by Answer Set Programming (ASP) rules, and which, thanks to the strong connections of ASP to symbolic learning, are learnable from data. We present such a learning approach in ASP, capable of jointly learning the structure of an automaton and its transition guards' definitions from building-block predicates, and a scalable, incremental version thereof that progressively revises models learnt from mini-batches using Monte Carlo Tree Search. We evaluate our approach on three CER datasets and empirically demonstrate its efficacy.
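
To make the notion of a symbolic automaton concrete, here is a toy Python sketch in which transition guards are ordinary predicates over input events. In the paper the guards are ASP rules learned from data, so the class, states, and pattern below are purely illustrative.

class SymbolicAutomaton:
    """Toy symbolic automaton: transitions are enabled by predicates over
    input events rather than by single alphabet symbols."""

    def __init__(self, start, accepting, transitions):
        # transitions: list of (src_state, guard_predicate, dst_state)
        self.start, self.accepting, self.transitions = start, accepting, transitions

    def accepts(self, events):
        state = self.start
        for e in events:
            for src, guard, dst in self.transitions:
                if src == state and guard(e):
                    state = dst
                    break
        return state in self.accepting

# Illustrative pattern: a high-speed reading eventually followed by a stop.
auto = SymbolicAutomaton(
    start="q0",
    accepting={"q2"},
    transitions=[
        ("q0", lambda e: e["speed"] > 30, "q1"),
        ("q1", lambda e: e["speed"] == 0, "q2"),
    ],
)
print(auto.accepts([{"speed": 40}, {"speed": 10}, {"speed": 0}]))  # True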

Cite as

Nikos Katzouris and Georgios Paliouras. Answer Set Automata: A Learnable Pattern Specification Framework for Complex Event Recognition (Extended Abstract). In 30th International Symposium on Temporal Representation and Reasoning (TIME 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 278, pp. 17:1-17:3, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{katzouris_et_al:LIPIcs.TIME.2023.17,
  author =	{Katzouris, Nikos and Paliouras, Georgios},
  title =	{{Answer Set Automata: A Learnable Pattern Specification Framework for Complex Event Recognition}},
  booktitle =	{30th International Symposium on Temporal Representation and Reasoning (TIME 2023)},
  pages =	{17:1--17:3},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-298-3},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{278},
  editor =	{Artikis, Alexander and Bruse, Florian and Hunsberger, Luke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TIME.2023.17},
  URN =		{urn:nbn:de:0030-drops-191071},
  doi =		{10.4230/LIPIcs.TIME.2023.17},
  annote =	{Keywords: Event Pattern Learning, Answer Set Programming}
}
Document
A Graph-Theoretic Formulation of Exploratory Blockmodeling

Authors: Alexander Bille, Niels Grüttemeier, Christian Komusiewicz, and Nils Morawietz

Published in: LIPIcs, Volume 265, 21st International Symposium on Experimental Algorithms (SEA 2023)


Abstract
We present a new, simple graph-theoretic formulation of the exploratory blockmodeling problem on undirected and unweighted one-mode networks. Our formulation takes as input the network G and the maximum number t of blocks for the solution model. The task is to find a minimum-size set of edge insertions and deletions that transform the input graph G into a graph G' with at most t neighborhood classes. Herein, a neighborhood class is a maximal set of vertices with the same neighborhood. The neighborhood classes of G' directly give the blocks and block interactions of the computed blockmodel. We analyze the classical and parameterized complexity of the exploratory blockmodeling problem, and provide a branch-and-bound algorithm, an ILP formulation, and several heuristics. Finally, we compare our exact algorithms to previous ILP-based approaches and show that the new algorithms are faster for t ≥ 4.
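
The neighborhood classes used in this formulation are straightforward to compute. The short Python sketch below is illustrative only and takes "same neighborhood" to mean the same open neighborhood N(v); the paper's exact definition may differ on whether v itself is included.

from collections import defaultdict

def neighborhood_classes(adj):
    """Group vertices by their (open) neighborhood. adj maps each vertex to a
    set of neighbors. A graph satisfies the blockmodel constraint with
    parameter t exactly when this returns at most t classes."""
    classes = defaultdict(list)
    for v, nbrs in adj.items():
        classes[frozenset(nbrs)].append(v)
    return list(classes.values())

# The complete bipartite graph K_{2,2}: each side forms one neighborhood class.
adj = {0: {2, 3}, 1: {2, 3}, 2: {0, 1}, 3: {0, 1}}
print(neighborhood_classes(adj))  # [[0, 1], [2, 3]] -> 2 neighborhood classes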

Cite as

Alexander Bille, Niels Grüttemeier, Christian Komusiewicz, and Nils Morawietz. A Graph-Theoretic Formulation of Exploratory Blockmodeling. In 21st International Symposium on Experimental Algorithms (SEA 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 265, pp. 14:1-14:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bille_et_al:LIPIcs.SEA.2023.14,
  author =	{Bille, Alexander and Gr\"{u}ttemeier, Niels and Komusiewicz, Christian and Morawietz, Nils},
  title =	{{A Graph-Theoretic Formulation of Exploratory Blockmodeling}},
  booktitle =	{21st International Symposium on Experimental Algorithms (SEA 2023)},
  pages =	{14:1--14:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-279-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{265},
  editor =	{Georgiadis, Loukas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SEA.2023.14},
  URN =		{urn:nbn:de:0030-drops-183648},
  doi =		{10.4230/LIPIcs.SEA.2023.14},
  annote =	{Keywords: Clustering, Exact Algorithms, ILP-Formulation, Branch-and-Bound, Social Networks}
}
Document
On Relaxed Locally Decodable Codes for Hamming and Insertion-Deletion Errors

Authors: Alexander R. Block, Jeremiah Blocki, Kuan Cheng, Elena Grigorescu, Xin Li, Yu Zheng, and Minshen Zhu

Published in: LIPIcs, Volume 264, 38th Computational Complexity Conference (CCC 2023)


Abstract
Locally Decodable Codes (LDCs) are error-correcting codes C:Σⁿ → Σ^m, encoding messages in Σⁿ to codewords in Σ^m, with super-fast decoding algorithms. They are important mathematical objects in many areas of theoretical computer science, yet the best constructions so far have codeword length m that is super-polynomial in n, for codes with constant query complexity and constant alphabet size. In a very surprising result, Ben-Sasson, Goldreich, Harsha, Sudan, and Vadhan (SICOMP 2006) show how to construct a relaxed version of LDCs (RLDCs) with constant query complexity and almost linear codeword length over the binary alphabet, and used them to obtain significantly-improved constructions of Probabilistically Checkable Proofs. In this work, we study RLDCs in the standard Hamming-error setting, and introduce their variants in the insertion and deletion (Insdel) error setting. Standard LDCs for Insdel errors were first studied by Ostrovsky and Paskin-Cherniavsky (Information Theoretic Security, 2015), and are further motivated by recent advances in DNA random access bio-technologies. Our first result is an exponential lower bound on the length of Hamming RLDCs making 2 queries (even adaptively), over the binary alphabet. This answers a question explicitly raised by Gur and Lachish (SICOMP 2021) and is the first exponential lower bound for RLDCs. Combined with the results of Ben-Sasson et al., our result exhibits a "phase-transition"-type behavior on the codeword length for some constant-query complexity. We achieve these lower bounds via a transformation of RLDCs to standard Hamming LDCs, using a careful analysis of restrictions of message bits that fix codeword bits. We further define two variants of RLDCs in the Insdel-error setting, a weak and a strong version. On the one hand, we construct weak Insdel RLDCs with almost linear codeword length and constant query complexity, matching the parameters of the Hamming variants. On the other hand, we prove exponential lower bounds for strong Insdel RLDCs. These results demonstrate that, while these variants are equivalent in the Hamming setting, they are significantly different in the insdel setting. Our results also prove a strict separation between Hamming RLDCs and Insdel RLDCs.

Cite as

Alexander R. Block, Jeremiah Blocki, Kuan Cheng, Elena Grigorescu, Xin Li, Yu Zheng, and Minshen Zhu. On Relaxed Locally Decodable Codes for Hamming and Insertion-Deletion Errors. In 38th Computational Complexity Conference (CCC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 264, pp. 14:1-14:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{block_et_al:LIPIcs.CCC.2023.14,
  author =	{Block, Alexander R. and Blocki, Jeremiah and Cheng, Kuan and Grigorescu, Elena and Li, Xin and Zheng, Yu and Zhu, Minshen},
  title =	{{On Relaxed Locally Decodable Codes for Hamming and Insertion-Deletion Errors}},
  booktitle =	{38th Computational Complexity Conference (CCC 2023)},
  pages =	{14:1--14:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-282-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{264},
  editor =	{Ta-Shma, Amnon},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2023.14},
  URN =		{urn:nbn:de:0030-drops-182847},
  doi =		{10.4230/LIPIcs.CCC.2023.14},
  annote =	{Keywords: Relaxed Locally Decodable Codes, Hamming Errors, Insdel Errors, Lower Bounds}
}
Document
A Fast Data Structure for Dynamic Graphs Based on Hash-Indexed Adjacency Blocks

Authors: Alexander van der Grinten, Maria Predari, and Florian Willich

Published in: LIPIcs, Volume 233, 20th International Symposium on Experimental Algorithms (SEA 2022)


Abstract
Several dynamic graph data structures have been proposed in the literature. Yet, these data structures either offer limited support for arbitrary graph algorithms or they are designed as part of specific frameworks (e.g., for GPUs or specialized hardware). Such frameworks are difficult to adapt to arbitrary graph computations and lead practitioners to fall back to less sophisticated solutions when dealing with dynamic graphs. In this work, we propose a new "dynamic hashed blocks" (DHB) data structure for sparse dynamic graphs and matrices on general-purpose CPU architectures. DHB combines an efficient block-based memory layout to store incident edges with an additional per-vertex hash index for high-degree vertices. This hash index allows us to quickly insert edges without introducing duplicates, while the block-based memory layout retains the advantageous cache locality properties of traditional adjacency arrays. Experiments show that DHB outperforms competing dynamic graph structures for edge insertions, updates, deletions, and traversal operations. Compared to static CSR layouts, DHB exhibits only a small overhead in traversal performance. DHB’s interface is similar to general-purpose abstract graph data types and can easily be used as a drop-in replacement for traditional adjacency arrays. To demonstrate this, we modify the well-known NetworKit framework to use DHB instead of its own dynamic graph representation. Experiments show that this modification only slightly penalizes the performance of graph algorithms while considerably boosting update rates.
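
A heavily simplified Python sketch of the idea behind DHB follows: per-vertex edge blocks plus a hash index that is added once a vertex's degree grows large, so duplicate checks stay cheap. The class, threshold, and method names are illustrative; the actual data structure is a cache-optimized C++ design.

class HashIndexedAdjacency:
    """Toy model of hash-indexed adjacency blocks: each vertex keeps a
    contiguous list of neighbors (the "block") and, once its degree passes a
    threshold, an auxiliary hash set that makes duplicate checks O(1)."""

    HASH_THRESHOLD = 8  # degree above which a per-vertex hash index is kept

    def __init__(self):
        self.blocks = {}   # vertex -> list of neighbors (insertion order)
        self.index = {}    # vertex -> set of neighbors, only for high degrees

    def insert_edge(self, u, v):
        block = self.blocks.setdefault(u, [])
        idx = self.index.get(u)
        # Duplicate check: O(1) via the hash index, O(deg) by scanning the block.
        if (v in idx) if idx is not None else (v in block):
            return False
        block.append(v)
        if idx is not None:
            idx.add(v)
        elif len(block) > self.HASH_THRESHOLD:
            self.index[u] = set(block)  # promote to a hash-indexed vertex
        return True

    def neighbors(self, u):
        return self.blocks.get(u, [])

g = HashIndexedAdjacency()
g.insert_edge(0, 1)
g.insert_edge(0, 1)    # rejected as a duplicate
print(g.neighbors(0))  # [1]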

Cite as

Alexander van der Grinten, Maria Predari, and Florian Willich. A Fast Data Structure for Dynamic Graphs Based on Hash-Indexed Adjacency Blocks. In 20th International Symposium on Experimental Algorithms (SEA 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 233, pp. 11:1-11:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{vandergrinten_et_al:LIPIcs.SEA.2022.11,
  author =	{van der Grinten, Alexander and Predari, Maria and Willich, Florian},
  title =	{{A Fast Data Structure for Dynamic Graphs Based on Hash-Indexed Adjacency Blocks}},
  booktitle =	{20th International Symposium on Experimental Algorithms (SEA 2022)},
  pages =	{11:1--11:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-251-8},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{233},
  editor =	{Schulz, Christian and U\c{c}ar, Bora},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SEA.2022.11},
  URN =		{urn:nbn:de:0030-drops-165453},
  doi =		{10.4230/LIPIcs.SEA.2022.11},
  annote =	{Keywords: dynamic graph data structures, sparse matrix layout, dynamic algorithms, parallel algorithms, graph analysis}
}
Document
P₄-free Partition and Cover Numbers & Applications

Authors: Alexander R. Block, Simina Brânzei, Hemanta K. Maji, Himanshi Mehta, Tamalika Mukherjee, and Hai H. Nguyen

Published in: LIPIcs, Volume 199, 2nd Conference on Information-Theoretic Cryptography (ITC 2021)


Abstract
P₄-free graphs, also known as cographs, complement-reducible graphs, or hereditary Dacey graphs, have been well studied in graph theory. Motivated by computer science and information theory applications, our work encodes (flat) joint probability distributions and Boolean functions as bipartite graphs and studies bipartite P₄-free graphs. For these applications, the graph properties of edge partitioning and covering a bipartite graph using the minimum number of these graphs are particularly relevant. Previously, such graph properties have appeared in leakage-resilient cryptography and (variants of) coloring problems. Interestingly, our covering problem is closely related to the well-studied problem of the product (a.k.a. Prague) dimension of loopless undirected graphs, which allows us to employ algebraic lower-bounding techniques for the product/Prague dimension. We prove that computing these numbers is NP-complete, even for bipartite graphs. We establish a connection to the (unsolved) Zarankiewicz problem to show that there are bipartite graphs with size-N partite sets such that these numbers are at least ε⋅N^{1-2ε}, for ε ∈ {1/3,1/4,1/5,...}. Finally, we accurately estimate these numbers for bipartite graphs encoding well-studied Boolean functions from circuit complexity, such as set intersection, set disjointness, and inequality. For applications in information theory and communication & cryptographic complexity, we consider a system where a setup samples from a (flat) joint distribution and gives the participants, Alice and Bob, their portions of this joint sample. Alice and Bob’s objective is to non-interactively establish a shared key and extract the left-over entropy from their portion of the samples as independent private randomness. A genie, who observes the joint sample, provides appropriate assistance to help Alice and Bob with their objective. Lower bounds on the minimum size of the genie’s assistance translate into communication and cryptographic lower bounds. We show that (the log₂ of) the P₄-free partition number of a graph encoding the joint distribution that the setup uses is equivalent to the size of the genie’s assistance. Consequently, the joint distributions corresponding to the bipartite graphs constructed above with high P₄-free partition numbers require more assistance from the genie. As a representative application in non-deterministic communication complexity, we study the communication complexity of nondeterministic protocols augmented by access to an equality oracle at the output. We show that (the log₂ of) the P₄-free cover number of the bipartite graph encoding a Boolean function f is equivalent to the minimum size of the nondeterministic input required by the parties (referred to as the communication complexity of f in this model). Consequently, the functions corresponding to the bipartite graphs with high P₄-free cover numbers have high communication complexity. Furthermore, there are functions with communication complexity close to that of the naïve protocol in which the nondeterministic input reveals a party’s input. Finally, access to the equality oracle reduces the communication complexity of computing set disjointness by a constant factor, in contrast to the model where parties do not have access to the equality oracle. To compute the inequality function, we show an exponential reduction in the communication complexity, and this bound is optimal. On the other hand, access to the equality oracle is (nearly) useless for computing set intersection.
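
Since the entire abstract revolves around P₄-free (sub)graphs, a brute-force Python check for an induced P₄ may help fix the definition; it is an illustrative sketch, not the paper's machinery.

from itertools import permutations

def has_induced_p4(vertices, edges):
    """Brute-force check for an induced path on 4 vertices (a P4).
    edges is any iterable of 2-element pairs. Illustrative sketch only."""
    E = {frozenset(e) for e in edges}
    adj = lambda u, v: frozenset((u, v)) in E
    for a, b, c, d in permutations(vertices, 4):
        path_edges = adj(a, b) and adj(b, c) and adj(c, d)
        non_edges = not adj(a, c) and not adj(a, d) and not adj(b, d)
        if path_edges and non_edges:
            return True
    return False

# The 4-cycle C4 (= K_{2,2}) is P4-free, but the path a-b-c-d itself is not.
print(has_induced_p4("abcd", [("a","b"), ("b","c"), ("c","d"), ("d","a")]))  # False
print(has_induced_p4("abcd", [("a","b"), ("b","c"), ("c","d")]))             # True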

Cite as

Alexander R. Block, Simina Brânzei, Hemanta K. Maji, Himanshi Mehta, Tamalika Mukherjee, and Hai H. Nguyen. P₄-free Partition and Cover Numbers & Applications. In 2nd Conference on Information-Theoretic Cryptography (ITC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 199, pp. 16:1-16:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{block_et_al:LIPIcs.ITC.2021.16,
  author =	{Block, Alexander R. and Br\^{a}nzei, Simina and Maji, Hemanta K. and Mehta, Himanshi and Mukherjee, Tamalika and Nguyen, Hai H.},
  title =	{{P₄-free Partition and Cover Numbers \& Applications}},
  booktitle =	{2nd Conference on Information-Theoretic Cryptography (ITC 2021)},
  pages =	{16:1--16:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-197-9},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{199},
  editor =	{Tessaro, Stefano},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITC.2021.16},
  URN =		{urn:nbn:de:0030-drops-143357},
  doi =		{10.4230/LIPIcs.ITC.2021.16},
  annote =	{Keywords: Secure keys, Secure private randomness, Gray-Wyner system, Cryptographic complexity, Nondeterministic communication complexity, Leakage-resilience, Combinatorial optimization, Product dimension, Zarankiewicz problem, Algebraic lower-bounding techniques, P₄-free partition number, P₄-free cover number}
}
Document
Invited Talk
Byzantine Agreement and SMR with Sub-Quadratic Message Complexity (Invited Talk)

Authors: Idit Keidar

Published in: LIPIcs, Volume 184, 24th International Conference on Principles of Distributed Systems (OPODIS 2020)


Abstract
Byzantine Agreement (BA) has been studied for four decades by now but, until recently, has been considered only at a fairly small scale. In recent years, however, we have begun to see practical use cases of BA in large-scale systems, which motivates a push for reduced communication complexity. Dolev and Reischuk’s well-known lower bound stipulates that any deterministic algorithm requires Ω(n²) communication in the worst case, and until fairly recently, almost all randomized algorithms have had at least quadratic complexity as well. This talk will present two new algorithms breaking this barrier. The first part of the talk will consider a fully asynchronous setting, focusing on randomized BA whose safety and liveness guarantees hold with high probability. It will present the first asynchronous Byzantine Agreement algorithm with sub-quadratic communication complexity. This algorithm exploits VRF-based committee sampling, which it adapts for the asynchronous model. The second part of the talk will consider the eventually synchronous model, where BA and State Machine Replication (SMR) can be solved with deterministic safety and liveness guarantees. In this context, randomization is used to reduce the expected communication complexity. The talk will present an algorithm for round synchronization, which is a building block for BA and SMR and constitutes the main performance bottleneck therein. This algorithm is the first to achieve round synchronization with expected linear message complexity and expected constant latency. Existing protocols can use this round synchronization algorithm to solve Byzantine SMR with the same asymptotic performance. The first part of the talk is based on joint work with Shir Cohen and Alexander Spiegelman, and the second part on joint work with Oded Naor.

Cite as

Idit Keidar. Byzantine Agreement and SMR with Sub-Quadratic Message Complexity (Invited Talk). In 24th International Conference on Principles of Distributed Systems (OPODIS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 184, p. 2:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{keidar:LIPIcs.OPODIS.2020.2,
  author =	{Keidar, Idit},
  title =	{{Byzantine Agreement and SMR with Sub-Quadratic Message Complexity}},
  booktitle =	{24th International Conference on Principles of Distributed Systems (OPODIS 2020)},
  pages =	{2:1--2:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-176-4},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{184},
  editor =	{Bramas, Quentin and Oshman, Rotem and Romano, Paolo},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.OPODIS.2020.2},
  URN =		{urn:nbn:de:0030-drops-134874},
  doi =		{10.4230/LIPIcs.OPODIS.2020.2},
  annote =	{Keywords: Distributed Computing, Byzantine Agreement}
}
Document
ACE: Abstract Consensus Encapsulation for Liveness Boosting of State Machine Replication

Authors: Alexander Spiegelman, Arik Rinberg, and Dahlia Malkhi

Published in: LIPIcs, Volume 184, 24th International Conference on Principles of Distributed Systems (OPODIS 2020)


Abstract
With the emergence of attack-prone cross-organization systems, providing asynchronous state machine replication (SMR) solutions is no longer a theoretical concern. This paper presents ACE, a framework for the design of such fault-tolerant systems. Leveraging a known paradigm for randomized consensus solutions, ACE wraps existing practical solutions and real-life systems, boosting their liveness under adversarial conditions and, at the same time, promoting load balancing and fairness. Boosting is achieved without modifying the overall design or the engineering of these solutions. ACE is aimed at boosting the prevailing approach for practical fault tolerance. This approach, often named partial synchrony, is based on a leader-based paradigm: a good leader makes progress and a bad leader does no harm. The partial synchrony approach focuses on safety and forgoes liveness under targeted and dynamic attacks. Specifically, an attacker might block specific leaders, e.g., through a denial of service, to prevent progress. ACE provides boosting by running waves of parallel leaders and selecting a winning leader only retroactively, achieving boosting at a linear increase in communication cost. ACE is agnostic to the fault model, inheriting its failure model from the wrapped solution’s assumptions. As our evaluation shows, an asynchronous Byzantine fault tolerance (BFT) replication system built with ACE around an existing partially synchronous BFT protocol demonstrates a reasonable slowdown compared with the base BFT protocol during faultless synchronous scenarios, yet exhibits a significant speedup while the system is under attack.

Cite as

Alexander Spiegelman, Arik Rinberg, and Dahlia Malkhi. ACE: Abstract Consensus Encapsulation for Liveness Boosting of State Machine Replication. In 24th International Conference on Principles of Distributed Systems (OPODIS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 184, pp. 9:1-9:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{spiegelman_et_al:LIPIcs.OPODIS.2020.9,
  author =	{Spiegelman, Alexander and Rinberg, Arik and Malkhi, Dahlia},
  title =	{{ACE: Abstract Consensus Encapsulation for Liveness Boosting of State Machine Replication}},
  booktitle =	{24th International Conference on Principles of Distributed Systems (OPODIS 2020)},
  pages =	{9:1--9:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-176-4},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{184},
  editor =	{Bramas, Quentin and Oshman, Rotem and Romano, Paolo},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.OPODIS.2020.9},
  URN =		{urn:nbn:de:0030-drops-134948},
  doi =		{10.4230/LIPIcs.OPODIS.2020.9},
  annote =	{Keywords: Framework, Asynchronous, Consensus boosting, State Machine Replication}
}
Document
Locally Decodable/Correctable Codes for Insertions and Deletions

Authors: Alexander R. Block, Jeremiah Blocki, Elena Grigorescu, Shubhang Kulkarni, and Minshen Zhu

Published in: LIPIcs, Volume 182, 40th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2020)


Abstract
Recent efforts in coding theory have focused on building codes for insertions and deletions, called insdel codes, with optimal trade-offs between their redundancy and their error-correction capabilities, as well as efficient encoding and decoding algorithms. In many applications, polynomial running time may still be prohibitively expensive, which has motivated the study of codes with super-efficient decoding algorithms. These have led to the well-studied notions of Locally Decodable Codes (LDCs) and Locally Correctable Codes (LCCs). Inspired by these notions, Ostrovsky and Paskin-Cherniavsky (Information Theoretic Security, 2015) generalized Hamming LDCs to insertions and deletions. To the best of our knowledge, these are the only known results that study the analogues of Hamming LDCs in channels performing insertions and deletions. Here we continue the study of insdel codes that admit local algorithms. Specifically, we reprove the results of Ostrovsky and Paskin-Cherniavsky for insdel LDCs using a different set of techniques. We also observe that the techniques extend to constructions of LCCs. Specifically, we obtain insdel LDCs and LCCs from their Hamming LDC and LCC analogues, respectively. The rate and error-correction capability blow up only by a constant factor, while the query complexity blows up by a polylog factor in the block length. Since insdel locally decodable/correctable codes are scarcely studied in the literature, we believe our results and techniques may lead to further research. In particular, we conjecture that constant-query insdel LDCs/LCCs do not exist.

Cite as

Alexander R. Block, Jeremiah Blocki, Elena Grigorescu, Shubhang Kulkarni, and Minshen Zhu. Locally Decodable/Correctable Codes for Insertions and Deletions. In 40th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 182, pp. 16:1-16:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{block_et_al:LIPIcs.FSTTCS.2020.16,
  author =	{Block, Alexander R. and Blocki, Jeremiah and Grigorescu, Elena and Kulkarni, Shubhang and Zhu, Minshen},
  title =	{{Locally Decodable/Correctable Codes for Insertions and Deletions}},
  booktitle =	{40th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2020)},
  pages =	{16:1--16:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-174-0},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{182},
  editor =	{Saxena, Nitin and Simon, Sunil},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2020.16},
  URN =		{urn:nbn:de:0030-drops-132577},
  doi =		{10.4230/LIPIcs.FSTTCS.2020.16},
  annote =	{Keywords: Locally decodable/correctable codes, insert-delete channel}
}