224 Search Results for "Oláh, Dániel"


Document
Recent Trends in Graph Decomposition (Dagstuhl Seminar 23331)

Authors: George Karypis, Christian Schulz, Darren Strash, Deepak Ajwani, Rob H. Bisseling, Katrin Casel, Ümit V. Çatalyürek, Cédric Chevalier, Florian Chudigiewitsch, Marcelo Fonseca Faraj, Michael Fellows, Lars Gottesbüren, Tobias Heuer, Kamer Kaya, Jakub Lacki, Johannes Langguth, Xiaoye Sherry Li, Ruben Mayer, Johannes Meintrup, Yosuke Mizutani, François Pellegrini, Fabrizio Petrini, Frances Rosamond, Ilya Safro, Sebastian Schlag, Roohani Sharma, Blair D. Sullivan, Bora Uçar, and Albert-Jan Yzelman

Published in: Dagstuhl Reports, Volume 13, Issue 8 (2024)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 23331 "Recent Trends in Graph Decomposition", which took place from 13 August to 18 August 2023. The seminar brought together 33 experts from academia and industry to discuss graph decomposition, a pivotal technique for handling massive graphs in applications such as social networks and scientific simulations. The seminar addressed the challenges posed by contemporary hardware designs, the potential of deep neural networks and reinforcement learning in developing heuristics, the unique optimization requirements of large sparse data, and the need for scalable algorithms suitable for emerging architectures. Through presentations, discussions, and collaborative sessions, the event fostered an exchange of innovative ideas, leading to the creation of community notes highlighting key open problems in the field.

Cite as

George Karypis, Christian Schulz, Darren Strash, Deepak Ajwani, Rob H. Bisseling, Katrin Casel, Ümit V. Çatalyürek, Cédric Chevalier, Florian Chudigiewitsch, Marcelo Fonseca Faraj, Michael Fellows, Lars Gottesbüren, Tobias Heuer, Kamer Kaya, Jakub Lacki, Johannes Langguth, Xiaoye Sherry Li, Ruben Mayer, Johannes Meintrup, Yosuke Mizutani, François Pellegrini, Fabrizio Petrini, Frances Rosamond, Ilya Safro, Sebastian Schlag, Roohani Sharma, Blair D. Sullivan, Bora Uçar, and Albert-Jan Yzelman. Recent Trends in Graph Decomposition (Dagstuhl Seminar 23331). In Dagstuhl Reports, Volume 13, Issue 8, pp. 1-45, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@Article{karypis_et_al:DagRep.13.8.1,
  author =	{Karypis, George and Schulz, Christian and Strash, Darren and Ajwani, Deepak and Bisseling, Rob H. and Casel, Katrin and \c{C}ataly\"{u}rek, \"{U}mit V. and Chevalier, C\'{e}dric and Chudigiewitsch, Florian and Faraj, Marcelo Fonseca and Fellows, Michael and Gottesb\"{u}ren, Lars and Heuer, Tobias and Kaya, Kamer and Lacki, Jakub and Langguth, Johannes and Li, Xiaoye Sherry and Mayer, Ruben and Meintrup, Johannes and Mizutani, Yosuke and Pellegrini, Fran\c{c}ois and Petrini, Fabrizio and Rosamond, Frances and Safro, Ilya and Schlag, Sebastian and Sharma, Roohani and Sullivan, Blair D. and U\c{c}ar, Bora and Yzelman, Albert-Jan},
  title =	{{Recent Trends in Graph Decomposition (Dagstuhl Seminar 23331)}},
  pages =	{1--45},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2024},
  volume =	{13},
  number =	{8},
  editor =	{Karypis, George and Schulz, Christian and Strash, Darren},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagRep.13.8.1},
  URN =		{urn:nbn:de:0030-drops-198114},
  doi =		{10.4230/DagRep.13.8.1},
  annote =	{Keywords: combinatorial optimization, experimental algorithmics, parallel algorithms}
}
Document
FPT Approximation of Generalised Hypertree Width for Bounded Intersection Hypergraphs

Authors: Matthias Lanzinger and Igor Razgon

Published in: LIPIcs, Volume 289, 41st International Symposium on Theoretical Aspects of Computer Science (STACS 2024)


Abstract
Generalised hypertree width (ghw) is a hypergraph parameter that is central to the tractability of many prominent problems with natural hypergraph structure. Computing ghw of a hypergraph is notoriously hard. The decision version of the problem, checking whether ghw(H) ≤ k, is paraNP-hard when parameterised by k. Furthermore, approximation of ghw is at least as hard as approximation of Set-Cover, which is known to not admit any FPT approximation algorithms. Research in the computation of ghw so far has focused on identifying structural restrictions to hypergraphs - such as bounds on the size of edge intersections - that permit XP algorithms for ghw. Yet, even under these restrictions the problem has so far evaded any kind of FPT algorithm. In this paper we make the first step towards FPT algorithms for ghw by showing that the parameter can be approximated in FPT time for hypergraphs of bounded edge intersection size. In concrete terms, we show that there exists an FPT algorithm, parameterised by k and d, that for input hypergraph H with maximal cardinality of edge intersections d and integer k either outputs a tree decomposition with ghw(H) ≤ 4k(k+d+1)(2k-1), or rejects, in which case it is guaranteed that ghw(H) > k. Thus, in the special case of hypergraphs of bounded edge intersection, we obtain an FPT O(k³)-approximation algorithm for ghw.
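
As a quick illustration of the guarantee stated above, the following sketch (mine, not the paper's algorithm) simply evaluates the width bound 4k(k+d+1)(2k-1) that the approximation certifies when it accepts, showing the O(k³) dependence for a fixed intersection bound d.

def ghw_guarantee(k: int, d: int) -> int:
    """Width bound certified by the FPT approximation when it accepts."""
    return 4 * k * (k + d + 1) * (2 * k - 1)

for k in (1, 2, 4, 8):
    # grows roughly like 8k^3 once k dominates d
    print(k, ghw_guarantee(k, d=2))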

Cite as

Matthias Lanzinger and Igor Razgon. FPT Approximation of Generalised Hypertree Width for Bounded Intersection Hypergraphs. In 41st International Symposium on Theoretical Aspects of Computer Science (STACS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 289, pp. 48:1-48:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{lanzinger_et_al:LIPIcs.STACS.2024.48,
  author =	{Lanzinger, Matthias and Razgon, Igor},
  title =	{{FPT Approximation of Generalised Hypertree Width for Bounded Intersection Hypergraphs}},
  booktitle =	{41st International Symposium on Theoretical Aspects of Computer Science (STACS 2024)},
  pages =	{48:1--48:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-311-9},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{289},
  editor =	{Beyersdorff, Olaf and Kant\'{e}, Mamadou Moustapha and Kupferman, Orna and Lokshtanov, Daniel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2024.48},
  URN =		{urn:nbn:de:0030-drops-197588},
  doi =		{10.4230/LIPIcs.STACS.2024.48},
  annote =	{Keywords: generalized hypertree width, hypergraphs, parameterized algorithms, approximation algorithms}
}
Document
On the Black-Box Complexity of Correlation Intractability

Authors: Nico Döttling and Tamer Mour

Published in: LIPIcs, Volume 287, 15th Innovations in Theoretical Computer Science Conference (ITCS 2024)


Abstract
Correlation intractability is an emerging cryptographic paradigm that enabled several recent breakthroughs in establishing soundness of the Fiat-Shamir transform and, consequently, basing non-interactive zero-knowledge proofs and succinct arguments on standard cryptographic assumptions. In a nutshell, a hash family is said to be correlation intractable for a class of relations ℛ if, for any relation R ∈ ℛ, it is hard given a random hash function h ← H to find an input z s.t. (z,h(z)) ∈ R, namely a correlation. Despite substantial progress in constructing correlation intractable hash functions, all constructions known to date are based on highly-structured hardness assumptions and, further, are of complexity scaling with the circuit complexity of the target relation class. In this work, we initiate the study of the barriers for building correlation intractability. Our main result is a lower bound on the complexity of any black-box construction of CIH from collision resistant hash (CRH), or one-way permutations (OWP), for any sufficiently expressive relation class. In particular, any such construction for a class of relations with circuit complexity t must make at least Ω(t) invocations of the underlying building block. We see this as a first step in developing a methodology towards broader lower bounds.
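
To make the definition concrete, here is a minimal sketch (hypothetical names, not from the paper) of the correlation-intractability game described above: a hash function h is drawn at random from the family, and the adversary wins if it outputs a z such that (z, h(z)) lies in the relation R.

import random

def ci_game(hash_family, in_relation, adversary) -> bool:
    """One run of the correlation-intractability game; True means the adversary wins."""
    h = random.choice(hash_family)   # h <- H, a random member of the hash family
    z = adversary(h)                 # the adversary gets the full description of h
    return in_relation(z, h(z))      # a "correlation": (z, h(z)) in R

# Toy instance: relation R = {(z, y) : y == 0}; a brute-force adversary over a tiny domain.
family = [lambda z, a=a: (3 * z + a) % 7 for a in range(7)]
print(ci_game(family, lambda z, y: y == 0, lambda h: next(z for z in range(7) if h(z) == 0)))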

Cite as

Nico Döttling and Tamer Mour. On the Black-Box Complexity of Correlation Intractability. In 15th Innovations in Theoretical Computer Science Conference (ITCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 287, pp. 40:1-40:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{dottling_et_al:LIPIcs.ITCS.2024.40,
  author =	{D\"{o}ttling, Nico and Mour, Tamer},
  title =	{{On the Black-Box Complexity of Correlation Intractability}},
  booktitle =	{15th Innovations in Theoretical Computer Science Conference (ITCS 2024)},
  pages =	{40:1--40:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-309-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{287},
  editor =	{Guruswami, Venkatesan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2024.40},
  URN =		{urn:nbn:de:0030-drops-195683},
  doi =		{10.4230/LIPIcs.ITCS.2024.40},
  annote =	{Keywords: Correlation Intractability, Fiat-Shamir, Black-box Complexity, Black-box Separations}
}
Document
Unified Almost Linear Kernels for Generalized Covering and Packing Problems on Nowhere Dense Classes

Authors: Jungho Ahn, Jinha Kim, and O-joung Kwon

Published in: LIPIcs, Volume 283, 34th International Symposium on Algorithms and Computation (ISAAC 2023)


Abstract
Let ℱ be a family of graphs, and let p,r be nonnegative integers. For a graph G and an integer k, the (p,r,ℱ)-Covering problem asks whether there is a set D ⊆ V(G) of size at most k such that if the p-th power of G has an induced subgraph isomorphic to a graph in ℱ, then it is at distance at most r from D. The (p,r,ℱ)-Packing problem asks whether G^p has k induced subgraphs H₁,…,H_k such that each H_i is isomorphic to a graph in ℱ, and for i,j ∈ {1,…,k}, the distance between V(H_i) and V(H_j) in G is larger than r. We show that for every fixed nonnegative integers p,r and every fixed nonempty finite family ℱ of connected graphs, (p,r,ℱ)-Covering with p ≤ 2r+1 and (p,r,ℱ)-Packing with p ≤ 2⌊r/2⌋+1 admit almost linear kernels on every nowhere dense class of graphs, parameterized by the solution size k. As corollaries, we prove that Distance-r Vertex Cover, Distance-r Matching, ℱ-Free Vertex Deletion, and Induced-ℱ-Packing for any fixed finite family ℱ of connected graphs admit almost linear kernels on every nowhere dense class of graphs. Our results extend the results for Distance-r Dominating Set by Drange et al. (STACS 2016) and Eickmeyer et al. (ICALP 2017), and for Distance-r Independent Set by Pilipczuk and Siebertz (EJC 2021).
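
As a concrete reference point, the brute-force sketch below (mine, using networkx) checks the Distance-r Dominating Set special case mentioned in the abstract, i.e. whether some set D of at most k vertices is within distance r of every vertex; it only illustrates the covering condition, not the kernelization.

import itertools
import networkx as nx

def has_distance_r_dominating_set(G: nx.Graph, r: int, k: int) -> bool:
    # dist[u] maps every vertex within distance r of u to its distance from u
    dist = dict(nx.all_pairs_shortest_path_length(G, cutoff=r))
    for D in itertools.combinations(G.nodes, min(k, G.number_of_nodes())):
        if all(any(v in dist[d] for d in D) for v in G.nodes):
            return True
    return False

# Example: the path on 7 vertices has a distance-1 dominating set of size 3.
print(has_distance_r_dominating_set(nx.path_graph(7), r=1, k=3))  # True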

Cite as

Jungho Ahn, Jinha Kim, and O-joung Kwon. Unified Almost Linear Kernels for Generalized Covering and Packing Problems on Nowhere Dense Classes. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 5:1-5:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{ahn_et_al:LIPIcs.ISAAC.2023.5,
  author =	{Ahn, Jungho and Kim, Jinha and Kwon, O-joung},
  title =	{{Unified Almost Linear Kernels for Generalized Covering and Packing Problems on Nowhere Dense Classes}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{5:1--5:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.5},
  URN =		{urn:nbn:de:0030-drops-193072},
  doi =		{10.4230/LIPIcs.ISAAC.2023.5},
  annote =	{Keywords: kernelization, independent set, dominating set, covering, packing}
}
Document
Matching Cuts in Graphs of High Girth and H-Free Graphs

Authors: Carl Feghali, Felicia Lucke, Daniël Paulusma, and Bernard Ries

Published in: LIPIcs, Volume 283, 34th International Symposium on Algorithms and Computation (ISAAC 2023)


Abstract
The (Perfect) Matching Cut problem is to decide if a connected graph has a (perfect) matching that is also an edge cut. The Disconnected Perfect Matching problem is to decide if a connected graph has a perfect matching that contains a matching cut. Both Matching Cut and Disconnected Perfect Matching are NP-complete for planar graphs of girth 5, whereas Perfect Matching Cut is known to be NP-complete even for subcubic bipartite graphs of arbitrarily large fixed girth. We prove that Matching Cut and Disconnected Perfect Matching are also NP-complete for bipartite graphs of arbitrarily large fixed girth and bounded maximum degree. Our result for Matching Cut resolves a 20-year-old open problem. We also show that the more general problem d-Cut, for every fixed d ≥ 1, is NP-complete for bipartite graphs of arbitrarily large fixed girth and bounded maximum degree. Furthermore, we show that Matching Cut, Perfect Matching Cut and Disconnected Perfect Matching are NP-complete for H-free graphs whenever H contains a connected component with two vertices of degree at least 3. Afterwards, we update the state-of-the-art summaries for H-free graphs and compare them with each other, and with a known and full classification of the Maximum Matching Cut problem, which is to determine a largest matching cut of a graph G. Finally, by combining existing results, we obtain a complete complexity classification of Perfect Matching Cut for H-subgraph-free graphs where H is any finite set of graphs.
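
For readers new to the notion, the following sketch (mine, not part of the paper) verifies that an edge set M witnesses a matching cut: M must be a matching and its removal must disconnect the graph.

import networkx as nx

def is_matching_cut(G: nx.Graph, M) -> bool:
    endpoints = [v for e in M for v in e]
    if len(endpoints) != len(set(endpoints)):   # two edges of M share a vertex,
        return False                            # so M is not a matching
    H = G.copy()
    H.remove_edges_from(M)
    return not nx.is_connected(H)               # M must also be an edge cut

# Example: in the 4-cycle 0-1-2-3-0, the two opposite edges form a matching cut.
print(is_matching_cut(nx.cycle_graph(4), [(0, 1), (2, 3)]))  # True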

Cite as

Carl Feghali, Felicia Lucke, Daniël Paulusma, and Bernard Ries. Matching Cuts in Graphs of High Girth and H-Free Graphs. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 31:1-31:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{feghali_et_al:LIPIcs.ISAAC.2023.31,
  author =	{Feghali, Carl and Lucke, Felicia and Paulusma, Dani\"{e}l and Ries, Bernard},
  title =	{{Matching Cuts in Graphs of High Girth and H-Free Graphs}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{31:1--31:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.31},
  URN =		{urn:nbn:de:0030-drops-193332},
  doi =		{10.4230/LIPIcs.ISAAC.2023.31},
  annote =	{Keywords: matching cut, perfect matching, girth, H-free graph}
}
Document
Invited Talk
The CP-SAT-LP Solver (Invited Talk)

Authors: Laurent Perron, Frédéric Didier, and Steven Gay

Published in: LIPIcs, Volume 280, 29th International Conference on Principles and Practice of Constraint Programming (CP 2023)


Abstract
The CP-SAT-LP solver is developed by the Operations Research team at Google and is part of the OR-Tools [Laurent Perron and Vincent Furnon, 2023] open-source optimization suite. It is an implementation of a purely integral Constraint Programming solver on top of a SAT solver using Lazy Clause Generation [Stuckey, 2010]. It draws its inspiration from the chuffed solver [Geoffrey Chu et al., 2023], and from the CP 2013 plenary by Peter Stuckey on Lazy Clause Generation [Stuckey, 2013]. The CP-SAT-LP solver improves upon the chuffed solver [Geoffrey Chu et al., 2023] in two main directions. First, it uses a simplex alongside the SAT engine. Second, it implements and relies upon a portfolio of diverse workers for its search part. The use of the simplex brings the obvious advantages of a linear relaxation on the linear part of the full model. It also started the integration of MIP technology into CP-SAT-LP. This is a huge endeavour, as MIP solvers are mature and complex. It includes presolve (which was already a part of CP-SAT), dual reductions, specific branching rules, cuts, reduced cost fixing, and more advanced techniques. It also allows tight integration of research from the MIP scheduling community [Balas, 1985; Applegate and Cook, 1991; Maurice Queyranne, 1993] along with the most advanced scheduling algorithms [Vilím, 2011]. This has enabled breakthroughs in solving and proving hard scheduling instances of the Job-Shop problems [Ding et al., 2019] and Resource-Constrained Project Scheduling Problems [Rainer Kolisch and Arno Sprecher, 1997; Artigues et al., 2008]. Using a portfolio of different workers makes it easier to try new ideas and to incorporate orthogonal techniques with little complication, except controlling the explosion of potential workers. These workers can be categorized along multiple criteria, like finding primal solutions (either using complete solvers, Local Search [Luteberget and Sartor, 2023], or Large Neighborhood Search [Paul Shaw, 1998]), improving dual bounds, or trying to reduce the problem with the help of continuous probing. This diversity of behaviors has increased the robustness of the solver, while the continuous sharing of information between workers has produced massive speedups when running multiple workers in parallel. All in all, CP-SAT-LP is a state-of-the-art solver, with unsurpassed performance in the Constraint Programming community, breakthrough results on Scheduling benchmarks (with the closure of many open problems), and competitive results with the best MIP solvers (on purely integral problems).
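
For context, this is what driving the solver looks like from its Python API in OR-Tools (a toy model of my own, not one of the scheduling benchmarks discussed in the talk); the portfolio of parallel workers is exposed through a single parameter.

from ortools.sat.python import cp_model   # pip install ortools

model = cp_model.CpModel()
x = model.NewIntVar(0, 10, "x")
y = model.NewIntVar(0, 10, "y")
model.Add(x + 2 * y <= 14)                 # linear part, relaxed by the internal simplex
model.AddAllDifferent([x, y])              # a typical CP global constraint
model.Maximize(x + y)

solver = cp_model.CpSolver()
solver.parameters.num_search_workers = 8   # portfolio of diverse parallel workers
status = solver.Solve(model)
if status in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    print(solver.Value(x), solver.Value(y), solver.ObjectiveValue())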

Cite as

Laurent Perron, Frédéric Didier, and Steven Gay. The CP-SAT-LP Solver (Invited Talk). In 29th International Conference on Principles and Practice of Constraint Programming (CP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 280, pp. 3:1-3:2, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{perron_et_al:LIPIcs.CP.2023.3,
  author =	{Perron, Laurent and Didier, Fr\'{e}d\'{e}ric and Gay, Steven},
  title =	{{The CP-SAT-LP Solver}},
  booktitle =	{29th International Conference on Principles and Practice of Constraint Programming (CP 2023)},
  pages =	{3:1--3:2},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-300-3},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{280},
  editor =	{Yap, Roland H. C.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2023.3},
  URN =		{urn:nbn:de:0030-drops-190405},
  doi =		{10.4230/LIPIcs.CP.2023.3},
  annote =	{Keywords: Constraint Programming, Operations Research, Sat Solver}
}
Document
Binary Constraint Trees and Structured Decomposability

Authors: Petr Kučera

Published in: LIPIcs, Volume 280, 29th International Conference on Principles and Practice of Constraint Programming (CP 2023)


Abstract
A binary constraint tree (BCT, Wang and Yap 2022) is a normalized binary CSP whose constraint graph is a tree. A BCT constraint is a constraint represented with a BCT where some of the variables may be hidden (i.e. existentially quantified and used only for internal representation). Structured decomposable negation normal forms (SDNNF) were introduced by Pipatsrisawat and Darwiche (2008) as a restriction of decomposable negation normal forms (DNNF). Both DNNFs and SDNNFs were studied in the area of knowledge compilation. In this paper we show that BCT constraints are polynomially equivalent to SDNNFs. In particular, a BCT constraint can be represented with an SDNNF of polynomial size and, on the other hand, a constraint that can be represented with an SDNNF can be represented as a BCT constraint of polynomial size. This generalizes the result of Wang and Yap (2022) that shows that a multivalued decision diagram (MDD) can be represented with a BCT. Moreover, our result provides a full characterization of binary constraint trees using a language that is well studied in the area of knowledge compilation. It was shown by Wang and Yap (2023) that a CSP on n variables of domain sizes bounded by d that has treewidth k can be encoded as a BCT on O(n) variables with domain sizes O(d^{k+1}). We provide an alternative reduction for the case of binary CSPs. This allows us to compile any binary CSP to an SDNNF of size that is parameterized by d and k.

Cite as

Petr Kučera. Binary Constraint Trees and Structured Decomposability. In 29th International Conference on Principles and Practice of Constraint Programming (CP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 280, pp. 22:1-22:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{kucera:LIPIcs.CP.2023.22,
  author =	{Ku\v{c}era, Petr},
  title =	{{Binary Constraint Trees and Structured Decomposability}},
  booktitle =	{29th International Conference on Principles and Practice of Constraint Programming (CP 2023)},
  pages =	{22:1--22:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-300-3},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{280},
  editor =	{Yap, Roland H. C.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2023.22},
  URN =		{urn:nbn:de:0030-drops-190595},
  doi =		{10.4230/LIPIcs.CP.2023.22},
  annote =	{Keywords: Binary CSP, Binary Constraint Tree, Structured Decomposability, Structured DNNF, Polynomial Equivalence}
}
Document
A Parameterized Algorithm for Vertex Connectivity Survivable Network Design Problem with Uniform Demands

Authors: Jørgen Bang-Jensen, Kristine Vitting Klinkby, Pranabendu Misra, and Saket Saurabh

Published in: LIPIcs, Volume 274, 31st Annual European Symposium on Algorithms (ESA 2023)


Abstract
In the Vertex Connectivity Survivable Network Design (VC-SNDP) problem, the input is a graph G and a function d: V(G) × V(G) → ℕ that encodes the vertex-connectivity demands between pairs of vertices. The objective is to find the smallest subgraph H of G that satisfies all these demands. It is a well-studied NP-complete problem that generalizes several network design problems. We consider the case of uniform demands, where for every vertex pair (u,v) the connectivity demand d(u,v) is a fixed integer κ. It is an important problem with wide applications. We study this problem in the realm of Parameterized Complexity. In this setting, in addition to G and d we are given an integer 𝓁 as the parameter and the objective is to determine if we can remove at least 𝓁 edges from G without violating any connectivity constraints. This was posed as an open problem by Bang-Jensen et al. [SODA 2018], who studied the edge-connectivity variant of the problem under the same settings. Using a powerful classification result of Lokshtanov et al. [ICALP 2018], Gutin et al. [JCSS 2019] recently showed that this problem admits a (non-uniform) FPT algorithm with unspecified running time. Further, they also gave a (uniform) FPT algorithm for the case of κ = 2. In this paper we present a (uniform) FPT algorithm for any κ that runs in time 2^{O(κ² 𝓁⁴ log 𝓁)} ⋅ |V(G)|^{O(1)}. Our algorithm is built upon new insights on vertex connectivity in graphs. Our main conceptual contribution is a novel graph decomposition called the Wheel decomposition. Informally, it is a partition of the edge set of a graph G, E(G) = X₁ ∪ X₂ … ∪ X_r, with the parts arranged in a cyclic order, such that each vertex v ∈ V(G) either has edges in at most two consecutive parts, or has edges in every part of this partition. The first kind of vertices can be thought of as the rim of the wheel, while the second kind form the hub. Additionally, the vertex cuts induced by these edge-sets in G have highly symmetric properties. Our main technical result, informally speaking, establishes that "nearly edge-minimal" κ-vertex-connected graphs admit a wheel decomposition - a fact that can be exploited for designing algorithms. We believe that this decomposition is of independent interest and it could be a useful tool in resolving other open problems.
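
The parameterized question itself is easy to state in code; the brute-force sketch below (mine, not the paper's FPT algorithm) uses networkx to check whether 𝓁 edges can be deleted while keeping the graph κ-vertex-connected.

import itertools
import networkx as nx

def can_delete_edges(G: nx.Graph, kappa: int, l: int) -> bool:
    """Brute force over all l-subsets of edges; exponential, for illustration only."""
    for E in itertools.combinations(G.edges, l):
        H = G.copy()
        H.remove_edges_from(E)
        if nx.node_connectivity(H) >= kappa:
            return True
    return False

# Example: K5 is 4-vertex-connected and stays 3-vertex-connected after deleting 2 disjoint edges.
print(can_delete_edges(nx.complete_graph(5), kappa=3, l=2))  # True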

Cite as

Jørgen Bang-Jensen, Kristine Vitting Klinkby, Pranabendu Misra, and Saket Saurabh. A Parameterized Algorithm for Vertex Connectivity Survivable Network Design Problem with Uniform Demands. In 31st Annual European Symposium on Algorithms (ESA 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 274, pp. 13:1-13:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{bangjensen_et_al:LIPIcs.ESA.2023.13,
  author =	{Bang-Jensen, J{\o}rgen and Klinkby, Kristine Vitting and Misra, Pranabendu and Saurabh, Saket},
  title =	{{A Parameterized Algorithm for Vertex Connectivity Survivable Network Design Problem with Uniform Demands}},
  booktitle =	{31st Annual European Symposium on Algorithms (ESA 2023)},
  pages =	{13:1--13:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-295-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{274},
  editor =	{G{\o}rtz, Inge Li and Farach-Colton, Martin and Puglisi, Simon J. and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2023.13},
  URN =		{urn:nbn:de:0030-drops-186663},
  doi =		{10.4230/LIPIcs.ESA.2023.13},
  annote =	{Keywords: Parameterized Complexity, Vertex Connectivity, Network Design}
}
Document
Finding Maximal Exact Matches in Graphs

Authors: Nicola Rizzo, Manuel Cáceres, and Veli Mäkinen

Published in: LIPIcs, Volume 273, 23rd International Workshop on Algorithms in Bioinformatics (WABI 2023)


Abstract
We study the problem of finding maximal exact matches (MEMs) between a query string Q and a labeled graph G. MEMs are an important class of seeds, often used in seed-chain-extend type of practical alignment methods because of their strong connections to classical metrics. A principled way to speed up chaining is to limit the number of MEMs by considering only MEMs of length at least κ (κ-MEMs). However, on arbitrary input graphs, the problem of finding MEMs cannot be solved in truly sub-quadratic time under SETH (Equi et al., ICALP 2019) even on acyclic graphs. In this paper we show an O(n⋅ L ⋅ d^{L-1} + m + M_{κ,L})-time algorithm finding all κ-MEMs between Q and G spanning exactly L nodes in G, where n is the total length of node labels, d is the maximum degree of a node in G, m = |Q|, and M_{κ,L} is the number of output MEMs. We use this algorithm to develop a κ-MEM finding solution on indexable Elastic Founder Graphs (Equi et al., Algorithmica 2022) running in time O(nH² + m + M_κ), where H is the maximum number of nodes in a block, and M_κ is the total number of κ-MEMs. Our results generalize to the analysis of multiple query strings (MEMs between G and any of the strings). Additionally, we provide some preliminary experimental results showing that the number of graph MEMs is an order of magnitude smaller than the number of string MEMs of the corresponding concatenated collection.
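
As a baseline for the definitions, the following naive sketch (mine, not the paper's index-based algorithm) enumerates κ-MEMs between a query Q and a plain string T; the paper's setting generalizes the text T to paths of node labels in a graph.

def string_mems(Q: str, T: str, kappa: int):
    """Return (query_pos, text_pos, length) triples for maximal exact matches of length >= kappa."""
    mems = []
    for i in range(len(Q)):
        for j in range(len(T)):
            if Q[i] != T[j]:
                continue
            if i > 0 and j > 0 and Q[i - 1] == T[j - 1]:
                continue                      # extendable to the left, hence not maximal
            length = 0
            while i + length < len(Q) and j + length < len(T) and Q[i + length] == T[j + length]:
                length += 1                   # extend to the right as far as possible
            if length >= kappa:
                mems.append((i, j, length))
    return mems

print(string_mems("GATTACA", "TACATTAG", kappa=3))  # [(1, 3, 4), (3, 0, 4)]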

Cite as

Nicola Rizzo, Manuel Cáceres, and Veli Mäkinen. Finding Maximal Exact Matches in Graphs. In 23rd International Workshop on Algorithms in Bioinformatics (WABI 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 273, pp. 10:1-10:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{rizzo_et_al:LIPIcs.WABI.2023.10,
  author =	{Rizzo, Nicola and C\'{a}ceres, Manuel and M\"{a}kinen, Veli},
  title =	{{Finding Maximal Exact Matches in Graphs}},
  booktitle =	{23rd International Workshop on Algorithms in Bioinformatics (WABI 2023)},
  pages =	{10:1--10:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-294-5},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{273},
  editor =	{Belazzougui, Djamal and Ouangraoua, A\"{i}da},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.WABI.2023.10},
  URN =		{urn:nbn:de:0030-drops-186364},
  doi =		{10.4230/LIPIcs.WABI.2023.10},
  annote =	{Keywords: Sequence to graph alignment, bidirectional BWT, r-index, suffix tree, founder graphs}
}
Document
On Property Testing of the Binary Rank

Authors: Nader H. Bshouty

Published in: LIPIcs, Volume 272, 48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023)


Abstract
Let M be an n × m (0,1)-matrix. We define the s-binary rank, denoted as br_s(M), of M as the minimum integer d such that there exist d monochromatic rectangles covering all the 1-entries in the matrix, with each 1-entry being covered by at most s rectangles. When s = 1, this corresponds to the binary rank, denoted as br(M), which is well-known in the literature and has many applications. Let R(M) and C(M) denote the sets of rows and columns of M, respectively. Using the result of Sgall [Jiří Sgall, 1999], we establish that if M has an s-binary rank at most d, then |R(M)| ⋅ |C(M)| ≤ binom(d, ≤ s)2^d, where binom(d, ≤ s) = ∑_{i=0}^s binom(d,i). This bound is tight, meaning that there exists a matrix M' with an s-binary rank of d, for which |R(M')| ⋅ |C(M')| = binom(d, ≤ s)2^d. Using this result, we present novel one-sided adaptive and non-adaptive testers for (0,1)-matrices with an s-binary rank at most d (and exactly d). These testers require Õ(binom(d, ≤ s)2^d/ε) and Õ(binom(d, ≤ s)2^d/ε²) queries, respectively. For a fixed s, this improves upon the query complexity of the tester proposed by Parnas et al. in [Michal Parnas et al., 2021] by a factor of Θ(2^d).
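
The combinatorial bound above is easy to evaluate numerically; the short sketch below (mine) computes binom(d, ≤ s)·2^d, the maximum possible value of |R(M)| ⋅ |C(M)| for a matrix of s-binary rank at most d.

from math import comb

def row_col_bound(d: int, s: int) -> int:
    """binom(d, <= s) * 2^d, with binom(d, <= s) = sum_{i=0}^{s} C(d, i)."""
    return sum(comb(d, i) for i in range(s + 1)) * 2 ** d

for d in (3, 5, 8):
    print(d, row_col_bound(d, 1), row_col_bound(d, 2))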

Cite as

Nader H. Bshouty. On Property Testing of the Binary Rank. In 48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 272, pp. 27:1-27:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{bshouty:LIPIcs.MFCS.2023.27,
  author =	{Bshouty, Nader H.},
  title =	{{On Property Testing of the Binary Rank}},
  booktitle =	{48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023)},
  pages =	{27:1--27:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-292-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{272},
  editor =	{Leroux, J\'{e}r\^{o}me and Lombardy, Sylvain and Peleg, David},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2023.27},
  URN =		{urn:nbn:de:0030-drops-185616},
  doi =		{10.4230/LIPIcs.MFCS.2023.27},
  annote =	{Keywords: Property testing, binary rank, Boolean rank}
}
Document
Sparse Euclidean Spanners with Optimal Diameter: A General and Robust Lower Bound via a Concave Inverse-Ackermann Function

Authors: Hung Le, Lazar Milenković, and Shay Solomon

Published in: LIPIcs, Volume 258, 39th International Symposium on Computational Geometry (SoCG 2023)


Abstract
In STOC'95 [S. Arya et al., 1995] Arya et al. showed that any set of n points in ℝ^d admits a (1+ε)-spanner with hop-diameter at most 2 (respectively, 3) and O(n log n) edges (resp., O(n log log n) edges). They also gave a general upper bound tradeoff of hop-diameter k with O(n α_k(n)) edges, for any k ≥ 2. The function α_k is the inverse of a certain Ackermann-style function, where α₀(n) = ⌈n/2⌉, α₁(n) = ⌈√n⌉, α₂(n) = ⌈log n⌉, α₃(n) = ⌈log log n⌉, α₄(n) = log^* n, α₅(n) = ⌊ 1/2 log^*n ⌋, …. Roughly speaking, for k ≥ 2 the function α_k is close to the ⌊(k-2)/2⌋-iterated log-star function, i.e., log with ⌊(k-2)/2⌋ stars. Despite a large body of work on spanners of bounded hop-diameter, the fundamental question of whether this tradeoff between size and hop-diameter of Euclidean (1+ε)-spanners is optimal has remained open, even in one-dimensional spaces. Three lower bound tradeoffs are known:
- An optimal k versus Ω(n α_k(n)) by Alon and Schieber [N. Alon and B. Schieber, 1987], but it applies to stretch 1 (not 1+ε).
- A suboptimal k versus Ω(nα_{2k+6}(n)) by Chan and Gupta [H. T.-H. Chan and A. Gupta, 2006].
- A suboptimal k versus Ω(n/(2^{6⌊k/2⌋}) α_k(n)) by Le et al. [Le et al., 2022].
This paper establishes the optimal k versus Ω(n α_k(n)) lower bound tradeoff for stretch 1+ε, for any ε > 0, and for any k. An important conceptual contribution of this work is in achieving optimality by shaving off an extremely slowly growing term, namely 2^{6⌊k/2⌋} for k ≤ O(α(n)); such a fine-grained optimization (that achieves optimality) is very rare in the literature. To shave off the 2^{6⌊k/2⌋} term from the previous bound of Le et al., our argument has to drill much deeper. In particular, we propose a new way of analyzing recurrences that involve inverse-Ackermann style functions, and our key technical contribution is in presenting the first explicit construction of concave versions of these functions. An important advantage of our approach over previous ones is its robustness: While all previous lower bounds are applicable only to restricted 1-dimensional point sets, ours applies even to random point sets in constant-dimensional spaces.
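
To get a feel for how quickly this hierarchy collapses, here is a tiny numeric sketch (mine, assuming base-2 logarithms) of the first explicitly listed levels α₀ through α₄.

import math

def log_star(n: float) -> int:
    """Number of times log2 must be applied before the value drops to at most 1."""
    s = 0
    while n > 1:
        n = math.log2(n)
        s += 1
    return s

def alpha(k: int, n: int) -> int:
    if k == 0:
        return math.ceil(n / 2)
    if k == 1:
        return math.ceil(math.sqrt(n))
    if k == 2:
        return math.ceil(math.log2(n))
    if k == 3:
        return math.ceil(math.log2(math.log2(n)))
    if k == 4:
        return log_star(n)
    raise ValueError("only the explicitly listed levels are implemented here")

n = 1 << 20
print([alpha(k, n) for k in range(5)])  # [524288, 1024, 20, 5, 5]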

Cite as

Hung Le, Lazar Milenković, and Shay Solomon. Sparse Euclidean Spanners with Optimal Diameter: A General and Robust Lower Bound via a Concave Inverse-Ackermann Function. In 39th International Symposium on Computational Geometry (SoCG 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 258, pp. 47:1-47:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{le_et_al:LIPIcs.SoCG.2023.47,
  author =	{Le, Hung and Milenkovi\'{c}, Lazar and Solomon, Shay},
  title =	{{Sparse Euclidean Spanners with Optimal Diameter: A General and Robust Lower Bound via a Concave Inverse-Ackermann Function}},
  booktitle =	{39th International Symposium on Computational Geometry (SoCG 2023)},
  pages =	{47:1--47:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-273-0},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{258},
  editor =	{Chambers, Erin W. and Gudmundsson, Joachim},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2023.47},
  URN =		{urn:nbn:de:0030-drops-178976},
  doi =		{10.4230/LIPIcs.SoCG.2023.47},
  annote =	{Keywords: Euclidean spanners, Ackermann functions, convex functions, hop-diameter}
}
Document
Refactoring = Substitution + Rewriting: Towards Generic, Language-Independent Refactorings

Authors: Simon Thompson and Dániel Horpácsi

Published in: OASIcs, Volume 109, Eelco Visser Commemorative Symposium (EVCS 2023)


Abstract
Eelco Visser’s work has always encouraged stepping back from the particular to look at the underlying, conceptual problems. In that spirit we present an approach to describing refactorings that abstracts away from particular refactorings to classes of similar transformations, and presents an implementation of these that works by substitution and subsequent rewriting. Substitution is language-independent under this approach, while the rewrites embody language-specific aspects. Intriguingly, it also goes back to work on API migration by Huiqing Li and the first author, and sets refactoring in that general context.

Cite as

Simon Thompson and Dániel Horpácsi. Refactoring = Substitution + Rewriting: Towards Generic, Language-Independent Refactorings. In Eelco Visser Commemorative Symposium (EVCS 2023). Open Access Series in Informatics (OASIcs), Volume 109, pp. 26:1-26:9, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{thompson_et_al:OASIcs.EVCS.2023.26,
  author =	{Thompson, Simon and Horp\'{a}csi, D\'{a}niel},
  title =	{{Refactoring = Substitution + Rewriting: Towards Generic, Language-Independent Refactorings}},
  booktitle =	{Eelco Visser Commemorative Symposium (EVCS 2023)},
  pages =	{26:1--26:9},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-267-9},
  ISSN =	{2190-6807},
  year =	{2023},
  volume =	{109},
  editor =	{L\"{a}mmel, Ralf and Mosses, Peter D. and Steimann, Friedrich},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.EVCS.2023.26},
  URN =		{urn:nbn:de:0030-drops-177961},
  doi =		{10.4230/OASIcs.EVCS.2023.26},
  annote =	{Keywords: refactoring, generic, language independent, rewriting, substitution, API upgrade}
}
Document
Approximating Highly Inapproximable Problems on Graphs of Bounded Twin-Width

Authors: Pierre Bergé, Édouard Bonnet, Hugues Déprés, and Rémi Watrigant

Published in: LIPIcs, Volume 254, 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)


Abstract
For any ε > 0, we give a polynomial-time n^ε-approximation algorithm for Max Independent Set in graphs of bounded twin-width given with an O(1)-sequence. This result is derived from the following time-approximation trade-off: We establish an O(1)^{2^q-1}-approximation algorithm running in time exp(O_q(n^{2^{-q}})), for every integer q ⩾ 0. Guided by the same framework, we obtain similar approximation algorithms for Min Coloring and Max Induced Matching. In general graphs, all these problems are known to be highly inapproximable: for any ε > 0, a polynomial-time n^{1-ε}-approximation for any of them would imply that P=NP [Håstad, FOCS '96; Zuckerman, ToC '07; Chalermsook et al., SODA '13]. We generalize the algorithms for Max Independent Set and Max Induced Matching to the independent (induced) packing of any fixed connected graph H. In contrast, we show that such approximation guarantees on graphs of bounded twin-width given with an O(1)-sequence are very unlikely for Min Independent Dominating Set, and somewhat unlikely for Longest Path and Longest Induced Path. Regarding the existence of better approximation algorithms, there is (very) light evidence that the obtained approximation factor of n^ε for Max Independent Set may be best possible. This is the first in-depth study of the approximability of problems in graphs of bounded twin-width. Prior to this paper, essentially the only such result was a polynomial-time O(1)-approximation algorithm for Min Dominating Set [Bonnet et al., ICALP '21].

Cite as

Pierre Bergé, Édouard Bonnet, Hugues Déprés, and Rémi Watrigant. Approximating Highly Inapproximable Problems on Graphs of Bounded Twin-Width. In 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 254, pp. 10:1-10:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{berge_et_al:LIPIcs.STACS.2023.10,
  author =	{Berg\'{e}, Pierre and Bonnet, \'{E}douard and D\'{e}pr\'{e}s, Hugues and Watrigant, R\'{e}mi},
  title =	{{Approximating Highly Inapproximable Problems on Graphs of Bounded Twin-Width}},
  booktitle =	{40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)},
  pages =	{10:1--10:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-266-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{254},
  editor =	{Berenbrink, Petra and Bouyer, Patricia and Dawar, Anuj and Kant\'{e}, Mamadou Moustapha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2023.10},
  URN =		{urn:nbn:de:0030-drops-176629},
  doi =		{10.4230/LIPIcs.STACS.2023.10},
  annote =	{Keywords: Approximation algorithms, bounded twin-width}
}
Document
On the Multilinear Complexity of Associative Algebras

Authors: Markus Bläser, Hendrik Mayer, and Devansh Shringi

Published in: LIPIcs, Volume 254, 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)


Abstract
Christandl and Zuiddam [Matthias Christandl and Jeroen Zuiddam, 2019] study the multilinear complexity of d-fold matrix multiplication in the context of quantum communication complexity. Bshouty [Nader H. Bshouty, 2013] investigates the multilinear complexity of d-fold multiplication in commutative algebras to understand the size of so-called testers. The study of bilinear complexity is a classical topic in algebraic complexity theory, starting with the work by Strassen. However, there has been no systematic study of the multilinear complexity of multilinear maps. In the present work, we systematically investigate the multilinear complexity of d-fold multiplication in arbitrary associative algebras. We prove a multilinear generalization of the famous Alder-Strassen theorem, which is a lower bound for the bilinear complexity of the (2-fold) multiplication in an associative algebra. We show that the multilinear complexity of the d-fold multiplication has a lower bound of d ⋅ dim A - (d-1)t, where t is the number of maximal two-sided ideals in A. This is optimal in the sense that there are algebras for which this lower bound is tight. Furthermore, we prove the following dichotomy, showing that the quotient algebra A/rad A determines the complexity of the d-fold multiplication in A: when the semisimple algebra A/rad A is commutative, the multilinear complexity of the d-fold multiplication in A is polynomial in d. On the other hand, when A/rad A is noncommutative, the multilinear complexity of the d-fold multiplication in A is exponential in d.

Cite as

Markus Bläser, Hendrik Mayer, and Devansh Shringi. On the Multilinear Complexity of Associative Algebras. In 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 254, pp. 12:1-12:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{blaser_et_al:LIPIcs.STACS.2023.12,
  author =	{Bl\"{a}ser, Markus and Mayer, Hendrik and Shringi, Devansh},
  title =	{{On the Multilinear Complexity of Associative Algebras}},
  booktitle =	{40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)},
  pages =	{12:1--12:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-266-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{254},
  editor =	{Berenbrink, Petra and Bouyer, Patricia and Dawar, Anuj and Kant\'{e}, Mamadou Moustapha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2023.12},
  URN =		{urn:nbn:de:0030-drops-176645},
  doi =		{10.4230/LIPIcs.STACS.2023.12},
  annote =	{Keywords: Multilinear computations, associative algebras, matrix multiplication, Alder-Strassen theorem}
}
Document
Barriers for Faster Dimensionality Reduction

Authors: Ora Nova Fandina, Mikael Møller Høgsgaard, and Kasper Green Larsen

Published in: LIPIcs, Volume 254, 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)


Abstract
The Johnson-Lindenstrauss transform allows one to embed a dataset of n points in ℝ^d into ℝ^m, while preserving the pairwise distance between any pair of points up to a factor (1 ± ε), provided that m = Ω(ε^{-2} lg n). The transform has found an overwhelming number of algorithmic applications, allowing one to speed up algorithms and reduce memory consumption at the price of a small loss in accuracy. A central line of research on such transforms focuses on developing fast embedding algorithms, with the classic example being the Fast JL transform by Ailon and Chazelle. All known such algorithms have an embedding time of Ω(d lg d), but no lower bounds rule out a clean O(d) embedding time. In this work, we establish the first non-trivial lower bounds (of magnitude Ω(m lg m)) for a large class of embedding algorithms, including in particular most known upper bounds.
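
For reference, this is the slow dense baseline that the fast-transform literature (and the lower bound above) aims to beat: a minimal Gaussian Johnson-Lindenstrauss sketch in numpy, with m chosen as a constant times ε⁻² log n (the constant 8 is my own illustrative choice, not from the paper).

import numpy as np

def jl_embed(X: np.ndarray, eps: float, seed: int = 0) -> np.ndarray:
    """Embed the rows of X (shape n x d) into m = O(eps^-2 log n) dimensions."""
    n, d = X.shape
    m = int(np.ceil(8 * np.log(n) / eps ** 2))
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((d, m)) / np.sqrt(m)   # scaled i.i.d. Gaussian projection
    return X @ A                                   # dense embedding: Theta(d * m) time per point, not O(d)

X = np.random.default_rng(1).standard_normal((100, 2000))
Y = jl_embed(X, eps=0.25)
print(X.shape, "->", Y.shape)   # (100, 2000) -> (100, 590)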

Cite as

Ora Nova Fandina, Mikael Møller Høgsgaard, and Kasper Green Larsen. Barriers for Faster Dimensionality Reduction. In 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 254, pp. 31:1-31:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{novafandina_et_al:LIPIcs.STACS.2023.31,
  author =	{Nova Fandina, Ora and M{\o}ller H{\o}gsgaard, Mikael and Green Larsen, Kasper},
  title =	{{Barriers for Faster Dimensionality Reduction}},
  booktitle =	{40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)},
  pages =	{31:1--31:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-266-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{254},
  editor =	{Berenbrink, Petra and Bouyer, Patricia and Dawar, Anuj and Kant\'{e}, Mamadou Moustapha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2023.31},
  URN =		{urn:nbn:de:0030-drops-176838},
  doi =		{10.4230/LIPIcs.STACS.2023.31},
  annote =	{Keywords: Dimensional reduction, Lower bound, Linear Circuits}
}