106 Search Results for "Zehavi, Meirav"


Volume

LIPIcs, Volume 214

16th International Symposium on Parameterized and Exact Computation (IPEC 2021)

IPEC 2021, September 8-10, 2021, Lisbon, Portugal

Editors: Petr A. Golovach and Meirav Zehavi

Document
Baby PIH: Parameterized Inapproximability of Min CSP

Authors: Venkatesan Guruswami, Xuandi Ren, and Sai Sandeep

Published in: LIPIcs, Volume 300, 39th Computational Complexity Conference (CCC 2024)


Abstract
The Parameterized Inapproximability Hypothesis (PIH) is the analog of the PCP theorem in the world of parameterized complexity. It asserts that no FPT algorithm can distinguish a satisfiable 2CSP instance from one which is only (1-ε)-satisfiable (where the parameter is the number of variables) for some constant 0 < ε < 1. We consider a minimization version of CSPs (Min-CSP), where one may assign r values to each variable, and the goal is to ensure that every constraint is satisfied by some choice among the r × r pairs of values assigned to its variables (call such a CSP instance r-list-satisfiable). We prove the following strong parameterized inapproximability for Min CSP: For every r ≥ 1, it is W[1]-hard to tell if a 2CSP instance is satisfiable or is not even r-list-satisfiable. We refer to this statement as "Baby PIH", following the recently proved Baby PCP Theorem (Barto and Kozik, 2021). Our proof adapts the combinatorial arguments underlying the Baby PCP theorem, overcoming some basic obstacles that arise in the parameterized setting. Furthermore, our reduction runs in time polynomially bounded in both the number of variables and the alphabet size, and thus implies the Baby PCP theorem as well.
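
To make the r-list-satisfiability notion above concrete, here is a minimal Python sketch (the instance encoding - variables, binary constraints given as sets of allowed value pairs - is an illustrative assumption, not taken from the paper) that checks whether an assignment of at most r values per variable list-satisfies a 2CSP instance.

from itertools import product

def is_r_list_satisfied(constraints, lists, r):
    """constraints: dict mapping a variable pair (u, v) to the set of allowed value pairs.
    lists: dict mapping each variable to the list of values assigned to it.
    Returns True iff every variable gets at most r values and every constraint is
    satisfied by some pair of values drawn from the lists of its two variables."""
    if any(len(vals) > r for vals in lists.values()):
        return False
    for (u, v), allowed in constraints.items():
        if not any(pair in allowed for pair in product(lists[u], lists[v])):
            return False
    return True

# Toy instance: a single "not equal" constraint between x and y over {0, 1}.
constraints = {("x", "y"): {(0, 1), (1, 0)}}
print(is_r_list_satisfied(constraints, {"x": [0], "y": [1]}, r=1))        # True (satisfiable)
print(is_r_list_satisfied(constraints, {"x": [0], "y": [0]}, r=1))        # False
print(is_r_list_satisfied(constraints, {"x": [0, 1], "y": [0, 1]}, r=2))  # True (2-list-satisfiable)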

Cite as

Venkatesan Guruswami, Xuandi Ren, and Sai Sandeep. Baby PIH: Parameterized Inapproximability of Min CSP. In 39th Computational Complexity Conference (CCC 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 300, pp. 27:1-27:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{guruswami_et_al:LIPIcs.CCC.2024.27,
  author =	{Guruswami, Venkatesan and Ren, Xuandi and Sandeep, Sai},
  title =	{{Baby PIH: Parameterized Inapproximability of Min CSP}},
  booktitle =	{39th Computational Complexity Conference (CCC 2024)},
  pages =	{27:1--27:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-331-7},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{300},
  editor =	{Santhanam, Rahul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2024.27},
  URN =		{urn:nbn:de:0030-drops-204237},
  doi =		{10.4230/LIPIcs.CCC.2024.27},
  annote =	{Keywords: Parameterized Inapproximability Hypothesis, Constraint Satisfaction Problems}
}
Document
Determining Fixed-Length Paths in Directed and Undirected Edge-Weighted Graphs

Authors: Daniel Hambly, Rhyd Lewis, and Padraig Corcoran

Published in: LIPIcs, Volume 301, 22nd International Symposium on Experimental Algorithms (SEA 2024)


Abstract
In this paper, we examine the NP-hard problem of identifying fixed-length s-t paths in edge-weighted graphs - that is, paths of a desired length k from a source vertex s to a target vertex t. Many existing strategies consider paths whose length is determined by the number of edges in the path; we, however, take the length of a path to be the sum of its edge weights. Here, three exact algorithms for this problem are proposed: the first based on an integer programming (IP) formulation, the second a backtracking algorithm, and the third based on an extension of Yen’s algorithm. Analysis of these algorithms on random graphs shows that the backtracking algorithm performs best for smaller values of k, whilst the IP is preferable for larger values of k.
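
The backtracking idea can be illustrated with a short Python sketch; the graph encoding and the pruning rule (abandon a branch once the accumulated weight exceeds k, valid for non-negative weights) are illustrative assumptions, not the authors' implementation.

def exact_weight_path(adj, s, t, k):
    """Search for a simple s-t path whose edge weights sum to exactly k.
    adj: dict mapping a vertex to a list of (neighbour, weight) pairs with
    non-negative weights. Returns one such path as a vertex list, or None."""
    path, visited = [s], {s}

    def backtrack(v, remaining):
        if v == t:
            return list(path) if remaining == 0 else None
        for u, w in adj[v]:
            if u in visited or w > remaining:   # prune: weights are non-negative
                continue
            visited.add(u)
            path.append(u)
            found = backtrack(u, remaining - w)
            if found is not None:
                return found
            path.pop()
            visited.remove(u)
        return None

    return backtrack(s, k)

# Example: a path of total weight 5 from "a" to "d".
adj = {"a": [("b", 2), ("c", 4)], "b": [("d", 3)], "c": [("d", 1)], "d": []}
print(exact_weight_path(adj, "a", "d", 5))  # ['a', 'b', 'd']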

Cite as

Daniel Hambly, Rhyd Lewis, and Padraig Corcoran. Determining Fixed-Length Paths in Directed and Undirected Edge-Weighted Graphs. In 22nd International Symposium on Experimental Algorithms (SEA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 301, pp. 15:1-15:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{hambly_et_al:LIPIcs.SEA.2024.15,
  author =	{Hambly, Daniel and Lewis, Rhyd and Corcoran, Padraig},
  title =	{{Determining Fixed-Length Paths in Directed and Undirected Edge-Weighted Graphs}},
  booktitle =	{22nd International Symposium on Experimental Algorithms (SEA 2024)},
  pages =	{15:1--15:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-325-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{301},
  editor =	{Liberti, Leo},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SEA.2024.15},
  URN =		{urn:nbn:de:0030-drops-203805},
  doi =		{10.4230/LIPIcs.SEA.2024.15},
  annote =	{Keywords: Graphs, paths, backtracking, integer programming, Yen’s algorithm}
}
Document
Track A: Algorithms, Complexity and Games
Computing Tree Decompositions with Small Independence Number

Authors: Clément Dallard, Fedor V. Fomin, Petr A. Golovach, Tuukka Korhonen, and Martin Milanič

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
The independence number of a tree decomposition is the maximum of the independence numbers of the subgraphs induced by its bags. The tree-independence number of a graph is the minimum independence number of a tree decomposition of it. Several NP-hard graph problems, like maximum weight independent set, can be solved in time n^𝒪(k) if the input n-vertex graph is given together with a tree decomposition of independence number k. Yolov [SODA 2018] gave an algorithm that, given an n-vertex graph G and an integer k, in time n^𝒪(k³) either constructs a tree decomposition of G whose independence number is 𝒪(k³) or correctly reports that the tree-independence number of G is larger than k. In this paper, we first give an algorithm for computing the tree-independence number with a better approximation ratio and running time, and then prove that our algorithm is, in some sense, the best one can hope for. More precisely, our algorithm runs in time 2^𝒪(k²) n^𝒪(k) and either outputs a tree decomposition of G with independence number at most 8k, or determines that the tree-independence number of G is larger than k. This implies 2^𝒪(k²) n^𝒪(k)-time algorithms for various problems, like maximum weight independent set, parameterized by the tree-independence number k without needing the decomposition as input. Assuming Gap-ETH, an n^Ω(k) factor in the running time is unavoidable for any approximation algorithm for the tree-independence number. Our second result is that the exact computation of the tree-independence number is para-NP-hard: we show that, for every constant k ≥ 4, it is NP-hard to decide whether a given graph has tree-independence number at most k.
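
For concreteness, the following brute-force Python sketch (the graph and decomposition encodings are assumptions chosen for illustration) computes the independence number of a given tree decomposition as the maximum, over its bags, of the independence number of the subgraph induced by that bag; it only pins down the measure and does not reflect the algorithms of the paper.

from itertools import combinations

def independence_number(vertices, edges):
    """Brute-force maximum independent set size of the subgraph induced by `vertices`."""
    vs = list(vertices)
    edge_set = {frozenset(e) for e in edges if set(e) <= set(vertices)}
    for size in range(len(vs), -1, -1):          # largest candidate size first
        for cand in combinations(vs, size):
            if all(frozenset(p) not in edge_set for p in combinations(cand, 2)):
                return size
    return 0

def decomposition_independence_number(bags, edges):
    """Independence number of a tree decomposition: maximum over its bags."""
    return max(independence_number(bag, edges) for bag in bags)

# Path a-b-c-d with bags {a,b,c} and {b,c,d}: both bags have independence number 2.
edges = [("a", "b"), ("b", "c"), ("c", "d")]
print(decomposition_independence_number([{"a", "b", "c"}, {"b", "c", "d"}], edges))  # 2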

Cite as

Clément Dallard, Fedor V. Fomin, Petr A. Golovach, Tuukka Korhonen, and Martin Milanič. Computing Tree Decompositions with Small Independence Number. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 51:1-51:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{dallard_et_al:LIPIcs.ICALP.2024.51,
  author =	{Dallard, Cl\'{e}ment and Fomin, Fedor V. and Golovach, Petr A. and Korhonen, Tuukka and Milani\v{c}, Martin},
  title =	{{Computing Tree Decompositions with Small Independence Number}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{51:1--51:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.51},
  URN =		{urn:nbn:de:0030-drops-201945},
  doi =		{10.4230/LIPIcs.ICALP.2024.51},
  annote =	{Keywords: tree-independence number, approximation, parameterized algorithms}
}
Document
Track A: Algorithms, Complexity and Games
Nearly Optimal Independence Oracle Algorithms for Edge Estimation in Hypergraphs

Authors: Holger Dell, John Lapinskas, and Kitty Meeks

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
Consider a query model of computation in which an n-vertex k-hypergraph can be accessed only via its independence oracle or via its colourful independence oracle, and each oracle query may incur a cost depending on the size of the query. Several recent results (Dell and Lapinskas, STOC 2018; Dell, Lapinskas, and Meeks, SODA 2020) give efficient algorithms to approximately count the hypergraph’s edges in the colourful setting. These algorithms immediately imply fine-grained reductions from approximate counting to decision, with overhead only log^Θ(k) n over the running time n^α of the original decision algorithm, for many well-studied problems including k-Orthogonal Vectors, k-SUM, subgraph isomorphism problems including k-Clique and colourful-H, graph motifs, and k-variable first-order model checking. We explore the limits of what is achievable in this setting, obtaining unconditional lower bounds on the oracle cost of algorithms to approximately count the hypergraph’s edges in both the colourful and uncoloured settings. In both settings, we also obtain algorithms which essentially match these lower bounds; in the colourful setting, this requires significant changes to the algorithm of Dell, Lapinskas, and Meeks (SODA 2020) and reduces the total overhead to log^{Θ(k-α)}n. Our lower bound for the uncoloured setting shows that there is no fine-grained reduction from approximate counting to the corresponding uncoloured decision problem (except in the case α ≥ k-1): without an algorithm for the colourful decision problem, we cannot hope to avoid the much larger overhead of roughly n^{(k-α)²/4}. The uncoloured setting has previously been studied for the special case k = 2 (Peled, Ramamoorthy, Rashtchian, Sinha, ITCS 2018; Chen, Levi, and Waingarten, SODA 2020), and our work generalises the existing algorithms and lower bounds for this special case to k > 2 and to oracles with cost.
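
As a reference model of the query interface (the hypergraph encoding is an illustrative assumption; the algorithms above treat the oracles as black boxes and only count queries), here is a naive Python implementation of both oracle types for a k-hypergraph.

def independence_oracle(hyperedges, S):
    """Uncoloured oracle: is S independent, i.e. does S contain no hyperedge?"""
    S = set(S)
    return not any(set(e) <= S for e in hyperedges)

def colourful_independence_oracle(hyperedges, classes):
    """Colourful oracle: given k disjoint vertex classes, report True iff there is
    no hyperedge with exactly one vertex in each class."""
    classes = [set(c) for c in classes]
    for e in hyperedges:
        if len(e) == len(classes) and all(len(set(e) & c) == 1 for c in classes):
            return False
    return True

# 3-uniform example with a single hyperedge {1, 2, 3}.
H = [{1, 2, 3}]
print(independence_oracle(H, {1, 2}))                        # True
print(independence_oracle(H, {1, 2, 3, 4}))                  # False
print(colourful_independence_oracle(H, [{1}, {2, 4}, {3}]))  # False
print(colourful_independence_oracle(H, [{1}, {4}, {3}]))     # True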

Cite as

Holger Dell, John Lapinskas, and Kitty Meeks. Nearly Optimal Independence Oracle Algorithms for Edge Estimation in Hypergraphs. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 54:1-54:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{dell_et_al:LIPIcs.ICALP.2024.54,
  author =	{Dell, Holger and Lapinskas, John and Meeks, Kitty},
  title =	{{Nearly Optimal Independence Oracle Algorithms for Edge Estimation in Hypergraphs}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{54:1--54:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.54},
  URN =		{urn:nbn:de:0030-drops-201977},
  doi =		{10.4230/LIPIcs.ICALP.2024.54},
  annote =	{Keywords: Graph oracles, Fine-grained complexity, Approximate counting, Hypergraphs}
}
Document
Track A: Algorithms, Complexity and Games
Kernelization Dichotomies for Hitting Subgraphs Under Structural Parameterizations

Authors: Marin Bougeret, Bart M. P. Jansen, and Ignasi Sau

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
For a fixed graph H, the H-Subgraph Hitting problem consists in deleting the minimum number of vertices from an input graph to obtain a graph without any occurrence of H as a subgraph. This problem can be seen as a generalization of Vertex Cover, which corresponds to the case H = K₂. We initiate a study of H-Subgraph Hitting from the point of view of characterizing structural parameterizations that allow for polynomial kernels, within the recently active framework of taking as the parameter the number of vertex deletions needed to obtain a graph in a "simple" class 𝒞. Our main contribution is to identify graph parameters such that, when H-Subgraph Hitting is parameterized by the vertex-deletion distance to a class 𝒞 in which any of these parameters is bounded, and assuming standard complexity assumptions and that H is biconnected, the following sharp dichotomy holds: the problem admits a polynomial kernel if and only if H is a clique. These new graph parameters are inspired by the notion of 𝒞-elimination distance introduced by Bulian and Dawar [Algorithmica 2016], and generalize it in two directions. Our results also apply to the version of the problem where one wants to hit H as an induced subgraph, and imply, in particular, that the problems of hitting minors and hitting (induced) subgraphs behave substantially differently with respect to the existence of polynomial kernels under structural parameterizations.
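
As a tiny reference point for the problem definition (not for the paper's techniques), the Python sketch below checks by brute force whether a vertex set S is a solution to H-Subgraph Hitting in the special case H = K₃, i.e. whether G - S is triangle-free; the graph encoding is an illustrative assumption.

from itertools import combinations

def hits_all_triangles(vertices, edges, S):
    """Return True iff deleting S leaves no triangle (H = K_3) as a subgraph."""
    S = set(S)
    remaining = [v for v in vertices if v not in S]
    edge_set = {frozenset(e) for e in edges}
    for a, b, c in combinations(remaining, 3):
        if all(frozenset(p) in edge_set for p in ((a, b), (b, c), (a, c))):
            return False
    return True

# K_4: deleting one vertex still leaves a triangle; deleting two does not.
V = [1, 2, 3, 4]
E = list(combinations(V, 2))
print(hits_all_triangles(V, E, {1}))     # False
print(hits_all_triangles(V, E, {1, 2}))  # True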

Cite as

Marin Bougeret, Bart M. P. Jansen, and Ignasi Sau. Kernelization Dichotomies for Hitting Subgraphs Under Structural Parameterizations. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 33:1-33:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{bougeret_et_al:LIPIcs.ICALP.2024.33,
  author =	{Bougeret, Marin and Jansen, Bart M. P. and Sau, Ignasi},
  title =	{{Kernelization Dichotomies for Hitting Subgraphs Under Structural Parameterizations}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{33:1--33:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.33},
  URN =		{urn:nbn:de:0030-drops-201766},
  doi =		{10.4230/LIPIcs.ICALP.2024.33},
  annote =	{Keywords: hitting subgraphs, hitting induced subgraphs, parameterized complexity, polynomial kernel, complexity dichotomy, elimination distance}
}
Document
Track A: Algorithms, Complexity and Games
Fundamental Problems on Bounded-Treewidth Graphs: The Real Source of Hardness

Authors: Barış Can Esmer, Jacob Focke, Dániel Marx, and Paweł Rzążewski

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
It is known for many algorithmic problems that if a tree decomposition of width t is given in the input, then the problem can be solved with exponential dependence on t. A line of research initiated by Lokshtanov, Marx, and Saurabh [SODA 2011] produced lower bounds showing that in many cases known algorithms already achieve the best possible exponential dependence on t, assuming the Strong Exponential-Time Hypothesis (SETH). The main message of this paper is that the same lower bounds can already be obtained in a much more restricted setting: informally, a graph consisting of a block of t vertices connected to components of constant size already has the same hardness as a general tree decomposition of width t. Formally, a (σ,δ)-hub is a set Q of vertices such that every component of G - Q has size at most σ and is adjacent to at most δ vertices of Q. We explore whether the known tight lower bounds parameterized by the width of the given tree decomposition remain valid if we parameterize by the size of the given hub.
- For every ε > 0, there are σ,δ > 0 such that Independent Set (equivalently Vertex Cover) cannot be solved in time (2-ε)^p ⋅ n, even if a (σ,δ)-hub of size p is given in the input, assuming the SETH. This matches the earlier tight lower bounds parameterized by the width of the tree decomposition. Similar tight bounds are obtained for Odd Cycle Transversal, Max Cut, q-Coloring, and the edge/vertex deletion versions of q-Coloring.
- For every ε > 0, there are σ,δ > 0 such that △-Partition cannot be solved in time (2-ε)^p ⋅ n, even if a (σ,δ)-hub of size p is given in the input, assuming the Set Cover Conjecture (SCC). In fact, we prove that this statement is equivalent to the SCC, thus it is unlikely that it could be proved assuming the SETH.
- For Dominating Set, we can prove a non-tight lower bound ruling out (2-ε)^p ⋅ n^{𝒪(1)} algorithms, assuming either the SETH or the SCC, but this does not match the 3^p ⋅ n^{𝒪(1)} upper bound.
Thus our results reveal that, for many problems, the research on lower bounds on the dependence on treewidth was never really about tree decompositions; the real source of hardness comes from a much simpler structure. Additionally, we study whether the same lower bounds can be obtained if σ and δ are fixed universal constants (not depending on ε). We show that lower bounds of this form are possible for Max Cut and the edge-deletion version of q-Coloring, under the Max 3-Sat Hypothesis (M3SH). However, no such lower bounds are possible for Independent Set, Odd Cycle Transversal, and the vertex-deletion version of q-Coloring: better-than-brute-force algorithms are possible for every fixed (σ,δ).
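
Since the hub definition is central here, the following Python sketch (the graph encoding is assumed for illustration) checks whether a given set Q is a (σ,δ)-hub: every connected component of G - Q must have at most σ vertices and be adjacent to at most δ vertices of Q.

def is_hub(adj, Q, sigma, delta):
    """adj: dict mapping each vertex to the set of its neighbours.
    Q is a (sigma, delta)-hub iff every component of G - Q has size <= sigma
    and is adjacent to at most delta vertices of Q."""
    Q = set(Q)
    outside = set(adj) - Q
    seen = set()
    for start in outside:
        if start in seen:
            continue
        component, stack = set(), [start]     # DFS inside G - Q
        seen.add(start)
        while stack:
            v = stack.pop()
            component.add(v)
            for u in adj[v]:
                if u in outside and u not in seen:
                    seen.add(u)
                    stack.append(u)
        neighbours_in_Q = {u for v in component for u in adj[v] if u in Q}
        if len(component) > sigma or len(neighbours_in_Q) > delta:
            return False
    return True

# Star with centre "c": Q = {"c"} leaves three singleton components, each
# adjacent to the single hub vertex, so Q is a (1,1)-hub.
adj = {"c": {"x", "y", "z"}, "x": {"c"}, "y": {"c"}, "z": {"c"}}
print(is_hub(adj, {"c"}, sigma=1, delta=1))  # True
print(is_hub(adj, set(), sigma=1, delta=0))  # False: one component of size 4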

Cite as

Barış Can Esmer, Jacob Focke, Dániel Marx, and Paweł Rzążewski. Fundamental Problems on Bounded-Treewidth Graphs: The Real Source of Hardness. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 34:1-34:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{canesmer_et_al:LIPIcs.ICALP.2024.34,
  author =	{Can Esmer, Bar{\i}\c{s} and Focke, Jacob and Marx, D\'{a}niel and Rz\k{a}\.{z}ewski, Pawe{\l}},
  title =	{{Fundamental Problems on Bounded-Treewidth Graphs: The Real Source of Hardness}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{34:1--34:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.34},
  URN =		{urn:nbn:de:0030-drops-201772},
  doi =		{10.4230/LIPIcs.ICALP.2024.34},
  annote =	{Keywords: Parameterized Complexity, Tight Bounds, Hub, Treewidth, Strong Exponential Time Hypothesis, Vertex Coloring, Vertex Deletion, Edge Deletion, Triangle Packing, Triangle Partition, Set Cover Hypothesis, Dominating Set}
}
Document
Track A: Algorithms, Complexity and Games
A Tight Subexponential-Time Algorithm for Two-Page Book Embedding

Authors: Robert Ganian, Haiko Müller, Sebastian Ordyniak, Giacomo Paesani, and Mateusz Rychlicki

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
A book embedding of a graph is a drawing that maps vertices onto a line and edges to simple pairwise non-crossing curves drawn into "pages", which are half-planes bounded by that line. Two-page book embeddings, i.e., book embeddings into 2 pages, are of special importance as they are both NP-hard to compute and have specific applications. We obtain a 2^𝒪(√n) algorithm for computing a book embedding of an n-vertex graph on two pages - a result which is asymptotically tight under the Exponential Time Hypothesis. As a key tool in our approach, we obtain a single-exponential fixed-parameter algorithm for the same problem when parameterized by the treewidth of the input graph. We conclude by establishing the fixed-parameter tractability of computing minimum-page book embeddings when parameterized by the feedback edge number, settling an open question arising from previous work on the problem.
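
A candidate two-page book embedding is just a vertex order on the spine together with a page for each edge, and two edges on the same page cross exactly when their endpoints interleave along the spine. The Python sketch below verifies such a candidate; the encodings are illustrative assumptions and unrelated to the algorithms of the paper.

def is_valid_two_page_embedding(order, pages):
    """order: list of vertices along the spine; pages: dict mapping each edge (u, v)
    to page 0 or 1. Returns True iff no two edges on the same page cross."""
    pos = {v: i for i, v in enumerate(order)}
    edges = list(pages)
    for i in range(len(edges)):
        for j in range(i + 1, len(edges)):
            e, f = edges[i], edges[j]
            if pages[e] != pages[f]:
                continue
            a, b = sorted((pos[e[0]], pos[e[1]]))
            c, d = sorted((pos[f[0]], pos[f[1]]))
            if a < c < b < d or c < a < d < b:   # interleaving endpoints => crossing
                return False
    return True

# K_4 with spine order 1,2,3,4: the chords (1,3) and (2,4) cross, so they must
# be assigned to different pages.
order = [1, 2, 3, 4]
cycle = {(1, 2): 0, (2, 3): 0, (3, 4): 0, (1, 4): 0}
print(is_valid_two_page_embedding(order, {**cycle, (1, 3): 0, (2, 4): 1}))  # True
print(is_valid_two_page_embedding(order, {**cycle, (1, 3): 1, (2, 4): 1}))  # False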

Cite as

Robert Ganian, Haiko Müller, Sebastian Ordyniak, Giacomo Paesani, and Mateusz Rychlicki. A Tight Subexponential-Time Algorithm for Two-Page Book Embedding. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 68:1-68:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{ganian_et_al:LIPIcs.ICALP.2024.68,
  author =	{Ganian, Robert and M\"{u}ller, Haiko and Ordyniak, Sebastian and Paesani, Giacomo and Rychlicki, Mateusz},
  title =	{{A Tight Subexponential-Time Algorithm for Two-Page Book Embedding}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{68:1--68:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.68},
  URN =		{urn:nbn:de:0030-drops-202114},
  doi =		{10.4230/LIPIcs.ICALP.2024.68},
  annote =	{Keywords: book embedding, page number, subexponential algorithms, subhamiltonicity, feedback edge number}
}
Document
Track A: Algorithms, Complexity and Games
Solution Discovery via Reconfiguration for Problems in P

Authors: Mario Grobler, Stephanie Maaz, Nicole Megow, Amer E. Mouawad, Vijayaragunathan Ramamoorthi, Daniel Schmand, and Sebastian Siebertz

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
In the recently introduced framework of solution discovery via reconfiguration [Fellows et al., ECAI 2023], we are given an initial configuration of k tokens on a graph and the question is whether we can transform this configuration into a feasible solution (for some problem) via a bounded number b of small modification steps. In this work, we study solution discovery variants of polynomial-time solvable problems, namely Spanning Tree Discovery, Shortest Path Discovery, Matching Discovery, and Vertex/Edge Cut Discovery in the unrestricted token addition/removal model, the token jumping model, and the token sliding model. In the unrestricted token addition/removal model, we show that all four discovery variants remain in P. For the token jumping model we also prove containment in P, except for Vertex/Edge Cut Discovery, for which we prove NP-completeness. Finally, in the token sliding model, almost all considered problems become NP-complete, the exception being Spanning Tree Discovery, which remains polynomial-time solvable. We then study the parameterized complexity of the NP-complete problems and provide a full classification of tractability with respect to the parameters solution size (number of tokens) k and transformation budget (number of steps) b. Along the way, we observe strong connections between the solution discovery variants of our base problems and their (weighted) rainbow variants as well as their red-blue variants with cardinality constraints.
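
The discovery question in the token jumping model can be phrased as reachability in a configuration graph. The brute-force Python sketch below (the feasibility predicate and all encodings are illustrative assumptions; the toy predicate is a vertex cover check rather than one of the paper's base problems) asks whether a feasible token configuration is reachable from the initial one within b jumps, where a jump moves one token to an arbitrary unoccupied vertex.

from collections import deque

def discoverable(vertices, start, is_feasible, budget):
    """BFS over token configurations in the token jumping model."""
    start = frozenset(start)
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        config, used = queue.popleft()
        if is_feasible(config):
            return True
        if used == budget:
            continue
        for token in config:
            for target in vertices:
                if target in config:
                    continue
                nxt = frozenset((config - {token}) | {target})
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, used + 1))
    return False

# Toy instance on the path a-b-c-d with two tokens: {a, d} is not a vertex
# cover, but a single jump (a -> b, or d -> c) produces one.
edges = [("a", "b"), ("b", "c"), ("c", "d")]
covers = lambda config: all(u in config or v in config for u, v in edges)
print(discoverable(["a", "b", "c", "d"], {"a", "d"}, covers, budget=1))  # True
print(discoverable(["a", "b", "c", "d"], {"a", "d"}, covers, budget=0))  # False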

Cite as

Mario Grobler, Stephanie Maaz, Nicole Megow, Amer E. Mouawad, Vijayaragunathan Ramamoorthi, Daniel Schmand, and Sebastian Siebertz. Solution Discovery via Reconfiguration for Problems in P. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 76:1-76:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{grobler_et_al:LIPIcs.ICALP.2024.76,
  author =	{Grobler, Mario and Maaz, Stephanie and Megow, Nicole and Mouawad, Amer E. and Ramamoorthi, Vijayaragunathan and Schmand, Daniel and Siebertz, Sebastian},
  title =	{{Solution Discovery via Reconfiguration for Problems in P}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{76:1--76:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.76},
  URN =		{urn:nbn:de:0030-drops-202195},
  doi =		{10.4230/LIPIcs.ICALP.2024.76},
  annote =	{Keywords: solution discovery, reconfiguration, spanning tree, shortest path, matching, cut}
}
Document
Track A: Algorithms, Complexity and Games
No Polynomial Kernels for Knapsack

Authors: Klaus Heeger, Danny Hermelin, Matthias Mnich, and Dvir Shabtay

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
This paper focuses on kernelization algorithms for the fundamental Knapsack problem. A kernelization algorithm (or kernel) is a polynomial-time reduction from a problem onto itself, where the output size is bounded by a function of some problem-specific parameter. Such algorithms provide a theoretical model for data reduction and preprocessing and are central in the area of parameterized complexity. In this way, a kernel for Knapsack for some parameter k reduces any instance of Knapsack to an equivalent instance of size at most f(k) in polynomial time, for some computable function f. When f(k) = k^{O(1)}, such a reduction is called a polynomial kernel. Our study focuses on two natural parameters for Knapsack: the number w_# of different item weights, and the number p_# of different item profits. Our main technical contribution is a proof showing that Knapsack does not admit a polynomial kernel for either of these two parameters under standard complexity-theoretic assumptions. Our proof relies on an elaborate application of the standard kernelization lower-bound framework and develops along the way novel ideas that should be useful for other problems as well. We complement our lower bounds by showing that Knapsack admits a polynomial kernel for the combined parameter w_# ⋅ p_#.
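
To give a feel for the grouping behind the parameters w_# and p_#, here is a simple, safe data-reduction sketch in Python. It is emphatically not the kernel from the paper (its output size still depends on the capacity, so it bounds nothing in terms of the parameter alone); it merely illustrates that items with identical (weight, profit) pairs are interchangeable and that only a bounded number of copies of each class can ever be packed. The encodings are illustrative assumptions.

from collections import Counter

def reduce_by_classes(items, capacity):
    """items: list of (weight, profit) pairs. Keep at most floor(capacity / weight)
    copies of each distinct (weight, profit) class; any solution uses no more than
    that many copies, and copies within a class are interchangeable, so the optimum
    is preserved."""
    counts = Counter(items)
    reduced = []
    for (w, p), c in counts.items():
        reduced.extend([(w, p)] * min(c, capacity // w))
    return reduced

# w_# = p_# = 2: ten copies of (3, 5) and four copies of (2, 2), capacity 7.
items = [(3, 5)] * 10 + [(2, 2)] * 4
print(len(reduce_by_classes(items, capacity=7)))  # 5 items remain: 2 of (3,5) and 3 of (2,2)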

Cite as

Klaus Heeger, Danny Hermelin, Matthias Mnich, and Dvir Shabtay. No Polynomial Kernels for Knapsack. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 83:1-83:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{heeger_et_al:LIPIcs.ICALP.2024.83,
  author =	{Heeger, Klaus and Hermelin, Danny and Mnich, Matthias and Shabtay, Dvir},
  title =	{{No Polynomial Kernels for Knapsack}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{83:1--83:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.83},
  URN =		{urn:nbn:de:0030-drops-202261},
  doi =		{10.4230/LIPIcs.ICALP.2024.83},
  annote =	{Keywords: Knapsack, polynomial kernels, compositions, number of different weights, number of different profits}
}
Document
Track A: Algorithms, Complexity and Games
Delineating Half-Integrality of the Erdős-Pósa Property for Minors: The Case of Surfaces

Authors: Christophe Paul, Evangelos Protopapas, Dimitrios M. Thilikos, and Sebastian Wiederrecht

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
In 1986 Robertson and Seymour proved a generalization of the seminal result of Erdős and Pósa on the duality of packing and covering cycles: A graph has the Erdős-Pósa property for minors if and only if it is planar. In particular, for every non-planar graph H they gave examples showing that the Erdős-Pósa property does not hold for H. Recently, Liu confirmed a conjecture of Thomas and showed that every graph has the half-integral Erdős-Pósa property for minors. Liu’s proof is non-constructive and to this date, with the exception of a small number of examples, no constructive proof is known. In this paper, we initiate the delineation of the half-integrality of the Erdős-Pósa property for minors. We conjecture that for every graph H, there exists a unique (up to a suitable equivalence relation on graph parameters) graph parameter EP_H such that H has the Erdős-Pósa property in a minor-closed graph class 𝒢 if and only if sup{EP_H(G) ∣ G ∈ 𝒢} is finite. We prove this conjecture for the class ℋ of Kuratowski-connected shallow-vortex minors by showing that, for every non-planar H ∈ ℋ, the parameter EP_H(G) is precisely the maximum order of a Robertson-Seymour counterexample to the Erdős-Pósa property of H which can be found as a minor in G. Our results are constructive and imply, for the first time, parameterized algorithms that find either a packing, or a cover, or one of the Robertson-Seymour counterexamples, certifying the existence of a half-integral packing for the graphs in ℋ.

Cite as

Christophe Paul, Evangelos Protopapas, Dimitrios M. Thilikos, and Sebastian Wiederrecht. Delineating Half-Integrality of the Erdős-Pósa Property for Minors: The Case of Surfaces. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 114:1-114:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{paul_et_al:LIPIcs.ICALP.2024.114,
  author =	{Paul, Christophe and Protopapas, Evangelos and Thilikos, Dimitrios M. and Wiederrecht, Sebastian},
  title =	{{Delineating Half-Integrality of the Erd\H{o}s-P\'{o}sa Property for Minors: The Case of Surfaces}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{114:1--114:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.114},
  URN =		{urn:nbn:de:0030-drops-202576},
  doi =		{10.4230/LIPIcs.ICALP.2024.114},
  annote =	{Keywords: Erd\H{o}s-P\'{o}sa property, Erd\H{o}s-P\'{o}sa pair, Graph parameters, Graph minors, Universal obstruction, Surface containment}
}
Document
A 1.9999-Approximation Algorithm for Vertex Cover on String Graphs

Authors: Daniel Lokshtanov, Fahad Panolan, Saket Saurabh, Jie Xue, and Meirav Zehavi

Published in: LIPIcs, Volume 293, 40th International Symposium on Computational Geometry (SoCG 2024)


Abstract
Vertex Cover is a fundamental optimization problem, and is among Karp’s 21 NP-complete problems. The problem aims to compute, for a given graph G, a minimum-size set S of vertices of G such that G - S contains no edge. Vertex Cover admits a simple polynomial-time 2-approximation algorithm, which is the best approximation ratio one can achieve in polynomial time, assuming the Unique Games Conjecture. However, on many restrictive graph classes, it is possible to obtain better-than-2 approximations in polynomial time (or even PTASes) for Vertex Cover. Among geometric intersection graphs, examples of such graph classes include unit-disk graphs, disk graphs, pseudo-disk graphs, rectangle graphs, etc. In this paper, we study Vertex Cover on the broadest class of geometric intersection graphs in the plane, known as string graphs, which are intersection graphs of arbitrary connected geometric objects in the plane. Our main result is a polynomial-time 1.9999-approximation algorithm for Vertex Cover on string graphs, breaking the natural 2 barrier. Prior to this work, no better-than-2 approximation (in polynomial time) was known even for special cases of string graphs, such as intersection graphs of segments. Our algorithm is simple, robust (in the sense that it does not require the geometric realization of the input string graph to be given), and also works for the weighted version of Vertex Cover. Due to a connection between approximation for Independent Set and approximation for Vertex Cover observed by Har-Peled, our result can be viewed as a first step towards obtaining constant-approximation algorithms for Independent Set on string graphs.
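
For reference, the "simple polynomial-time 2-approximation algorithm" mentioned above is typically realized by taking both endpoints of a greedily computed maximal matching; a minimal Python sketch (graph encoding assumed) follows.

def two_approx_vertex_cover(edges):
    """Return a vertex cover of size at most twice the optimum: every matched edge
    must be covered by at least one of its endpoints in any cover, and the greedily
    matched edges are pairwise disjoint."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:   # edge not yet covered: add both ends
            cover.update((u, v))
    return cover

# Star graph: the optimum cover is {"c"} (size 1); the matching bound gives size 2.
print(sorted(two_approx_vertex_cover([("c", "x"), ("c", "y"), ("c", "z")])))  # ['c', 'x']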

Cite as

Daniel Lokshtanov, Fahad Panolan, Saket Saurabh, Jie Xue, and Meirav Zehavi. A 1.9999-Approximation Algorithm for Vertex Cover on String Graphs. In 40th International Symposium on Computational Geometry (SoCG 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 293, pp. 72:1-72:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{lokshtanov_et_al:LIPIcs.SoCG.2024.72,
  author =	{Lokshtanov, Daniel and Panolan, Fahad and Saurabh, Saket and Xue, Jie and Zehavi, Meirav},
  title =	{{A 1.9999-Approximation Algorithm for Vertex Cover on String Graphs}},
  booktitle =	{40th International Symposium on Computational Geometry (SoCG 2024)},
  pages =	{72:1--72:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-316-4},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{293},
  editor =	{Mulzer, Wolfgang and Phillips, Jeff M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2024.72},
  URN =		{urn:nbn:de:0030-drops-200174},
  doi =		{10.4230/LIPIcs.SoCG.2024.72},
  annote =	{Keywords: vertex cover, geometric intersection graphs, approximation algorithms}
}
Document
Parameterized Approximation: Algorithms and Hardness (Dagstuhl Seminar 23291)

Authors: Karthik C. S., Parinya Chalermsook, Joachim Spoerhase, Meirav Zehavi, and Martin Herold

Published in: Dagstuhl Reports, Volume 13, Issue 7 (2024)


Abstract
Parameterization and approximation are two established approaches to coping with intractability in combinatorial optimization. In this Dagstuhl Seminar, we studied parameterized approximation, a relatively new algorithmic paradigm that combines these two popular research areas. In particular, we analyzed the solution quality (approximation ratio) as well as the running time of an algorithm in terms of a parameter that captures the "complexity" of a problem instance. While the field has grown and yielded some promising results, our understanding of the area is rather ad hoc compared to our knowledge of approximation or parameterized algorithms alone. In this seminar, we brought together researchers from both communities in order to bridge this gap by facilitating the exchange and unification of scientific knowledge.

Cite as

Karthik C. S., Parinya Chalermsook, Joachim Spoerhase, Meirav Zehavi, and Martin Herold. Parameterized Approximation: Algorithms and Hardness (Dagstuhl Seminar 23291). In Dagstuhl Reports, Volume 13, Issue 7, pp. 96-107, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@Article{c.s._et_al:DagRep.13.7.96,
  author =	{C. S., Karthik and Chalermsook, Parinya and Spoerhase, Joachim and Zehavi, Meirav and Herold, Martin},
  title =	{{Parameterized Approximation: Algorithms and Hardness (Dagstuhl Seminar 23291)}},
  pages =	{96--107},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2024},
  volume =	{13},
  number =	{7},
  editor =	{C. S., Karthik and Chalermsook, Parinya and Spoerhase, Joachim and Zehavi, Meirav and Herold, Martin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagRep.13.7.96},
  URN =		{urn:nbn:de:0030-drops-197764},
  doi =		{10.4230/DagRep.13.7.96},
  annote =	{Keywords: approximation algorithms, Hardness of approximation, Parameterized algorithms}
}
Document
Kernelization of Counting Problems

Authors: Daniel Lokshtanov, Pranabendu Misra, Saket Saurabh, and Meirav Zehavi

Published in: LIPIcs, Volume 287, 15th Innovations in Theoretical Computer Science Conference (ITCS 2024)


Abstract
We introduce a new framework for the analysis of preprocessing routines for parameterized counting problems. Existing frameworks that encapsulate parameterized counting problems permit the usage of exponential (rather than polynomial) time either explicitly or by implicitly reducing the counting problems to enumeration problems. Thus, our framework is the only one in the spirit of classic kernelization (as well as lossy kernelization). Specifically, we define a compression of a counting problem P into a counting problem Q as a pair of polynomial-time procedures: reduce and lift. Given an instance of P, reduce outputs an instance of Q whose size is bounded by a function f of the parameter, and given the number of solutions to the instance of Q, lift outputs the number of solutions to the instance of P. When P = Q, compression is termed kernelization, and when f is polynomial, compression is termed polynomial compression. Our technical (and other conceptual) contributions can be classified into two categories.
Upper Bounds. We prove two theorems: (i) the #Vertex Cover problem parameterized by solution size admits a polynomial kernel; (ii) every problem in the class of #Planar F-Deletion problems parameterized by solution size admits a polynomial compression.
Lower Bounds. We introduce two new concepts of cross-compositions: EXACT-cross-composition and SUM-cross-composition. We prove that if a #P-hard counting problem P EXACT-cross-composes into a parameterized counting problem Q, then Q does not admit a polynomial compression unless the polynomial hierarchy collapses. We conjecture that the same statement holds for SUM-cross-compositions. Then, we prove that: (i) #Min (s,t)-Cut parameterized by treewidth does not admit a polynomial compression unless the polynomial hierarchy collapses; (ii) #Min (s,t)-Cut parameterized by minimum cut size, #Odd Cycle Transversal parameterized by solution size, and #Vertex Cover parameterized by solution size minus maximum matching size do not admit polynomial compressions unless our conjecture is false.
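
To make the reduce/lift shape of the framework concrete, here is an interface-only Python sketch. All names and types are assumptions chosen for illustration, no concrete compression from the paper is implemented, and the sketch takes the liberty of letting reduce pass polynomial-size bookkeeping on to lift; the paper's formal definition fixes exactly what lift may see.

from dataclasses import dataclass
from typing import Any, Callable, Tuple

@dataclass
class CountingCompression:
    """A compression of counting problem P into counting problem Q.
    reduce: maps an instance of P with its parameter to an instance of Q whose size
            is bounded by f(parameter), together with bookkeeping for lift.
    lift:   maps that bookkeeping and the number of solutions of the reduced
            instance to the number of solutions of the original instance.
    Both procedures are required to run in polynomial time."""
    reduce: Callable[[Any, int], Tuple[Any, Any]]
    lift: Callable[[Any, int], int]

def count_via_compression(comp: CountingCompression, instance, parameter, count_reduced):
    """Count solutions of `instance` given a solver `count_reduced` for the reduced problem."""
    reduced_instance, bookkeeping = comp.reduce(instance, parameter)
    return comp.lift(bookkeeping, count_reduced(reduced_instance))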

Cite as

Daniel Lokshtanov, Pranabendu Misra, Saket Saurabh, and Meirav Zehavi. Kernelization of Counting Problems. In 15th Innovations in Theoretical Computer Science Conference (ITCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 287, pp. 77:1-77:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{lokshtanov_et_al:LIPIcs.ITCS.2024.77,
  author =	{Lokshtanov, Daniel and Misra, Pranabendu and Saurabh, Saket and Zehavi, Meirav},
  title =	{{Kernelization of Counting Problems}},
  booktitle =	{15th Innovations in Theoretical Computer Science Conference (ITCS 2024)},
  pages =	{77:1--77:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-309-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{287},
  editor =	{Guruswami, Venkatesan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2024.77},
  URN =		{urn:nbn:de:0030-drops-196059},
  doi =		{10.4230/LIPIcs.ITCS.2024.77},
  annote =	{Keywords: Kernelization, Counting Problems}
}
Document
Kernels for the Disjoint Paths Problem on Subclasses of Chordal Graphs

Authors: Juhi Chaudhary, Harmender Gahlawat, Michal Włodarczyk, and Meirav Zehavi

Published in: LIPIcs, Volume 285, 18th International Symposium on Parameterized and Exact Computation (IPEC 2023)


Abstract
Given an undirected graph G and a multiset of k terminal pairs 𝒳, the Vertex-Disjoint Paths (VDP) and Edge-Disjoint Paths (EDP) problems ask whether G has k pairwise internally vertex-disjoint paths and k pairwise edge-disjoint paths, respectively, connecting every terminal pair in 𝒳. In this paper, we study the kernelization complexity of VDP and EDP on subclasses of chordal graphs. For VDP, we design a 4k vertex kernel on split graphs and an 𝒪(k²) vertex kernel on well-partitioned chordal graphs. We also show that the problem becomes polynomial-time solvable on threshold graphs. For EDP, we first prove that the problem is NP-complete on complete graphs. Then, we design an 𝒪(k^{2.75}) vertex kernel for EDP on split graphs, and improve it to a 7k+1 vertex kernel on threshold graphs. Lastly, we provide an 𝒪(k²) vertex kernel for EDP on block graphs and a 2k+1 vertex kernel for clique paths. Our contributions improve upon several results in the literature, as well as resolve an open question by Heggernes et al. [Theory Comput. Syst., 2015].
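
For reference on the two problem statements (the encodings are illustrative assumptions), the Python sketch below verifies a candidate solution: the i-th path must connect the i-th terminal pair, the paths must be pairwise internally vertex-disjoint for VDP, and pairwise edge-disjoint for EDP. It checks only the endpoint and disjointness conditions, not that the paths exist in the host graph.

def verifies_vdp(paths, terminal_pairs):
    """Pairwise internal vertex-disjointness: no internal vertex of one path may
    appear anywhere on another path; shared terminals are allowed."""
    if len(paths) != len(terminal_pairs):
        return False
    if any(p[0] != s or p[-1] != t for p, (s, t) in zip(paths, terminal_pairs)):
        return False
    for i in range(len(paths)):
        for j in range(i + 1, len(paths)):
            if set(paths[i][1:-1]) & set(paths[j]) or set(paths[j][1:-1]) & set(paths[i]):
                return False
    return True

def verifies_edp(paths, terminal_pairs):
    """Pairwise edge-disjointness: no two paths share an (undirected) edge."""
    if len(paths) != len(terminal_pairs):
        return False
    used = set()
    for path, (s, t) in zip(paths, terminal_pairs):
        if path[0] != s or path[-1] != t:
            return False
        for u, v in zip(path, path[1:]):
            e = frozenset((u, v))
            if e in used:
                return False
            used.add(e)
    return True

# Two copies of the terminal pair (a, c); the two paths share only their terminals.
pairs = [("a", "c"), ("a", "c")]
paths = [["a", "b", "c"], ["a", "d", "c"]]
print(verifies_vdp(paths, pairs))  # True
print(verifies_edp(paths, pairs))  # True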

Cite as

Juhi Chaudhary, Harmender Gahlawat, Michal Włodarczyk, and Meirav Zehavi. Kernels for the Disjoint Paths Problem on Subclasses of Chordal Graphs. In 18th International Symposium on Parameterized and Exact Computation (IPEC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 285, pp. 10:1-10:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


@InProceedings{chaudhary_et_al:LIPIcs.IPEC.2023.10,
  author =	{Chaudhary, Juhi and Gahlawat, Harmender and W{\l}odarczyk, Michal and Zehavi, Meirav},
  title =	{{Kernels for the Disjoint Paths Problem on Subclasses of Chordal Graphs}},
  booktitle =	{18th International Symposium on Parameterized and Exact Computation (IPEC 2023)},
  pages =	{10:1--10:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-305-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{285},
  editor =	{Misra, Neeldhara and Wahlstr\"{o}m, Magnus},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2023.10},
  URN =		{urn:nbn:de:0030-drops-194296},
  doi =		{10.4230/LIPIcs.IPEC.2023.10},
  annote =	{Keywords: Kernelization, Parameterized Complexity, Vertex-Disjoint Paths Problem, Edge-Disjoint Paths Problem}
}