60 Search Results for "Jansen, Bart M. P."


Volume

LIPIcs, Volume 148

14th International Symposium on Parameterized and Exact Computation (IPEC 2019)

IPEC 2019, September 11-13, 2019, Munich, Germany

Editors: Bart M. P. Jansen and Jan Arne Telle

Document
Kernelization for Counting Problems on Graphs: Preserving the Number of Minimum Solutions

Authors: Bart M. P. Jansen and Bart van der Steenhoven

Published in: LIPIcs, Volume 285, 18th International Symposium on Parameterized and Exact Computation (IPEC 2023)


Abstract
A kernelization for a parameterized decision problem 𝒬 is a polynomial-time preprocessing algorithm that reduces any parameterized instance (x,k) into an instance (x',k') whose size is bounded by a function of k alone and which has the same yes/no answer for 𝒬. Such preprocessing algorithms cannot exist in the context of counting problems, when the answer to be preserved is the number of solutions, since this number can be arbitrarily large compared to k. However, we show that for counting minimum feedback vertex sets of size at most k, and for counting minimum dominating sets of size at most k in a planar graph, there is a polynomial-time algorithm that either outputs the answer or reduces the input to an instance (G',k') of size polynomial in k with the same number of minimum solutions. This shows that a meaningful theory of kernelization for counting problems is possible and opens the door for future developments. Our algorithms exploit the fact that if the number of solutions exceeds 2^{poly(k)}, the size of the input is exponential in terms of k, so that the running time of a parameterized counting algorithm can be bounded by poly(n). Otherwise, we can use gadgets that slightly increase k to represent choices among 2^{𝒪(k)} options by only poly(k) vertices.
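
The first case can be quantified with a short back-of-the-envelope calculation (ours, for illustration; the concrete polynomial k² and the assumed 2^{𝒪(k)} ⋅ n^{𝒪(1)} counting algorithm are not taken from the paper). Every minimum solution is a vertex set of size at most k, so

\[
\#\mathrm{sol} \;\le\; \binom{n}{k} \;\le\; n^{k},
\qquad\text{hence}\qquad
\#\mathrm{sol} > 2^{k^{2}} \;\Longrightarrow\; n > 2^{k} \;\Longrightarrow\; 2^{\mathcal{O}(k)} \cdot n^{\mathcal{O}(1)} = n^{\mathcal{O}(1)}.
\]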

Cite as

Bart M. P. Jansen and Bart van der Steenhoven. Kernelization for Counting Problems on Graphs: Preserving the Number of Minimum Solutions. In 18th International Symposium on Parameterized and Exact Computation (IPEC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 285, pp. 27:1-27:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{jansen_et_al:LIPIcs.IPEC.2023.27,
  author =	{Jansen, Bart M. P. and van der Steenhoven, Bart},
  title =	{{Kernelization for Counting Problems on Graphs: Preserving the Number of Minimum Solutions}},
  booktitle =	{18th International Symposium on Parameterized and Exact Computation (IPEC 2023)},
  pages =	{27:1--27:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-305-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{285},
  editor =	{Misra, Neeldhara and Wahlstr\"{o}m, Magnus},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2023.27},
  URN =		{urn:nbn:de:0030-drops-194466},
  doi =		{10.4230/LIPIcs.IPEC.2023.27},
  annote =	{Keywords: kernelization, counting problems, feedback vertex set, dominating set, protrusion decomposition}
}
Document
On the Parameterized Complexity of Multiway Near-Separator

Authors: Bart M. P. Jansen and Shivesh K. Roy

Published in: LIPIcs, Volume 285, 18th International Symposium on Parameterized and Exact Computation (IPEC 2023)


Abstract
We study a new graph separation problem called Multiway Near-Separator. Given an undirected graph G, integer k, and terminal set T ⊆ V(G), it asks whether there is a vertex set S ⊆ V(G) ⧵ T of size at most k such that in graph G-S, no pair of distinct terminals can be connected by two pairwise internally vertex-disjoint paths. Hence each terminal pair can be separated in G-S by removing at most one vertex. The problem is therefore a generalization of (Node) Multiway Cut, which asks for a vertex set S for which each terminal is in a different component of G-S. We develop a fixed-parameter tractable algorithm for Multiway Near-Separator running in time 2^{𝒪(k log k)} ⋅ n^{𝒪(1)}. Our algorithm is based on a new pushing lemma for solutions with respect to important separators, along with two problem-specific ingredients. The first is a polynomial-time subroutine to reduce the number of terminals in the instance to a polynomial in the solution size k plus the size of a given suboptimal solution. The second is a polynomial-time algorithm that, given a graph G and terminal set T ⊆ V(G) along with a single vertex x ∈ V(G) that forms a multiway near-separator, computes a 14-approximation for the problem of finding a multiway near-separator not containing x.
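
To make the definition concrete, here is a brute-force verifier for the restated condition ("each terminal pair can be separated in G-S by removing at most one vertex"). It is only an illustration for small networkx graphs; the function name and representation are ours, and it is not the paper's algorithm.

import itertools
import networkx as nx

def is_multiway_near_separator(G, terminals, S, k):
    """Check: S avoids the terminals, |S| <= k, and in G - S every pair of
    distinct terminals can be disconnected by deleting at most one extra vertex."""
    S = set(S)
    if len(S) > k or S & set(terminals):
        return False
    H = G.copy()
    H.remove_nodes_from(S)
    for u, v in itertools.combinations(terminals, 2):
        separated = False
        # Try deleting no vertex, or any single vertex other than u and v.
        for W in [()] + [(w,) for w in H.nodes if w not in (u, v)]:
            H2 = H.copy()
            H2.remove_nodes_from(W)
            if not nx.has_path(H2, u, v):
                separated = True
                break
        if not separated:
            return False
    return True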

Cite as

Bart M. P. Jansen and Shivesh K. Roy. On the Parameterized Complexity of Multiway Near-Separator. In 18th International Symposium on Parameterized and Exact Computation (IPEC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 285, pp. 28:1-28:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{jansen_et_al:LIPIcs.IPEC.2023.28,
  author =	{Jansen, Bart M. P. and Roy, Shivesh K.},
  title =	{{On the Parameterized Complexity of Multiway Near-Separator}},
  booktitle =	{18th International Symposium on Parameterized and Exact Computation (IPEC 2023)},
  pages =	{28:1--28:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-305-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{285},
  editor =	{Misra, Neeldhara and Wahlstr\"{o}m, Magnus},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2023.28},
  URN =		{urn:nbn:de:0030-drops-194470},
  doi =		{10.4230/LIPIcs.IPEC.2023.28},
  annote =	{Keywords: fixed-parameter tractability, multiway cut, near-separator}
}
Document
Sunflowers Meet Sparsity: A Linear-Vertex Kernel for Weighted Clique-Packing on Sparse Graphs

Authors: Bart M. P. Jansen and Shivesh K. Roy

Published in: LIPIcs, Volume 285, 18th International Symposium on Parameterized and Exact Computation (IPEC 2023)


Abstract
We study the kernelization complexity of the Weighted H-Packing problem on sparse graphs. For a fixed connected graph H, in the Weighted H-Packing problem the input is a graph G, a vertex-weight function w : V(G) → ℕ, and positive integers k, t. The question is whether there exist k vertex-disjoint subgraphs H₁, …, H_k of G such that H_i is isomorphic to H for each i ∈ [k] and the total weight of these k ⋅ |V(H)| vertices is at least t. It is known that the (unweighted) H-Packing problem admits a kernel with 𝒪(k^{|V(H)|-1}) vertices on general graphs, and a linear kernel on planar graphs and graphs of bounded genus. In this work, we focus on the case that H is a clique on h ≥ 3 vertices (which captures Triangle Packing) and present a linear-vertex kernel for Weighted K_h-Packing on graphs of bounded expansion, along with a kernel with 𝒪(k^{1+ε}) vertices on nowhere-dense graphs for all ε > 0. To obtain these results, we combine two powerful ingredients in a novel way: the Erdős-Rado Sunflower lemma and the theory of sparsity.
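
A small verifier for the problem definition above may help fix the notation (an illustration only; the function name and instance representation are ours, with G a networkx graph):

import itertools

def is_valid_weighted_clique_packing(G, w, parts, h, k, t):
    """Check a proposed solution to Weighted K_h-Packing as defined above:
    `parts` must consist of k pairwise vertex-disjoint sets, each inducing a
    clique on h vertices in G, of total weight at least t."""
    if len(parts) != k or any(len(p) != h for p in parts):
        return False
    chosen = [v for p in parts for v in p]
    if len(chosen) != len(set(chosen)):        # vertex-disjointness
        return False
    for p in parts:                            # each part must induce K_h
        if not all(G.has_edge(u, v) for u, v in itertools.combinations(p, 2)):
            return False
    return sum(w[v] for v in chosen) >= t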

Cite as

Bart M. P. Jansen and Shivesh K. Roy. Sunflowers Meet Sparsity: A Linear-Vertex Kernel for Weighted Clique-Packing on Sparse Graphs. In 18th International Symposium on Parameterized and Exact Computation (IPEC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 285, pp. 29:1-29:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{jansen_et_al:LIPIcs.IPEC.2023.29,
  author =	{Jansen, Bart M. P. and Roy, Shivesh K.},
  title =	{{Sunflowers Meet Sparsity: A Linear-Vertex Kernel for Weighted Clique-Packing on Sparse Graphs}},
  booktitle =	{18th International Symposium on Parameterized and Exact Computation (IPEC 2023)},
  pages =	{29:1--29:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-305-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{285},
  editor =	{Misra, Neeldhara and Wahlstr\"{o}m, Magnus},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2023.29},
  URN =		{urn:nbn:de:0030-drops-194488},
  doi =		{10.4230/LIPIcs.IPEC.2023.29},
  annote =	{Keywords: kernelization, weighted problems, graph packing, sunflower lemma, bounded expansion, nowhere dense}
}
Document
Single-Exponential FPT Algorithms for Enumerating Secluded ℱ-Free Subgraphs and Deleting to Scattered Graph Classes

Authors: Bart M. P. Jansen, Jari J. H. de Kroon, and Michał Włodarczyk

Published in: LIPIcs, Volume 283, 34th International Symposium on Algorithms and Computation (ISAAC 2023)


Abstract
The celebrated notion of important separators bounds the number of small (S,T)-separators in a graph which are "farthest from S" in a technical sense. In this paper, we introduce a generalization of this powerful algorithmic primitive, tailored to undirected graphs, that is phrased in terms of k-secluded vertex sets: sets with an open neighborhood of size at most k. In this terminology, the bound on important separators says that there are at most 4^k maximal k-secluded connected vertex sets C containing S but disjoint from T. We generalize this statement significantly: even when we demand that G[C] avoids a finite set ℱ of forbidden induced subgraphs, the number of such maximal subgraphs is 2^𝒪(k) and they can be enumerated efficiently. This enumeration algorithm allows us to make significant improvements for two problems from the literature. Our first application concerns the Connected k-Secluded ℱ-free subgraph problem, where ℱ is a finite set of forbidden induced subgraphs. Given a graph in which each vertex has a positive integer weight, the problem asks to find a maximum-weight connected k-secluded vertex set C ⊆ V(G) such that G[C] does not contain an induced subgraph isomorphic to any F ∈ ℱ. Parameterized by k, the problem is known to be solvable in triple-exponential time via the technique of recursive understanding; we improve this to single-exponential. Our second application concerns the deletion problem to scattered graph classes. A scattered graph class is defined by demanding that every connected component is contained in at least one of the prescribed graph classes Π_1, …, Π_d. The deletion problem to a scattered graph class is to find a vertex set of size at most k whose removal yields a graph from the class. We obtain a single-exponential algorithm whenever each class Π_i is characterized by a finite number of forbidden induced subgraphs. This generalizes and improves upon earlier results in the literature.
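
The central definition is easy to state operationally; a minimal sketch of a membership check (ours, for illustration; the ℱ-freeness test for G[C] is only indicated in the comment):

import networkx as nx

def is_k_secluded_connected(G, C, k):
    """Check whether the vertex set C is connected in the networkx graph G and
    k-secluded, i.e. its open neighborhood (neighbors of C outside C) has size
    at most k.  A full check for the problem above would additionally verify
    that G[C] contains no induced subgraph isomorphic to a member of the family."""
    C = set(C)
    if not C or not nx.is_connected(G.subgraph(C)):
        return False
    open_neighborhood = set().union(*(set(G.neighbors(v)) for v in C)) - C
    return len(open_neighborhood) <= k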

Cite as

Bart M. P. Jansen, Jari J. H. de Kroon, and Michał Włodarczyk. Single-Exponential FPT Algorithms for Enumerating Secluded ℱ-Free Subgraphs and Deleting to Scattered Graph Classes. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 42:1-42:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{jansen_et_al:LIPIcs.ISAAC.2023.42,
  author =	{Jansen, Bart M. P. and de Kroon, Jari J. H. and W{\l}odarczyk, Micha{\l}},
  title =	{{Single-Exponential FPT Algorithms for Enumerating Secluded ℱ-Free Subgraphs and Deleting to Scattered Graph Classes}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{42:1--42:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.42},
  URN =		{urn:nbn:de:0030-drops-193440},
  doi =		{10.4230/LIPIcs.ISAAC.2023.42},
  annote =	{Keywords: fixed-parameter tractability, important separators, secluded subgraphs}
}
Document
5-Approximation for ℋ-Treewidth Essentially as Fast as ℋ-Deletion Parameterized by Solution Size

Authors: Bart M. P. Jansen, Jari J. H. de Kroon, and Michał Włodarczyk

Published in: LIPIcs, Volume 274, 31st Annual European Symposium on Algorithms (ESA 2023)


Abstract
The notion of ℋ-treewidth, where ℋ is a hereditary graph class, was recently introduced as a generalization of the treewidth of an undirected graph. Roughly speaking, a graph of ℋ-treewidth at most k can be decomposed into (arbitrarily large) ℋ-subgraphs which interact only through vertex sets of size 𝒪(k) which can be organized in a tree-like fashion. ℋ-treewidth can be used as a hybrid parameterization to develop fixed-parameter tractable algorithms for ℋ-deletion problems, which ask to find a minimum vertex set whose removal from a given graph G turns it into a member of ℋ. The bottleneck in the current parameterized algorithms lies in the computation of suitable tree ℋ-decompositions. We present FPT-approximation algorithms to compute tree ℋ-decompositions for hereditary and union-closed graph classes ℋ. Given a graph of ℋ-treewidth k, we can compute a 5-approximate tree ℋ-decomposition in time f(𝒪(k)) ⋅ n^𝒪(1) whenever ℋ-deletion parameterized by solution size can be solved in time f(k) ⋅ n^𝒪(1) for some function f(k) ≥ 2^k. The current-best algorithms either achieve an approximation factor of k^𝒪(1) or construct optimal decompositions while suffering from non-uniformity with unknown parameter dependence. Using these decompositions, we obtain algorithms solving Odd Cycle Transversal in time 2^𝒪(k) ⋅ n^𝒪(1) parameterized by bipartite-treewidth and Vertex Planarization in time 2^𝒪(k log k) ⋅ n^𝒪(1) parameterized by planar-treewidth, showing that these can be as fast as the solution-size parameterizations and giving the first ETH-tight algorithms for parameterizations by hybrid width measures.

Cite as

Bart M. P. Jansen, Jari J. H. de Kroon, and Michał Włodarczyk. 5-Approximation for ℋ-Treewidth Essentially as Fast as ℋ-Deletion Parameterized by Solution Size. In 31st Annual European Symposium on Algorithms (ESA 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 274, pp. 66:1-66:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{jansen_et_al:LIPIcs.ESA.2023.66,
  author =	{Jansen, Bart M. P. and de Kroon, Jari J. H. and W{\l}odarczyk, Micha{\l}},
  title =	{{5-Approximation for ℋ-Treewidth Essentially as Fast as ℋ-Deletion Parameterized by Solution Size}},
  booktitle =	{31st Annual European Symposium on Algorithms (ESA 2023)},
  pages =	{66:1--66:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-295-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{274},
  editor =	{G{\o}rtz, Inge Li and Farach-Colton, Martin and Puglisi, Simon J. and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2023.66},
  URN =		{urn:nbn:de:0030-drops-187195},
  doi =		{10.4230/LIPIcs.ESA.2023.66},
  annote =	{Keywords: fixed-parameter tractability, treewidth, graph decompositions}
}
Document
Search-Space Reduction via Essential Vertices

Authors: Benjamin Merlin Bumpus, Bart M. P. Jansen, and Jari J. H. de Kroon

Published in: LIPIcs, Volume 244, 30th Annual European Symposium on Algorithms (ESA 2022)


Abstract
We investigate preprocessing for vertex-subset problems on graphs. While the notion of kernelization, originating in parameterized complexity theory, is a formalization of provably effective preprocessing aimed at reducing the total instance size, our focus is on finding a non-empty vertex set that belongs to an optimal solution. This decreases the size of the remaining part of the solution which still has to be found, and therefore shrinks the search space of fixed-parameter tractable algorithms for parameterizations based on the solution size. We introduce the notion of a c-essential vertex as one that is contained in all c-approximate solutions. For several classic combinatorial problems such as Odd Cycle Transversal and Directed Feedback Vertex Set, we show that under mild conditions a polynomial-time preprocessing algorithm can find a subset of an optimal solution that contains all 2-essential vertices, by exploiting packing/covering duality. This leads to FPT algorithms to solve these problems where the exponential term in the running time depends only on the number of non-essential vertices in the solution.
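
As an illustration of the definition (not the paper's polynomial-time preprocessing), the c-essential vertices of a tiny Odd Cycle Transversal instance can be computed by exhaustive search; the helper names are ours and the running time is exponential.

from itertools import combinations
import networkx as nx

def odd_cycle_transversals(G, max_size):
    """Yield every vertex set of size at most max_size whose deletion makes G bipartite."""
    nodes = list(G.nodes)
    for r in range(max_size + 1):
        for S in combinations(nodes, r):
            H = G.copy()
            H.remove_nodes_from(S)
            if nx.is_bipartite(H):
                yield set(S)

def essential_vertices(G, c=2):
    """Vertices contained in every c-approximate solution, i.e. in every odd
    cycle transversal of size at most c * OPT (brute force, tiny graphs only)."""
    opt = min(len(S) for S in odd_cycle_transversals(G, G.number_of_nodes()))
    solutions = list(odd_cycle_transversals(G, c * opt))
    return set.intersection(*solutions) if solutions else set()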

Cite as

Benjamin Merlin Bumpus, Bart M. P. Jansen, and Jari J. H. de Kroon. Search-Space Reduction via Essential Vertices. In 30th Annual European Symposium on Algorithms (ESA 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 244, pp. 30:1-30:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{bumpus_et_al:LIPIcs.ESA.2022.30,
  author =	{Bumpus, Benjamin Merlin and Jansen, Bart M. P. and de Kroon, Jari J. H.},
  title =	{{Search-Space Reduction via Essential Vertices}},
  booktitle =	{30th Annual European Symposium on Algorithms (ESA 2022)},
  pages =	{30:1--30:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-247-1},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{244},
  editor =	{Chechik, Shiri and Navarro, Gonzalo and Rotenberg, Eva and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2022.30},
  URN =		{urn:nbn:de:0030-drops-169687},
  doi =		{10.4230/LIPIcs.ESA.2022.30},
  annote =	{Keywords: fixed-parameter tractability, essential vertices, covering versus packing}
}
Document
Twin-Width and Polynomial Kernels

Authors: Édouard Bonnet, Eun Jung Kim, Amadeus Reinald, Stéphan Thomassé, and Rémi Watrigant

Published in: LIPIcs, Volume 214, 16th International Symposium on Parameterized and Exact Computation (IPEC 2021)


Abstract
We study the existence of polynomial kernels for parameterized problems without a polynomial kernel on general graphs, when restricted to graphs of bounded twin-width. It was previously observed in [Bonnet et al., ICALP'21] that the problem k-Independent Set allows no polynomial kernel on graphs of bounded twin-width by a very simple argument, which extends to several other problems such as k-Independent Dominating Set, k-Path, k-Induced Path, and k-Induced Matching. In this work, we examine the k-Dominating Set and variants of k-Vertex Cover for the existence of polynomial kernels. As a main result, we show that k-Dominating Set does not admit a polynomial kernel on graphs of twin-width at most 4 under a standard complexity-theoretic assumption. The reduction is intricate, especially due to the effort to bring the twin-width down to 4, and it can be tweaked to work for Connected k-Dominating Set and Total k-Dominating Set with a slightly worse bound on the twin-width. On the positive side, we obtain a simple quadratic vertex kernel for Connected k-Vertex Cover and Capacitated k-Vertex Cover on graphs of bounded twin-width. These kernels rely on the fact that graphs of bounded twin-width have Vapnik-Chervonenkis (VC) density 1, that is, for any vertex set X, the number of distinct neighborhoods in X is at most c⋅|X|, where c is a constant depending only on the twin-width. Interestingly, the kernel applies to any graph class of VC density 1, and does not require a witness sequence. We also present a more intricate O(k^{1.5}) vertex kernel for Connected k-Vertex Cover. Finally, we show that deciding if a graph has twin-width at most 1 can be done in polynomial time, and observe that most graph optimization/decision problems can be solved in polynomial time on graphs of twin-width at most 1.
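
The VC-density-1 property that drives the positive results is easy to probe directly; a small helper (ours, for illustration) counts the quantity in question:

import networkx as nx

def distinct_neighborhoods_in(G, X):
    """Number of distinct traces N(v) ∩ X over all vertices v of G.
    VC density 1 means this number is at most c * |X| for a constant c."""
    X = set(X)
    return len({frozenset(set(G.neighbors(v)) & X) for v in G.nodes})

# Example on a path with 5 vertices and X = {1, 2, 3}:
print(distinct_neighborhoods_in(nx.path_graph(5), {1, 2, 3}))  # -> 4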

Cite as

Édouard Bonnet, Eun Jung Kim, Amadeus Reinald, Stéphan Thomassé, and Rémi Watrigant. Twin-Width and Polynomial Kernels. In 16th International Symposium on Parameterized and Exact Computation (IPEC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 214, pp. 10:1-10:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{bonnet_et_al:LIPIcs.IPEC.2021.10,
  author =	{Bonnet, \'{E}douard and Kim, Eun Jung and Reinald, Amadeus and Thomass\'{e}, St\'{e}phan and Watrigant, R\'{e}mi},
  title =	{{Twin-Width and Polynomial Kernels}},
  booktitle =	{16th International Symposium on Parameterized and Exact Computation (IPEC 2021)},
  pages =	{10:1--10:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-216-7},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{214},
  editor =	{Golovach, Petr A. and Zehavi, Meirav},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2021.10},
  URN =		{urn:nbn:de:0030-drops-153932},
  doi =		{10.4230/LIPIcs.IPEC.2021.10},
  annote =	{Keywords: Twin-width, kernelization, lower bounds, Dominating Set}
}
Document
Preprocessing for Outerplanar Vertex Deletion: An Elementary Kernel of Quartic Size

Authors: Huib Donkers, Bart M. P. Jansen, and Michał Włodarczyk

Published in: LIPIcs, Volume 214, 16th International Symposium on Parameterized and Exact Computation (IPEC 2021)


Abstract
In the ℱ-Minor-Free Deletion problem one is given an undirected graph G and an integer k, and the task is to determine whether there exists a vertex set S of size at most k, so that G-S contains no graph from the finite family ℱ as a minor. It is known that whenever ℱ contains at least one planar graph, ℱ-Minor-Free Deletion admits a polynomial kernel, that is, there is a polynomial-time algorithm that outputs an equivalent instance of size k^{𝒪(1)} [Fomin, Lokshtanov, Misra, Saurabh; FOCS 2012]. However, this result relies on non-constructive arguments based on well-quasi-ordering and does not provide a concrete bound on the kernel size. We study the Outerplanar Deletion problem, in which we want to remove at most k vertices from a graph to make it outerplanar. This is a special case of ℱ-Minor-Free Deletion for the family ℱ = {K₄, K_{2,3}}. The class of outerplanar graphs is arguably the simplest class of graphs for which no explicit kernelization size bounds are known. By exploiting the combinatorial properties of outerplanar graphs, we present elementary reduction rules decreasing the size of a graph. This yields a constructive kernel with 𝒪(k⁴) vertices and edges. As a corollary, we derive that any minor-minimal obstruction to having an outerplanar deletion set of size k has 𝒪(k⁴) vertices and edges.
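
For experimentation on small instances, the Outerplanar Deletion question itself can be checked by brute force. The sketch below is ours and relies on the standard characterization that a graph is outerplanar exactly when adding one new vertex adjacent to all existing vertices leaves the graph planar; it is not the kernelization from the paper.

import itertools
import networkx as nx

def is_outerplanar(G):
    """Standard test: G is outerplanar iff G plus a universal apex vertex is planar."""
    H = G.copy()
    apex = "_apex"                       # assumes this label is unused in G
    H.add_node(apex)
    H.add_edges_from((apex, v) for v in G.nodes)
    return nx.check_planarity(H)[0]

def has_outerplanar_deletion_set(G, k):
    """Is there a set of at most k vertices whose removal makes G outerplanar?
    Exhaustive search; illustration only."""
    for r in range(k + 1):
        for S in itertools.combinations(G.nodes, r):
            H = G.copy()
            H.remove_nodes_from(S)
            if is_outerplanar(H):
                return True
    return False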

Cite as

Huib Donkers, Bart M. P. Jansen, and Michał Włodarczyk. Preprocessing for Outerplanar Vertex Deletion: An Elementary Kernel of Quartic Size. In 16th International Symposium on Parameterized and Exact Computation (IPEC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 214, pp. 14:1-14:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{donkers_et_al:LIPIcs.IPEC.2021.14,
  author =	{Donkers, Huib and Jansen, Bart M. P. and W{\l}odarczyk, Micha{\l}},
  title =	{{Preprocessing for Outerplanar Vertex Deletion: An Elementary Kernel of Quartic Size}},
  booktitle =	{16th International Symposium on Parameterized and Exact Computation (IPEC 2021)},
  pages =	{14:1--14:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-216-7},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{214},
  editor =	{Golovach, Petr A. and Zehavi, Meirav},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2021.14},
  URN =		{urn:nbn:de:0030-drops-153979},
  doi =		{10.4230/LIPIcs.IPEC.2021.14},
  annote =	{Keywords: fixed-parameter tractability, kernelization, outerplanar graphs}
}
Document
On the Hardness of Compressing Weights

Authors: Bart M. P. Jansen, Shivesh K. Roy, and Michał Włodarczyk

Published in: LIPIcs, Volume 202, 46th International Symposium on Mathematical Foundations of Computer Science (MFCS 2021)


Abstract
We investigate computational problems involving large weights through the lens of kernelization, which is a framework of polynomial-time preprocessing aimed at compressing the instance size. Our main focus is the weighted Clique problem, where we are given an edge-weighted graph and the goal is to detect a clique of total weight equal to a prescribed value. We show that the weighted variant, parameterized by the number of vertices n, is significantly harder than the unweighted problem by presenting an 𝒪(n^{3 - ε}) lower bound on the size of the kernel, under the assumption that NP ⊈ coNP/poly. This lower bound is essentially tight: we show that we can reduce the problem to the case with weights bounded by 2^𝒪(n), which yields a randomized kernel of 𝒪(n³) bits. We generalize these results to the weighted d-Uniform Hyperclique problem, Subset Sum, and weighted variants of Boolean Constraint Satisfaction Problems (CSPs). We also study weighted minimization problems and show that weight compression is easier when we only want to preserve the collection of optimal solutions. Namely, we show that for node-weighted Vertex Cover on bipartite graphs it is possible to maintain the set of optimal solutions using integer weights from the range [1, n], but if we want to maintain the ordering of the weights of all inclusion-minimal solutions, then weights as large as 2^Ω(n) are necessary.
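
The 𝒪(n³)-bit figure follows from a direct count once the weights are bounded (our back-of-the-envelope tally, matching the claim above):

\[
\underbrace{\binom{n}{2}}_{\text{edge weights}} \cdot \underbrace{\mathcal{O}(n)}_{\text{bits per weight in } \{0,\dots,2^{\mathcal{O}(n)}\}} \;=\; \mathcal{O}(n^{3}) \text{ bits.}
\]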

Cite as

Bart M. P. Jansen, Shivesh K. Roy, and Michał Włodarczyk. On the Hardness of Compressing Weights. In 46th International Symposium on Mathematical Foundations of Computer Science (MFCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 202, pp. 64:1-64:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{jansen_et_al:LIPIcs.MFCS.2021.64,
  author =	{Jansen, Bart M. P. and Roy, Shivesh K. and W{\l}odarczyk, Micha{\l}},
  title =	{{On the Hardness of Compressing Weights}},
  booktitle =	{46th International Symposium on Mathematical Foundations of Computer Science (MFCS 2021)},
  pages =	{64:1--64:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-201-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{202},
  editor =	{Bonchi, Filippo and Puglisi, Simon J.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2021.64},
  URN =		{urn:nbn:de:0030-drops-145049},
  doi =		{10.4230/LIPIcs.MFCS.2021.64},
  annote =	{Keywords: kernelization, compression, edge-weighted clique, constraint satisfaction problems}
}
Document
Fine-Grained Complexity of the List Homomorphism Problem: Feedback Vertex Set and Cutwidth

Authors: Marta Piecyk and Paweł Rzążewski

Published in: LIPIcs, Volume 187, 38th International Symposium on Theoretical Aspects of Computer Science (STACS 2021)


Abstract
For graphs G,H, a homomorphism from G to H is an edge-preserving mapping from V(G) to V(H). In the list homomorphism problem, denoted by LHom(H), we are given a graph G, whose every vertex v is equipped with a list L(v) ⊆ V(H), and we need to determine whether there exists a homomorphism from G to H which additionally respects the lists L. List homomorphisms are a natural generalization of (list) colorings. Very recently, Okrasa, Piecyk, and Rzążewski [ESA 2020] studied the fine-grained complexity of the problem, parameterized by the treewidth of the instance graph G. They defined a new invariant i^*(H), and proved that for every relevant graph H, i.e., such that LHom(H) is NP-hard, this invariant is the correct base of the exponent in the running time of any algorithm solving the LHom(H) problem. In this paper we continue this direction and study the complexity of the problem under different parameterizations. As the first result, we show that i^*(H) is also the right complexity base if the parameter is the size of a minimum feedback vertex set of G, denoted by fvs(G). In particular, for every relevant graph H, the LHom(H) problem
- can be solved in time i^*(H)^{fvs(G)} ⋅ |V(G)|^𝒪(1), if a minimum feedback vertex set of G is given,
- cannot be solved in time (i^*(H) - ε)^{fvs(G)} ⋅ |V(G)|^𝒪(1), for any ε > 0, unless the SETH fails.
Then we turn our attention to a parameterization by the cutwidth ctw(G) of G. Jansen and Nederlof [TCS 2019] showed that List k-Coloring (i.e., LHom(K_k)) can be solved in time c^{ctw(G)} ⋅ |V(G)|^𝒪(1) for an absolute constant c, i.e., the base of the exponential function does not depend on the number of colors. Jansen asked whether this behavior extends to graph homomorphisms. As the main result of the paper, we answer the question in the negative. We define a new graph invariant mim^*(H), closely related to the size of a maximum induced matching in H, and prove that for all relevant graphs H, the LHom(H) problem cannot be solved in time (mim^*(H)-ε)^{ctw(G)} ⋅ |V(G)|^𝒪(1) for any ε > 0, unless the SETH fails. In particular, this implies that, assuming the SETH, there is no constant c such that for every odd cycle the non-list version of the problem can be solved in time c^{ctw(G)} ⋅ |V(G)|^𝒪(1).
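
A brute-force decision procedure for LHom(H), directly following the definition above, may help fix the notation (ours; G and H are networkx graphs, and the search is exponential in |V(G)|, so it is an illustration only):

from itertools import product

def has_list_homomorphism(G, H, lists):
    """Is there a mapping f : V(G) -> V(H) with f(v) in lists[v] for every v,
    such that every edge uv of G is mapped onto an edge f(u)f(v) of H?"""
    vertices = list(G.nodes)
    for choice in product(*(lists[v] for v in vertices)):
        f = dict(zip(vertices, choice))
        if all(H.has_edge(f[u], f[v]) for u, v in G.edges):
            return True
    return False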

Cite as

Marta Piecyk and Paweł Rzążewski. Fine-Grained Complexity of the List Homomorphism Problem: Feedback Vertex Set and Cutwidth. In 38th International Symposium on Theoretical Aspects of Computer Science (STACS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 187, pp. 56:1-56:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{piecyk_et_al:LIPIcs.STACS.2021.56,
  author =	{Piecyk, Marta and Rz\k{a}\.{z}ewski, Pawe{\l}},
  title =	{{Fine-Grained Complexity of the List Homomorphism Problem: Feedback Vertex Set and Cutwidth}},
  booktitle =	{38th International Symposium on Theoretical Aspects of Computer Science (STACS 2021)},
  pages =	{56:1--56:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-180-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{187},
  editor =	{Bl\"{a}ser, Markus and Monmege, Benjamin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2021.56},
  URN =		{urn:nbn:de:0030-drops-137012},
  doi =		{10.4230/LIPIcs.STACS.2021.56},
  annote =	{Keywords: list homomorphisms, fine-grained complexity, SETH, feedback vertex set, cutwidth}
}
Document
Sparsification Lower Bounds for List H-Coloring

Authors: Hubie Chen, Bart M. P. Jansen, Karolina Okrasa, Astrid Pieterse, and Paweł Rzążewski

Published in: LIPIcs, Volume 181, 31st International Symposium on Algorithms and Computation (ISAAC 2020)


Abstract
We investigate the List H-Coloring problem, the generalization of graph coloring that asks whether an input graph G admits a homomorphism to the undirected graph H (possibly with loops), such that each vertex v ∈ V(G) is mapped to a vertex on its list L(v) ⊆ V(H). An important result by Feder, Hell, and Huang [JGT 2003] states that List H-Coloring is polynomial-time solvable if H is a so-called bi-arc graph, and NP-complete otherwise. We investigate the NP-complete cases of the problem from the perspective of polynomial-time sparsification: can an n-vertex instance be efficiently reduced to an equivalent instance of bitsize 𝒪(n^(2-ε)) for some ε > 0? We prove that if H is not a bi-arc graph, then List H-Coloring does not admit such a sparsification algorithm unless NP ⊆ coNP/poly. Our proofs combine techniques from kernelization lower bounds with a study of the structure of graphs H which are not bi-arc graphs.

Cite as

Hubie Chen, Bart M. P. Jansen, Karolina Okrasa, Astrid Pieterse, and Paweł Rzążewski. Sparsification Lower Bounds for List H-Coloring. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 58:1-58:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{chen_et_al:LIPIcs.ISAAC.2020.58,
  author =	{Chen, Hubie and Jansen, Bart M. P. and Okrasa, Karolina and Pieterse, Astrid and Rz\k{a}\.{z}ewski, Pawe{\l}},
  title =	{{Sparsification Lower Bounds for List H-Coloring}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{58:1--58:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.58},
  URN =		{urn:nbn:de:0030-drops-134027},
  doi =		{10.4230/LIPIcs.ISAAC.2020.58},
  annote =	{Keywords: List H-Coloring, Sparsification, Constraint Satisfaction Problem}
}
Document
Optimal Polynomial-Time Compression for Boolean Max CSP

Authors: Bart M. P. Jansen and Michał Włodarczyk

Published in: LIPIcs, Volume 173, 28th Annual European Symposium on Algorithms (ESA 2020)


Abstract
In the Boolean maximum constraint satisfaction problem - Max CSP(Γ) - one is given a collection of weighted applications of constraints from a finite constraint language Γ, over a common set of variables, and the goal is to assign Boolean values to the variables so that the total weight of satisfied constraints is maximized. There exists a concise dichotomy theorem providing a criterion on Γ for the problem to be polynomial-time solvable and stating that otherwise it becomes NP-hard. We study the NP-hard cases through the lens of kernelization and provide a complete characterization of Max CSP(Γ) with respect to the optimal compression size. Namely, we prove that Max CSP(Γ) parameterized by the number of variables n is either polynomial-time solvable, or there exists an integer d ≥ 2 depending on Γ, such that:
1) An instance of Max CSP(Γ) can be compressed into an equivalent instance with 𝒪(n^d log n) bits in polynomial time,
2) Max CSP(Γ) does not admit such a compression to 𝒪(n^{d-ε}) bits unless NP ⊆ coNP/poly.
Our reductions are based on interpreting constraints as multilinear polynomials combined with the framework of constraint implementations. As another application of our reductions, we reveal tight connections between optimal running times for solving Max CSP(Γ). More precisely, we show that obtaining a running time of the form 𝒪(2^{(1-ε)n}) for particular classes of Max CSPs is as hard as breaching this barrier for Max d-SAT for some d.
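
The objective being optimized can be written down in a few lines, which may clarify the problem statement (a brute-force sketch, ours; the instance encoding below is an assumption, not the paper's):

from itertools import product

def max_csp_value(variables, constraints):
    """Optimum of a Boolean Max CSP instance as defined above.  Each constraint
    is a triple (weight, scope, relation): `scope` is a tuple of variables and
    `relation` is the set of satisfying Boolean tuples.  Exponential brute force."""
    best = 0
    for bits in product((False, True), repeat=len(variables)):
        assignment = dict(zip(variables, bits))
        value = sum(weight for weight, scope, relation in constraints
                    if tuple(assignment[x] for x in scope) in relation)
        best = max(best, value)
    return best

# Tiny example over the language {OR}: two weighted binary OR constraints.
OR = {(False, True), (True, False), (True, True)}
print(max_csp_value(["x", "y"], [(3, ("x", "y"), OR), (2, ("y", "x"), OR)]))  # -> 5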

Cite as

Bart M. P. Jansen and Michał Włodarczyk. Optimal Polynomial-Time Compression for Boolean Max CSP. In 28th Annual European Symposium on Algorithms (ESA 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 173, pp. 63:1-63:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{jansen_et_al:LIPIcs.ESA.2020.63,
  author =	{Jansen, Bart M. P. and W{\l}odarczyk, Micha{\l}},
  title =	{{Optimal Polynomial-Time Compression for Boolean Max CSP}},
  booktitle =	{28th Annual European Symposium on Algorithms (ESA 2020)},
  pages =	{63:1--63:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-162-7},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{173},
  editor =	{Grandoni, Fabrizio and Herman, Grzegorz and Sanders, Peter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2020.63},
  URN =		{urn:nbn:de:0030-drops-129297},
  doi =		{10.4230/LIPIcs.ESA.2020.63},
  annote =	{Keywords: constraint satisfaction problem, kernelization, exponential time algorithms}
}
Document
Track A: Algorithms, Complexity and Games
Bridge-Depth Characterizes Which Structural Parameterizations of Vertex Cover Admit a Polynomial Kernel

Authors: Marin Bougeret, Bart M. P. Jansen, and Ignasi Sau

Published in: LIPIcs, Volume 168, 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)


Abstract
We study the kernelization complexity of structural parameterizations of the Vertex Cover problem. Here, the goal is to find a polynomial-time preprocessing algorithm that can reduce any instance (G,k) of the Vertex Cover problem to an equivalent one, whose size is polynomial in the size of a pre-determined complexity parameter of G. A long line of previous research deals with parameterizations based on the number of vertex deletions needed to reduce G to a member of a simple graph class ℱ, such as forests, graphs of bounded tree-depth, and graphs of maximum degree two. We set out to find the most general graph classes ℱ for which Vertex Cover, parameterized by the vertex-deletion distance of the input graph to ℱ, admits a polynomial kernelization. We give a complete characterization of the minor-closed graph families ℱ for which such a kernelization exists. We introduce a new graph parameter called bridge-depth, and prove that a polynomial kernelization exists if and only if ℱ has bounded bridge-depth. The proof is based on an interesting connection between bridge-depth and the size of minimal blocking sets in graphs, which are vertex sets whose removal decreases the independence number.
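
The blocking sets mentioned in the last sentence have a one-line definition that is easy to check by brute force on small networkx graphs (an illustration only; helper names are ours):

from itertools import combinations

def independence_number(G):
    """Exact maximum independent set size by exhaustive search (tiny graphs only)."""
    nodes = list(G.nodes)
    for size in range(len(nodes), -1, -1):
        for S in combinations(nodes, size):
            if all(not G.has_edge(u, v) for u, v in combinations(S, 2)):
                return size

def is_blocking_set(G, B):
    """B is a blocking set if deleting it decreases the independence number of G."""
    H = G.copy()
    H.remove_nodes_from(B)
    return independence_number(H) < independence_number(G)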

Cite as

Marin Bougeret, Bart M. P. Jansen, and Ignasi Sau. Bridge-Depth Characterizes Which Structural Parameterizations of Vertex Cover Admit a Polynomial Kernel. In 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 168, pp. 16:1-16:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{bougeret_et_al:LIPIcs.ICALP.2020.16,
  author =	{Bougeret, Marin and Jansen, Bart M. P. and Sau, Ignasi},
  title =	{{Bridge-Depth Characterizes Which Structural Parameterizations of Vertex Cover Admit a Polynomial Kernel}},
  booktitle =	{47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)},
  pages =	{16:1--16:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-138-2},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{168},
  editor =	{Czumaj, Artur and Dawar, Anuj and Merelli, Emanuela},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2020.16},
  URN =		{urn:nbn:de:0030-drops-124238},
  doi =		{10.4230/LIPIcs.ICALP.2020.16},
  annote =	{Keywords: vertex cover, parameterized complexity, polynomial kernel, structural parameterization, bridge-depth}
}
Document
Preprocessing Vertex-Deletion Problems: Characterizing Graph Properties by Low-Rank Adjacencies

Authors: Bart M. P. Jansen and Jari J. H. de Kroon

Published in: LIPIcs, Volume 162, 17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020)


Abstract
We consider the Π-free Deletion problem parameterized by the size of a vertex cover, for a range of graph properties Π. Given an input graph G, this problem asks whether there is a subset of at most k vertices whose removal ensures the resulting graph does not contain a graph from Π as induced subgraph. Many vertex-deletion problems such as Perfect Deletion, Wheel-free Deletion, and Interval Deletion fit into this framework. We introduce the concept of characterizing a graph property Π by low-rank adjacencies, and use it as the cornerstone of a general kernelization theorem for Π-Free Deletion parameterized by the size of a vertex cover. The resulting framework captures problems such as AT-Free Deletion, Wheel-free Deletion, and Interval Deletion. Moreover, our new framework shows that the vertex-deletion problem to perfect graphs has a polynomial kernel when parameterized by vertex cover, thereby resolving an open question by Fomin et al. [JCSS 2014]. Our main technical contribution shows how linear-algebraic dependence of suitably defined vectors over 𝔽₂ implies graph-theoretic statements about the presence of forbidden induced subgraphs.

Cite as

Bart M. P. Jansen and Jari J. H. de Kroon. Preprocessing Vertex-Deletion Problems: Characterizing Graph Properties by Low-Rank Adjacencies. In 17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 162, pp. 27:1-27:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{jansen_et_al:LIPIcs.SWAT.2020.27,
  author =	{Jansen, Bart M. P. and de Kroon, Jari J. H.},
  title =	{{Preprocessing Vertex-Deletion Problems: Characterizing Graph Properties by Low-Rank Adjacencies}},
  booktitle =	{17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020)},
  pages =	{27:1--27:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-150-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{162},
  editor =	{Albers, Susanne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SWAT.2020.27},
  URN =		{urn:nbn:de:0030-drops-122748},
  doi =		{10.4230/LIPIcs.SWAT.2020.27},
  annote =	{Keywords: kernelization, vertex-deletion, graph modification, structural parameterization}
}
