11 Search Results for "Schepper, Philipp"


Document
List Homomorphisms by Deleting Edges and Vertices: Tight Complexity Bounds for Bounded-Treewidth Graphs

Authors: Barış Can Esmer, Jacob Focke, Dániel Marx, and Paweł Rzążewski

Published in: LIPIcs, Volume 308, 32nd Annual European Symposium on Algorithms (ESA 2024)


Abstract
The goal of this paper is to investigate a family of optimization problems arising from list homomorphisms, and to understand what the best possible algorithms are if we restrict the problem to bounded-treewidth graphs. Given graphs G, H, and lists L(v) ⊆ V(H) for every v ∈ V(G), a list homomorphism from (G,L) to H is a function f:V(G) → V(H) that preserves the edges (i.e., uv ∈ E(G) implies f(u)f(v) ∈ E(H)) and respects the lists (i.e., f(v) ∈ L(v)). The graph H may have loops. For a fixed H, the input of the optimization problem LHomVD(H) is a graph G with lists L(v), and the task is to find a set X of vertices having minimum size such that (G-X,L) has a list homomorphism to H. We define analogously the edge-deletion variant LHomED(H), where we have to delete as few edges as possible from G to obtain a graph that has a list homomorphism. This expressive family of problems includes members that are essentially equivalent to fundamental problems such as Vertex Cover, Max Cut, Odd Cycle Transversal, and Edge/Vertex Multiway Cut. For both variants, we first characterize those graphs H that make the problem polynomial-time solvable and show that the problem is NP-hard for every other fixed H. Second, as our main result, we determine, for every graph H for which the problem is NP-hard, the smallest possible constant c_H such that the problem can be solved in time c_H^t ⋅ n^{𝒪(1)} if a tree decomposition of G having width t is given in the input. Let i(H) be the maximum size of a set of vertices in H that have pairwise incomparable neighborhoods. For the vertex-deletion variant LHomVD(H), we show that the smallest possible constant is i(H)+1 for every H:
- Given a tree decomposition of width t of G, LHomVD(H) can be solved in time (i(H)+1)^t ⋅ n^{𝒪(1)}.
- For any ε > 0 and H, an (i(H)+1-ε)^t ⋅ n^{𝒪(1)} algorithm would violate the Strong Exponential-Time Hypothesis (SETH).
The situation is more complex for the edge-deletion version. For every H, one can solve LHomED(H) in time i(H)^t ⋅ n^{𝒪(1)} if a tree decomposition of width t is given. However, the existence of a specific type of decomposition of H shows that there are graphs H where LHomED(H) can be solved significantly more efficiently and the best possible constant can be arbitrarily smaller than i(H). Nevertheless, we determine this best possible constant and (assuming the SETH) prove tight bounds for every fixed H.
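To make the list-homomorphism definition concrete, here is a minimal Python sketch (illustrative only, not code from the paper): a checker that a mapping respects the lists and preserves the edges, plus a brute-force existence test over all |V(H)|^|V(G)| mappings. All names are made up for the example.

from itertools import product

def is_list_hom(G_edges, H_edges, L, f):
    # f respects the lists: f(v) in L(v) for every vertex v of G
    if any(f[v] not in L[v] for v in L):
        return False
    # f preserves the edges: uv in E(G) implies f(u)f(v) in E(H)
    # (H_edges stores edges as frozensets, so a loop at a is frozenset({a}))
    return all(frozenset({f[u], f[v]}) in H_edges for (u, v) in G_edges)

def has_list_hom(G_vertices, G_edges, H_vertices, H_edges, L):
    # Brute force over all mappings; only sensible for tiny instances.
    for choice in product(H_vertices, repeat=len(G_vertices)):
        f = dict(zip(G_vertices, choice))
        if is_list_hom(G_edges, H_edges, L, f):
            return True
    return False

# Toy example: H = K_2 with full lists makes this 2-colourability of G.
G_vertices = ["a", "b", "c"]
G_edges = {("a", "b"), ("b", "c")}
H_vertices = [0, 1]
H_edges = {frozenset({0, 1})}
L = {v: {0, 1} for v in G_vertices}
print(has_list_hom(G_vertices, G_edges, H_vertices, H_edges, L))  # True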

Cite as

Barış Can Esmer, Jacob Focke, Dániel Marx, and Paweł Rzążewski. List Homomorphisms by Deleting Edges and Vertices: Tight Complexity Bounds for Bounded-Treewidth Graphs. In 32nd Annual European Symposium on Algorithms (ESA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 308, pp. 39:1-39:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{canesmer_et_al:LIPIcs.ESA.2024.39,
  author =	{Can Esmer, Bar{\i}\c{s} and Focke, Jacob and Marx, D\'{a}niel and Rz\k{a}\.{z}ewski, Pawe{\l}},
  title =	{{List Homomorphisms by Deleting Edges and Vertices: Tight Complexity Bounds for Bounded-Treewidth Graphs}},
  booktitle =	{32nd Annual European Symposium on Algorithms (ESA 2024)},
  pages =	{39:1--39:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-338-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{308},
  editor =	{Chan, Timothy and Fischer, Johannes and Iacono, John and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2024.39},
  URN =		{urn:nbn:de:0030-drops-211103},
  doi =		{10.4230/LIPIcs.ESA.2024.39},
  annote =	{Keywords: Graph Homomorphism, List Homomorphism, Vertex Deletion, Edge Deletion, Multiway Cut, Parameterized Complexity, Tight Bounds, Treewidth, SETH}
}
Document
Hitting Meets Packing: How Hard Can It Be?

Authors: Jacob Focke, Fabian Frei, Shaohua Li, Dániel Marx, Philipp Schepper, Roohani Sharma, and Karol Węgrzycki

Published in: LIPIcs, Volume 308, 32nd Annual European Symposium on Algorithms (ESA 2024)


Abstract
We study a general family of problems that form a common generalization of classic hitting (also referred to as covering or transversal) and packing problems. An instance of 𝒳-HitPack asks: Can removing k (deletable) vertices of a graph G prevent us from packing 𝓁 vertex-disjoint objects of type 𝒳? This problem captures a spectrum of problems with standard hitting and packing on opposite ends. Our main motivating question is whether the combination 𝒳-HitPack can be significantly harder than these two base problems. Already for one particular choice of 𝒳, this question can be posed for many different complexity notions, leading to a large, so-far unexplored domain at the intersection of the areas of hitting and packing problems. At a high level, we present two case studies: (1) 𝒳 being all cycles, and (2) 𝒳 being all copies of a fixed graph H. In each, we explore the classical complexity as well as the parameterized complexity with the natural parameters k+𝓁 and treewidth. We observe that the combined problem can be drastically harder than the base problems: for cycles or for H being a connected graph on at least 3 vertices, the problem is Σ₂^𝖯-complete and requires double-exponential dependence on the treewidth of the graph (assuming the Exponential-Time Hypothesis). In contrast, the combined problem admits qualitatively similar running times as the base problems in some cases, although significant novel ideas are required. For 𝒳 being all cycles, we establish a 2^{poly(k+𝓁)}⋅ n^{𝒪(1)} algorithm using an involved branching method, for example. Also, for 𝒳 being all edges (i.e., H = K₂; this combines Vertex Cover and Maximum Matching) the problem can be solved in time 2^{poly(tw)}⋅ n^{𝒪(1)} on graphs of treewidth tw. The key step enabling this running time relies on a combinatorial bound obtained from an algebraic (linear delta-matroid) representation of possible matchings.
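As an illustration of the problem statement for the smallest non-trivial case 𝒳 = all edges (H = K₂), here is a hedged brute-force Python sketch (exponential time, purely to clarify the definition; it is not the paper's algorithm): deleting at most k vertices should leave no matching of size 𝓁.

from itertools import combinations

def has_matching_of_size(edges, l):
    # Brute-force check for l pairwise vertex-disjoint edges.
    if l == 0:
        return True
    for i, (u, v) in enumerate(edges):
        rest = [e for e in edges[i + 1:] if u not in e and v not in e]
        if has_matching_of_size(rest, l - 1):
            return True
    return False

def k2_hitpack(vertices, edges, k, l):
    # Can deleting at most k vertices destroy every packing of l disjoint edges?
    for size in range(k + 1):
        for X in combinations(vertices, size):
            Xs = set(X)
            remaining = [(u, v) for (u, v) in edges if u not in Xs and v not in Xs]
            if not has_matching_of_size(remaining, l):
                return True
    return False

# P_4 on 1-2-3-4: deleting the single vertex 2 kills every matching of size 2.
print(k2_hitpack([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4)], k=1, l=2))  # True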

Cite as

Jacob Focke, Fabian Frei, Shaohua Li, Dániel Marx, Philipp Schepper, Roohani Sharma, and Karol Węgrzycki. Hitting Meets Packing: How Hard Can It Be?. In 32nd Annual European Symposium on Algorithms (ESA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 308, pp. 55:1-55:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{focke_et_al:LIPIcs.ESA.2024.55,
  author =	{Focke, Jacob and Frei, Fabian and Li, Shaohua and Marx, D\'{a}niel and Schepper, Philipp and Sharma, Roohani and W\k{e}grzycki, Karol},
  title =	{{Hitting Meets Packing: How Hard Can It Be?}},
  booktitle =	{32nd Annual European Symposium on Algorithms (ESA 2024)},
  pages =	{55:1--55:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-338-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{308},
  editor =	{Chan, Timothy and Fischer, Johannes and Iacono, John and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2024.55},
  URN =		{urn:nbn:de:0030-drops-211261},
  doi =		{10.4230/LIPIcs.ESA.2024.55},
  annote =	{Keywords: Hitting, Packing, Covering, Parameterized Algorithms, Lower Bounds, Treewidth}
}
Document
Invited Talk
Fine-Grained Complexity of Program Analysis (Invited Talk)

Authors: Rupak Majumdar

Published in: LIPIcs, Volume 306, 49th International Symposium on Mathematical Foundations of Computer Science (MFCS 2024)


Abstract
There is a well-known "cubic bottleneck" in program analysis and language theory: many program analysis problems can be solved in time cubic in the size of the input but, despite years of effort, there are no known sub-cubic algorithms. For example, context-free reachability (whether there is a path in a labeled graph that is labeled with a word from a context-free language), the emptiness problem for pushdown automata, and the recognition problem for two-way nondeterministic pushdown automata all belong to the cubic class. We survey the status of these problems through the lens of fine-grained complexity. We study the related certification task: given an instance of any of these problems, are there small and efficiently checkable certificates for the existence and for the non-existence of a path? We show that, in both scenarios, there exist succinct certificates (O(n²) in the size of the problem) and these certificates can be checked in subcubic (matrix multiplication) time. Thus, all these problems lie in nondeterministic and co-nondeterministic subcubic time. We also study a hierarchy of program analysis problems above the cubic bottleneck. A representative problem here is the recognition problem for two-way nondeterministic pushdown automata with k heads. We show fine-grained hardness results for this hierarchy. We also discuss purely language-theoretic consequences of these results: for example, we obtain hardest languages accepted by two-way nondeterministic multihead pushdown automata, as well as separations between language classes. (Joint work with A. R. Balasubramanian, Dmitry Chistikov, and Philipp Schepper.)
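A generic, textbook-style sketch of the worklist algorithm for context-free reachability (not code from the talk) may help place the "cubic bottleneck": the grammar is assumed to be given with unary (A → a) and binary (A → BC) rules, and the endpoint indices keep the running time roughly cubic in the number of vertices for a fixed grammar.

from collections import defaultdict, deque

def cfl_reachability(edges, unary, binary, start, source, target):
    # edges: iterable of (u, label, v); unary: label -> set of nonterminals A with A -> label;
    # binary: (B, C) -> set of nonterminals A with A -> B C; start: start nonterminal.
    derived = set()          # facts (u, A, v): A derives the label of some u -> v path
    out = defaultdict(set)   # out[u] = {(A, v) : (u, A, v) derived}
    inc = defaultdict(set)   # inc[v] = {(A, u) : (u, A, v) derived}
    work = deque()

    def add(u, A, v):
        if (u, A, v) not in derived:
            derived.add((u, A, v))
            out[u].add((A, v))
            inc[v].add((A, u))
            work.append((u, A, v))

    for (u, a, v) in edges:
        for A in unary.get(a, ()):
            add(u, A, v)

    while work:
        u, B, v = work.popleft()
        for (C, w) in list(out[v]):      # combine (u, B, v) with (v, C, w)
            for A in binary.get((B, C), ()):
                add(u, A, w)
        for (C, w) in list(inc[u]):      # combine (w, C, u) with (u, B, v)
            for A in binary.get((C, B), ()):
                add(w, A, v)

    return (source, start, target) in derived

# Dyck-1 reachability on the path 0 -( 1 -( 2 -) 3 -) 4 with S -> OC | OM, M -> SC, O -> '(', C -> ')'.
edges = [(0, "(", 1), (1, "(", 2), (2, ")", 3), (3, ")", 4)]
unary = {"(": {"O"}, ")": {"C"}}
binary = {("O", "C"): {"S"}, ("O", "M"): {"S"}, ("S", "C"): {"M"}}
print(cfl_reachability(edges, unary, binary, "S", 0, 4))  # True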

Cite as

Rupak Majumdar. Fine-Grained Complexity of Program Analysis (Invited Talk). In 49th International Symposium on Mathematical Foundations of Computer Science (MFCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 306, p. 5:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{majumdar:LIPIcs.MFCS.2024.5,
  author =	{Majumdar, Rupak},
  title =	{{Fine-Grained Complexity of Program Analysis}},
  booktitle =	{49th International Symposium on Mathematical Foundations of Computer Science (MFCS 2024)},
  pages =	{5:1--5:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-335-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{306},
  editor =	{Kr\'{a}lovi\v{c}, Rastislav and Ku\v{c}era, Anton{\'\i}n},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2024.5},
  URN =		{urn:nbn:de:0030-drops-205619},
  doi =		{10.4230/LIPIcs.MFCS.2024.5},
  annote =	{Keywords: Fine-grained complexity, CFL reachability, 2NPDA recognition, PDA emptiness}
}
Document
Track A: Algorithms, Complexity and Games
A Tight Monte-Carlo Algorithm for Steiner Tree Parameterized by Clique-Width

Authors: Narek Bojikian and Stefan Kratsch

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
Given a graph G = (V,E), a set T ⊆ V, and an integer b, the Steiner Tree problem asks whether G has a connected subgraph H with at most b vertices that spans all of T. This work presents a 3^k⋅ n^𝒪(1) time one-sided Monte-Carlo algorithm for solving Steiner Tree when additionally a clique-expression of width k is provided. Known lower bounds for less expressive parameters imply that this dependence on the clique-width of G is optimal assuming the Strong Exponential-Time Hypothesis (SETH). Indeed our work establishes that the parameter dependence of Steiner Tree is the same for any graph parameter between cutwidth and clique-width, assuming SETH. Our work contributes to the program of determining the exact parameterized complexity of fundamental hard problems relative to structural graph parameters such as treewidth, which was initiated by Lokshtanov et al. [SODA 2011 & TALG 2018] and which by now has seen a plethora of results. Since the cut-and-count framework of Cygan et al. [FOCS 2011 & TALG 2022], connectivity problems have played a key role in this program as they pose many challenges for developing tight upper and lower bounds. Recently, Hegerfeld and Kratsch [ESA 2023] gave the first application of the cut-and-count technique to problems parameterized by clique-width and obtained tight bounds for Connected Dominating Set and Connected Vertex Cover, leaving open the complexity of other benchmark connectivity problems such as Steiner Tree and Feedback Vertex Set. Our algorithm for Steiner Tree does not follow the cut-and-count technique and instead works with the connectivity patterns of partial solutions. As a first technical contribution we identify a special family of so-called complete patterns that has strong (existential) representation properties, and using these at least one solution will be preserved. Furthermore, there is a family of 3^k basis patterns that (parity) represents the complete patterns, i.e., it has the same number of solutions modulo two. Our main technical contribution, a new technique called "isolating a representative," allows us to leverage both forms of representation (existential and parity). Both complete patterns and isolation of a representative will likely be applicable to other (connectivity) problems.
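For reference, a brute-force Python sketch of the Steiner Tree definition used above (a connected subgraph on at most b vertices spanning all terminals); it enumerates vertex subsets and is exponential, unlike the 3^k ⋅ n^𝒪(1) algorithm of the paper. Names are illustrative.

from itertools import combinations

def steiner_tree_exists(vertices, edges, T, b):
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    def connected(S):
        # BFS/DFS restricted to the vertex set S.
        S = set(S)
        if not S:
            return True
        stack, seen = [next(iter(S))], set()
        while stack:
            x = stack.pop()
            if x in seen:
                continue
            seen.add(x)
            stack.extend(adj[x] & S - seen)
        return seen == S

    T = set(T)
    if len(T) > b:
        return False
    others = [v for v in vertices if v not in T]
    for extra in range(0, b - len(T) + 1):
        for S in combinations(others, extra):
            if connected(T | set(S)):
                return True
    return False

# Star K_{1,3} with the three leaves as terminals: the centre is needed, so b = 4 works, b = 3 does not.
V, E = [0, 1, 2, 3], [(0, 1), (0, 2), (0, 3)]
print(steiner_tree_exists(V, E, T={1, 2, 3}, b=4), steiner_tree_exists(V, E, T={1, 2, 3}, b=3))  # True False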

Cite as

Narek Bojikian and Stefan Kratsch. A Tight Monte-Carlo Algorithm for Steiner Tree Parameterized by Clique-Width. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 29:1-29:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{bojikian_et_al:LIPIcs.ICALP.2024.29,
  author =	{Bojikian, Narek and Kratsch, Stefan},
  title =	{{A Tight Monte-Carlo Algorithm for Steiner Tree Parameterized by Clique-Width}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{29:1--29:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.29},
  URN =		{urn:nbn:de:0030-drops-201728},
  doi =		{10.4230/LIPIcs.ICALP.2024.29},
  annote =	{Keywords: Parameterized complexity, Steiner tree, clique-width}
}
Document
Track A: Algorithms, Complexity and Games
Fundamental Problems on Bounded-Treewidth Graphs: The Real Source of Hardness

Authors: Barış Can Esmer, Jacob Focke, Dániel Marx, and Paweł Rzążewski

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
It is known for many algorithmic problems that if a tree decomposition of width t is given in the input, then the problem can be solved with exponential dependence on t. A line of research initiated by Lokshtanov, Marx, and Saurabh [SODA 2011] produced lower bounds showing that in many cases known algorithms already achieve the best possible exponential dependence on t, assuming the Strong Exponential-Time Hypothesis (SETH). The main message of this paper is showing that the same lower bounds can already be obtained in a much more restricted setting: informally, a graph consisting of a block of t vertices connected to components of constant size already has the same hardness as a general tree decomposition of width t. Formally, a (σ,δ)-hub is a set Q of vertices such that every component of G-Q has size at most σ and is adjacent to at most δ vertices of Q. We explore if the known tight lower bounds parameterized by the width of the given tree decomposition remain valid if we parameterize by the size of the given hub.
- For every ε > 0, there are σ,δ > 0 such that Independent Set (equivalently Vertex Cover) cannot be solved in time (2-ε)^p⋅ n, even if a (σ, δ)-hub of size p is given in the input, assuming the SETH. This matches the earlier tight lower bounds parameterized by width of the tree decomposition. Similar tight bounds are obtained for Odd Cycle Transversal, Max Cut, q-Coloring, and edge/vertex deletion versions of q-Coloring.
- For every ε > 0, there are σ,δ > 0 such that △-Partition cannot be solved in time (2-ε)^p ⋅ n, even if a (σ, δ)-hub of size p is given in the input, assuming the Set Cover Conjecture (SCC). In fact, we prove that this statement is equivalent to the SCC, thus it is unlikely that this could be proved assuming the SETH.
- For Dominating Set, we can prove a non-tight lower bound ruling out (2-ε)^p ⋅ n^𝒪(1) algorithms, assuming either the SETH or the SCC, but this does not match the 3^p⋅ n^{𝒪(1)} upper bound.
Thus our results reveal that, for many problems, the research on lower bounds on the dependence on treewidth was never really about tree decompositions, but the real source of hardness comes from a much simpler structure. Additionally, we study if the same lower bounds can be obtained if σ and δ are fixed universal constants (not depending on ε). We show that lower bounds of this form are possible for Max Cut and the edge-deletion version of q-Coloring, under the Max 3-Sat Hypothesis (M3SH). However, no such lower bounds are possible for Independent Set, Odd Cycle Transversal, and the vertex-deletion version of q-Coloring: better-than-brute-force algorithms are possible for every fixed (σ,δ).
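The (σ,δ)-hub definition is easy to check directly; the following Python sketch (illustrative, not from the paper) verifies that every component of G-Q has at most σ vertices and at most δ neighbours in Q.

def is_hub(vertices, edges, Q, sigma, delta):
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    Q = set(Q)
    seen = set()
    for start in set(vertices) - Q:
        if start in seen:
            continue
        # Grow one connected component of G - Q.
        comp, stack = set(), [start]
        while stack:
            x = stack.pop()
            if x in comp:
                continue
            comp.add(x)
            stack.extend(adj[x] - Q - comp)
        seen |= comp
        neighbours_in_Q = set().union(*(adj[x] & Q for x in comp))
        if len(comp) > sigma or len(neighbours_in_Q) > delta:
            return False
    return True

# A star K_{1,4}: the centre alone is a (1, 1)-hub of size 1.
star = [(0, i) for i in range(1, 5)]
print(is_hub(range(5), star, Q={0}, sigma=1, delta=1))  # True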

Cite as

Barış Can Esmer, Jacob Focke, Dániel Marx, and Paweł Rzążewski. Fundamental Problems on Bounded-Treewidth Graphs: The Real Source of Hardness. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 34:1-34:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{canesmer_et_al:LIPIcs.ICALP.2024.34,
  author =	{Can Esmer, Bar{\i}\c{s} and Focke, Jacob and Marx, D\'{a}niel and Rz\k{a}\.{z}ewski, Pawe{\l}},
  title =	{{Fundamental Problems on Bounded-Treewidth Graphs: The Real Source of Hardness}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{34:1--34:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.34},
  URN =		{urn:nbn:de:0030-drops-201772},
  doi =		{10.4230/LIPIcs.ICALP.2024.34},
  annote =	{Keywords: Parameterized Complexity, Tight Bounds, Hub, Treewidth, Strong Exponential Time Hypothesis, Vertex Coloring, Vertex Deletion, Edge Deletion, Triangle Packing, Triangle Partition, Set Cover Hypothesis, Dominating Set}
}
Document
Track A: Algorithms, Complexity and Games
Towards Tight Bounds for the Graph Homomorphism Problem Parameterized by Cutwidth via Asymptotic Matrix Parameters

Authors: Carla Groenland, Isja Mannens, Jesper Nederlof, Marta Piecyk, and Paweł Rzążewski

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
A homomorphism from a graph G to a graph H is an edge-preserving mapping from V(G) to V(H). In the graph homomorphism problem, denoted by Hom(H), the graph H is fixed and we need to determine if there exists a homomorphism from an instance graph G to H. We study the complexity of the problem parameterized by the cutwidth of G, i.e., we assume that G is given along with a linear ordering v_1,…,v_n of V(G) such that, for each i ∈ {1,…,n-1}, the number of edges with one endpoint in {v_1,…,v_i} and the other in {v_{i+1},…,v_n} is at most k. We aim, for each H, for algorithms for Hom(H) running in time c_H^k n^𝒪(1) and matching lower bounds that exclude c_H^{k⋅o(1)} n^𝒪(1) or c_H^{k(1-Ω(1))} n^𝒪(1) time algorithms under the (Strong) Exponential Time Hypothesis. In the paper we introduce a new parameter that we call mimsup(H). Our main contribution is strong evidence of a close connection between c_H and mimsup(H):
- an information-theoretic argument that the number of states needed in a natural dynamic programming algorithm is at most mimsup(H)^k,
- lower bounds that show that for almost all graphs H indeed we have c_H ≥ mimsup(H), assuming the (Strong) Exponential-Time Hypothesis, and
- an algorithm with running time exp(𝒪(mimsup(H)⋅k log k)) n^𝒪(1).
In the last result we do not need to assume that H is a fixed graph. Thus, as a consequence, we obtain that the problem of deciding whether G admits a homomorphism to H is fixed-parameter tractable, when parameterized by cutwidth of G and mimsup(H). The parameter mimsup(H) can be thought of as the p-th root of the maximum induced matching number in the graph obtained by multiplying p copies of H via a certain graph product, where p tends to infinity. It can also be defined as an asymptotic rank parameter of the adjacency matrix of H. Such parameters play a central role in, among others, algebraic complexity theory and additive combinatorics. Our results tightly link the parameterized complexity of a problem to such an asymptotic matrix parameter for the first time.
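To illustrate the cutwidth parameter used above, here is a small Python sketch (not from the paper) that computes the width of a given linear ordering: the maximum number of edges crossing any of the cuts between consecutive positions.

def cutwidth_of_ordering(order, edges):
    # Width of the ordering: max over i of the number of edges with one endpoint
    # among the first i+1 vertices and the other among the remaining vertices.
    pos = {v: i for i, v in enumerate(order)}
    width = 0
    for i in range(len(order) - 1):
        crossing = sum(1 for u, v in edges if min(pos[u], pos[v]) <= i < max(pos[u], pos[v]))
        width = max(width, crossing)
    return width

# For the path a-b-c-d: the natural ordering has width 1, a shuffled one has width 3.
path = [("a", "b"), ("b", "c"), ("c", "d")]
print(cutwidth_of_ordering(["a", "b", "c", "d"], path),
      cutwidth_of_ordering(["a", "c", "b", "d"], path))  # 1 3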

Cite as

Carla Groenland, Isja Mannens, Jesper Nederlof, Marta Piecyk, and Paweł Rzążewski. Towards Tight Bounds for the Graph Homomorphism Problem Parameterized by Cutwidth via Asymptotic Matrix Parameters. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 77:1-77:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{groenland_et_al:LIPIcs.ICALP.2024.77,
  author =	{Groenland, Carla and Mannens, Isja and Nederlof, Jesper and Piecyk, Marta and Rz\k{a}\.{z}ewski, Pawe{\l}},
  title =	{{Towards Tight Bounds for the Graph Homomorphism Problem Parameterized by Cutwidth via Asymptotic Matrix Parameters}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{77:1--77:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.77},
  URN =		{urn:nbn:de:0030-drops-202208},
  doi =		{10.4230/LIPIcs.ICALP.2024.77},
  annote =	{Keywords: graph homomorphism, cutwidth, asymptotic matrix parameters}
}
Document
Computing Generalized Convolutions Faster Than Brute Force

Authors: Barış Can Esmer, Ariel Kulik, Dániel Marx, Philipp Schepper, and Karol Węgrzycki

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
In this paper, we consider a general notion of convolution. Let D be a finite domain and let Dⁿ be the set of n-length vectors (tuples) over D. Let f : D × D → D be a function and let ⊕_f be the coordinate-wise application of f. The f-Convolution of two functions g,h : Dⁿ → {-M,…,M} is (g ⊛_f h)(𝐯) := ∑_{𝐯_g,𝐯_h ∈ Dⁿ s.t. 𝐯 = 𝐯_g ⊕_f 𝐯_h} g(𝐯_g) ⋅ h(𝐯_h) for every 𝐯 ∈ Dⁿ. This problem generalizes many fundamental convolutions such as Subset Convolution, XOR Product, Covering Product, and Packing Product. For an arbitrary function f and domain D we can compute the f-Convolution via brute-force enumeration in 𝒪̃(|D|^{2n} ⋅ polylog(M)) time. Our main result is an improvement over this naive algorithm. We show that f-Convolution can be computed exactly in 𝒪̃((c ⋅ |D|²)ⁿ ⋅ polylog(M)) time for the constant c := 5/6 when D has even cardinality. Our main observation is that a cyclic partition of a function f : D × D → D can be used to speed up the computation of f-Convolution, and we show that an appropriate cyclic partition exists for every f. Furthermore, we demonstrate that a single entry of the f-Convolution can be computed more efficiently. In this variant, we are given two functions g,h : Dⁿ → {-M,…,M} along with a vector 𝐯 ∈ Dⁿ, and the task of the f-Query problem is to compute the integer (g ⊛_f h)(𝐯). This is a generalization of the well-known Orthogonal Vectors problem. We show that f-Query can be computed in 𝒪̃(|D|^{(ω/2)n} ⋅ polylog(M)) time, where ω ∈ [2,2.373) is the exponent of the currently fastest matrix multiplication algorithm.
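The following Python sketch (illustrative only) implements the naive 𝒪̃(|D|^{2n}) baseline for f-Convolution that the paper improves on; with D = {0,1} and f = XOR it reduces to the XOR product.

from itertools import product

def f_convolution(D, n, f, g, h):
    # Brute force over all pairs of vectors: |D|^{2n} arithmetic operations.
    # g and h are dicts mapping n-tuples over D to integers.
    result = {v: 0 for v in product(D, repeat=n)}
    for vg in product(D, repeat=n):
        for vh in product(D, repeat=n):
            v = tuple(f(a, b) for a, b in zip(vg, vh))  # coordinate-wise application of f
            result[v] += g[vg] * h[vh]
    return result

# Example with D = {0, 1} and f = XOR.
D, n = (0, 1), 2
g = {v: 1 for v in product(D, repeat=n)}
h = {v: 1 for v in product(D, repeat=n)}
conv = f_convolution(D, n, lambda a, b: a ^ b, g, h)
print(conv[(0, 0)])  # 4: with g = h = 1, every target vector is hit by |D|^n pairs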

Cite as

Barış Can Esmer, Ariel Kulik, Dániel Marx, Philipp Schepper, and Karol Węgrzycki. Computing Generalized Convolutions Faster Than Brute Force. In 17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 12:1-12:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{esmer_et_al:LIPIcs.IPEC.2022.12,
  author =	{Esmer, Bar{\i}\c{s} Can and Kulik, Ariel and Marx, D\'{a}niel and Schepper, Philipp and W\k{e}grzycki, Karol},
  title =	{{Computing Generalized Convolutions Faster Than Brute Force}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{12:1--12:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.12},
  URN =		{urn:nbn:de:0030-drops-173685},
  doi =		{10.4230/LIPIcs.IPEC.2022.12},
  annote =	{Keywords: Generalized Convolution, Fast Fourier Transform, Fast Subset Convolution}
}
Document
Domination and Cut Problems on Chordal Graphs with Bounded Leafage

Authors: Esther Galby, Dániel Marx, Philipp Schepper, Roohani Sharma, and Prafullkumar Tale

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
The leafage of a chordal graph G is the minimum integer 𝓁 such that G can be realized as an intersection graph of subtrees of a tree with 𝓁 leaves. We consider structural parameterization by the leafage of classical domination and cut problems on chordal graphs. Fomin, Golovach, and Raymond [ESA 2018, Algorithmica 2020] proved, among other things, that Dominating Set on chordal graphs admits an algorithm running in time 2^𝒪(𝓁²) ⋅ n^𝒪(1). We present a conceptually much simpler algorithm that runs in time 2^𝒪(𝓁) ⋅ n^𝒪(1). We extend our approach to obtain similar results for Connected Dominating Set and Steiner Tree. We then consider the two classical cut problems MultiCut with Undeletable Terminals and Multiway Cut with Undeletable Terminals. We prove that the former is W[1]-hard when parameterized by the leafage and complement this result by presenting a simple n^𝒪(𝓁)-time algorithm. To our surprise, we find that Multiway Cut with Undeletable Terminals on chordal graphs can be solved, in contrast, in n^O(1)-time.
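To illustrate the leafage definition, here is a small Python sketch (not from the paper) that checks whether a given family of subtree models realizes G as an intersection graph of subtrees of a host tree and reports how many leaves that tree has; for brevity it assumes, without re-checking, that each model induces a connected subtree.

from itertools import combinations

def realizes_with_leafage(G_vertices, G_edges, tree_edges, models):
    # models: dict mapping each vertex of G to the set of host-tree nodes of its subtree.
    E = {frozenset(e) for e in G_edges}
    for u, v in combinations(G_vertices, 2):
        # uv must be an edge of G exactly when the two subtrees intersect.
        if (frozenset({u, v}) in E) != bool(models[u] & models[v]):
            return False, None
    degree = {}
    for a, b in tree_edges:
        degree[a] = degree.get(a, 0) + 1
        degree[b] = degree.get(b, 0) + 1
    return True, sum(1 for d in degree.values() if d == 1)

# A path a-b-c realized on the host path 1-2 (an interval representation, so 2 leaves).
ok, leaves = realizes_with_leafage(["a", "b", "c"], [("a", "b"), ("b", "c")], [(1, 2)],
                                   {"a": {1}, "b": {1, 2}, "c": {2}})
print(ok, leaves)  # True 2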

Cite as

Esther Galby, Dániel Marx, Philipp Schepper, Roohani Sharma, and Prafullkumar Tale. Domination and Cut Problems on Chordal Graphs with Bounded Leafage. In 17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 14:1-14:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{galby_et_al:LIPIcs.IPEC.2022.14,
  author =	{Galby, Esther and Marx, D\'{a}niel and Schepper, Philipp and Sharma, Roohani and Tale, Prafullkumar},
  title =	{{Domination and Cut Problems on Chordal Graphs with Bounded Leafage}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{14:1--14:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.14},
  URN =		{urn:nbn:de:0030-drops-173704},
  doi =		{10.4230/LIPIcs.IPEC.2022.14},
  annote =	{Keywords: Chordal Graphs, Leafage, FPT Algorithms, Dominating Set, MultiCut with Undeletable Terminals, Multiway Cut with Undeletable Terminals}
}
Document
Anti-Factor Is FPT Parameterized by Treewidth and List Size (But Counting Is Hard)

Authors: Dániel Marx, Govind S. Sankar, and Philipp Schepper

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
In the general AntiFactor problem, a graph G and, for every vertex v of G, a set X_v ⊆ ℕ of forbidden degrees are given. The task is to find a set S of edges such that, for every vertex v, the degree of v in S is not in the set X_v. Standard techniques (dynamic programming plus fast convolution) can be used to show that if M is the largest forbidden degree, then the problem can be solved in time (M+2)^{tw}⋅n^{O(1)} if a tree decomposition of width tw is given. However, significantly faster algorithms are possible if the sets X_v are sparse: our main algorithmic result shows that if every vertex has at most x forbidden degrees (we call this special case AntiFactor_x), then the problem can be solved in time (x+1)^{O(tw)}⋅n^{O(1)}. That is, AntiFactor_x is fixed-parameter tractable parameterized by treewidth tw and the maximum number x of excluded degrees. Our algorithm uses the technique of representative sets, which can be generalized to the optimization version, but (as expected) not to the counting version of the problem. In fact, we show that #AntiFactor₁ is already #W[1]-hard parameterized by the width of the given decomposition. Moreover, we show that, unlike for the decision version, the standard dynamic programming algorithm is essentially optimal for the counting version. Formally, for a fixed nonempty set X, we denote by X-AntiFactor the special case where every vertex v has the same set X_v = X of forbidden degrees. We show the following lower bound for every fixed set X: if there is an ε > 0 such that #X-AntiFactor can be solved in time (max X+2-ε)^{tw}⋅n^{O(1)} given a tree decomposition of width tw, then the Counting Strong Exponential-Time Hypothesis (#SETH) fails.
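A brute-force Python sketch of the AntiFactor definition (illustrative only, 2^|E| time, not the paper's algorithm): it searches for an edge set S whose degrees avoid every forbidden set X_v.

from itertools import chain, combinations

def antifactor_exists(vertices, edges, X):
    # Enumerate every edge subset S and check that no vertex gets a forbidden degree.
    edges = list(edges)
    all_subsets = chain.from_iterable(combinations(edges, r) for r in range(len(edges) + 1))
    for S in all_subsets:
        deg = {v: 0 for v in vertices}
        for u, v in S:
            deg[u] += 1
            deg[v] += 1
        if all(deg[v] not in X[v] for v in vertices):
            return True
    return False

# Triangle where every vertex forbids degree 1: both the empty set and the full triangle work.
tri = [(1, 2), (2, 3), (1, 3)]
print(antifactor_exists([1, 2, 3], tri, {1: {1}, 2: {1}, 3: {1}}))  # True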

Cite as

Dániel Marx, Govind S. Sankar, and Philipp Schepper. Anti-Factor Is FPT Parameterized by Treewidth and List Size (But Counting Is Hard). In 17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 22:1-22:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{marx_et_al:LIPIcs.IPEC.2022.22,
  author =	{Marx, D\'{a}niel and Sankar, Govind S. and Schepper, Philipp},
  title =	{{Anti-Factor Is FPT Parameterized by Treewidth and List Size (But Counting Is Hard)}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{22:1--22:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.22},
  URN =		{urn:nbn:de:0030-drops-173780},
  doi =		{10.4230/LIPIcs.IPEC.2022.22},
  annote =	{Keywords: Anti-Factor, General Factor, Treewidth, Representative Sets, SETH}
}
Document
Track A: Algorithms, Complexity and Games
Degrees and Gaps: Tight Complexity Results of General Factor Problems Parameterized by Treewidth and Cutwidth

Authors: Dániel Marx, Govind S. Sankar, and Philipp Schepper

Published in: LIPIcs, Volume 198, 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)


Abstract
In the General Factor problem, we are given an undirected graph G and for each vertex v ∈ V(G) a finite set B_v of non-negative integers. The task is to decide if there is a subset S ⊆ E(G) such that deg_S(v) ∈ B_v for all vertices v of G. Define the max-gap of a finite integer set B to be the largest d ≥ 0 such that there is an a ≥ 0 with [a,a+d+1] ∩ B = {a,a+d+1}. Cornuéjols showed in 1988 that if the max-gap of all sets B_v is at most 1, then the decision version of General Factor is polynomial-time solvable. This result was extended in 2018 by Dudycz and Paluch to the optimization (i.e. minimization and maximization) versions. We present a general algorithm counting the number of solutions of a certain size in time (M+1)^{tw}⋅ n^{𝒪(1)}, given a tree decomposition of width tw, where M is the maximum integer over all B_v. By using convolution techniques from van Rooij (2020), we improve upon the previous (M+1)^{3tw}⋅ n^{𝒪(1)} time algorithm by Arulselvan et al. from 2018. We prove that this algorithm is essentially optimal for all cases that are not trivial or polynomial-time solvable for the decision, minimization or maximization versions. Our lower bounds show that such an improvement is not even possible for B-Factor, which is General Factor on graphs where all sets B_v agree with the fixed set B. We show that for every fixed B where the problem is NP-hard, our (max B+1)^{tw}⋅ n^{𝒪(1)} algorithm cannot be significantly improved: assuming the Strong Exponential Time Hypothesis (SETH), no algorithm can solve B-Factor in time (max B+1-ε)^{tw}⋅ n^{𝒪(1)} for any ε > 0. We extend this bound to the counting version of B-Factor for arbitrary, non-trivial sets B, assuming #SETH. We also investigate the parameterization of the problem by cutwidth. Unlike for treewidth, having a larger set B does not appear to make the problem harder: we give a 2^{cutw}⋅ n^{𝒪(1)} algorithm for any B and provide a matching lower bound showing that this is optimal for the NP-hard cases.
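The max-gap of a set is easy to compute from its sorted elements; the following Python sketch (illustrative) returns it, so that, e.g., Cornuéjols' polynomial cases are exactly those where every B_v has max-gap at most 1.

def max_gap(B):
    # Largest d >= 0 such that some a and a+d+1 both lie in B with no element of B
    # strictly between them; for sets with fewer than two elements we return 0 by convention.
    b = sorted(B)
    if len(b) < 2:
        return 0
    return max(nxt - cur - 1 for cur, nxt in zip(b, b[1:]))

# {0, 1, 2} has max-gap 0 (easy case); {1, 4} has max-gap 2.
print(max_gap({0, 1, 2}), max_gap({1, 4}))  # 0 2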

Cite as

Dániel Marx, Govind S. Sankar, and Philipp Schepper. Degrees and Gaps: Tight Complexity Results of General Factor Problems Parameterized by Treewidth and Cutwidth. In 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 198, pp. 95:1-95:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{marx_et_al:LIPIcs.ICALP.2021.95,
  author =	{Marx, D\'{a}niel and Sankar, Govind S. and Schepper, Philipp},
  title =	{{Degrees and Gaps: Tight Complexity Results of General Factor Problems Parameterized by Treewidth and Cutwidth}},
  booktitle =	{48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)},
  pages =	{95:1--95:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-195-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{198},
  editor =	{Bansal, Nikhil and Merelli, Emanuela and Worrell, James},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2021.95},
  URN =		{urn:nbn:de:0030-drops-141647},
  doi =		{10.4230/LIPIcs.ICALP.2021.95},
  annote =	{Keywords: General Factor, General Matching, Treewidth, Cutwidth}
}
Document
Fine-Grained Complexity of Regular Expression Pattern Matching and Membership

Authors: Philipp Schepper

Published in: LIPIcs, Volume 173, 28th Annual European Symposium on Algorithms (ESA 2020)


Abstract
The currently fastest algorithm for regular expression pattern matching and membership improves the classical O(nm) time algorithm by a factor of about log^{3/2}n. Instead of focussing on general patterns, we analyse homogeneous patterns of bounded depth in this work. For these, a classification splitting the types into easy (strongly sub-quadratic) and hard (essentially quadratic time under SETH) is known. We take a very fine-grained look at the hard pattern types from this classification and show a dichotomy: few types allow super-poly-logarithmic improvements, while the algorithms for the other pattern types can only be improved by a constant number of log-factors, assuming the Formula-SAT Hypothesis.
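To recall what the classical algorithm computes, here is a compact position-set simulation of regular expression membership in Python (an illustrative sketch, not the paper's construction): patterns are nested tuples built from literals, concatenation, union and star, and the simulation tracks the set of text positions reachable by each subpattern, which is essentially what the O(nm) algorithm does with more careful bookkeeping.

def advance(pattern, text, starts):
    # Set of positions in `text` reachable by matching `pattern` from any position in `starts`.
    kind = pattern[0]
    if kind == "lit":
        c = pattern[1]
        return {i + 1 for i in starts if i < len(text) and text[i] == c}
    if kind == "cat":
        return advance(pattern[2], text, advance(pattern[1], text, starts))
    if kind == "or":
        return advance(pattern[1], text, starts) | advance(pattern[2], text, starts)
    if kind == "star":
        reach, frontier = set(starts), set(starts)
        while frontier:
            frontier = advance(pattern[1], text, frontier) - reach
            reach |= frontier
        return reach
    raise ValueError(kind)

def matches(pattern, text):
    return len(text) in advance(pattern, text, {0})

# (a|b)* c matched against "abac".
pat = ("cat", ("star", ("or", ("lit", "a"), ("lit", "b"))), ("lit", "c"))
print(matches(pat, "abac"))  # True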

Cite as

Philipp Schepper. Fine-Grained Complexity of Regular Expression Pattern Matching and Membership. In 28th Annual European Symposium on Algorithms (ESA 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 173, pp. 80:1-80:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{schepper:LIPIcs.ESA.2020.80,
  author =	{Schepper, Philipp},
  title =	{{Fine-Grained Complexity of Regular Expression Pattern Matching and Membership}},
  booktitle =	{28th Annual European Symposium on Algorithms (ESA 2020)},
  pages =	{80:1--80:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-162-7},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{173},
  editor =	{Grandoni, Fabrizio and Herman, Grzegorz and Sanders, Peter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2020.80},
  URN =		{urn:nbn:de:0030-drops-129464},
  doi =		{10.4230/LIPIcs.ESA.2020.80},
  annote =	{Keywords: Fine-Grained Complexity, Regular Expression, Pattern Matching, Dichotomy}
}
  • Refine by Author
  • 7 Marx, Dániel
  • 6 Schepper, Philipp
  • 3 Focke, Jacob
  • 3 Rzążewski, Paweł
  • 2 Can Esmer, Barış

  • Refine by Classification
  • 7 Theory of computation → Parameterized complexity and exact algorithms
  • 2 Theory of computation → Algorithm design techniques
  • 1 Mathematics of computing → Graph algorithms
  • 1 Software and its engineering → Compilers
  • 1 Theory of computation → Fixed parameter tractability

  • Refine by Keyword
  • 5 Treewidth
  • 2 Dominating Set
  • 2 Edge Deletion
  • 2 General Factor
  • 2 Parameterized Complexity

  • Refine by Type
  • 11 document

  • Refine by Publication Year
  • 6 2024
  • 3 2022
  • 1 2020
  • 1 2021
