70 Search Results for "Marx, Daniel"


Volume

LIPIcs, Volume 107

45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)

ICALP 2018, July 9-13, 2018, Prague, Czech Republic

Editors: Ioannis Chatzigiannakis, Christos Kaklamanis, Dániel Marx, and Donald Sannella

Document
Approximate Monotone Local Search for Weighted Problems

Authors: Barış Can Esmer, Ariel Kulik, Dániel Marx, Daniel Neuen, and Roohani Sharma

Published in: LIPIcs, Volume 285, 18th International Symposium on Parameterized and Exact Computation (IPEC 2023)


Abstract
In a recent work, Esmer et al. describe a simple method - Approximate Monotone Local Search - to obtain exponential approximation algorithms from existing parameterized exact algorithms, polynomial-time approximation algorithms and, more generally, parameterized approximation algorithms. In this work, we generalize those results to the weighted setting. More formally, we consider monotone subset minimization problems over a weighted universe of size n (e.g., Vertex Cover, d-Hitting Set and Feedback Vertex Set). We consider a model where the algorithm is only given access to a subroutine that finds a solution of weight at most α ⋅ W (and of arbitrary cardinality) in time c^k ⋅ n^{𝒪(1)} where W is the minimum weight of a solution of cardinality at most k. In the unweighted setting, Esmer et al. determine the smallest value d for which a β-approximation algorithm running in time dⁿ ⋅ n^{𝒪(1)} can be obtained in this model. We show that the same dependencies also hold in a weighted setting in this model: for every fixed ε > 0 we obtain a β-approximation algorithm running in time 𝒪((d+ε)ⁿ), for the same d as in the unweighted setting. Similarly, we also extend a β-approximate brute-force search (in a model which only provides access to a membership oracle) to the weighted setting. Using existing approximation algorithms and exact parameterized algorithms for weighted problems, we obtain the first exponential-time β-approximation algorithms that are better than brute force for a variety of problems including Weighted Vertex Cover, Weighted d-Hitting Set, Weighted Feedback Vertex Set and Weighted Multicut.

Cite as

Barış Can Esmer, Ariel Kulik, Dániel Marx, Daniel Neuen, and Roohani Sharma. Approximate Monotone Local Search for Weighted Problems. In 18th International Symposium on Parameterized and Exact Computation (IPEC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 285, pp. 17:1-17:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{esmer_et_al:LIPIcs.IPEC.2023.17,
  author =	{Esmer, Bar{\i}\c{s} Can and Kulik, Ariel and Marx, D\'{a}niel and Neuen, Daniel and Sharma, Roohani},
  title =	{{Approximate Monotone Local Search for Weighted Problems}},
  booktitle =	{18th International Symposium on Parameterized and Exact Computation (IPEC 2023)},
  pages =	{17:1--17:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-305-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{285},
  editor =	{Misra, Neeldhara and Wahlstr\"{o}m, Magnus},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2023.17},
  URN =		{urn:nbn:de:0030-drops-194360},
  doi =		{10.4230/LIPIcs.IPEC.2023.17},
  annote =	{Keywords: parameterized approximations, exponential approximations, monotone local search}
}
Document
Track B: Automata, Logic, Semantics, and Theory of Programming
The Fine-Grained Complexity of Boolean Conjunctive Queries and Sum-Product Problems

Authors: Austen Z. Fan, Paraschos Koutris, and Hangdong Zhao

Published in: LIPIcs, Volume 261, 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023)


Abstract
We study the fine-grained complexity of evaluating Boolean Conjunctive Queries and their generalization to sum-of-product problems over an arbitrary semiring. For these problems, we present a general semiring-oblivious reduction from the k-clique problem to any query structure (hypergraph). Our reduction uses the notion of embedding a graph to a hypergraph, first introduced by Marx [Dániel Marx, 2013]. As a consequence of our reduction, we can show tight conditional lower bounds for many classes of hypergraphs, including cycles, Loomis-Whitney joins, some bipartite graphs, and chordal graphs. These lower bounds have a dependence on what we call the clique embedding power of a hypergraph H, which we believe is a quantity of independent interest. We show that the clique embedding power is always less than the submodular width of the hypergraph, and present a decidable algorithm for computing it. We conclude with many open problems for future research.

Cite as

Austen Z. Fan, Paraschos Koutris, and Hangdong Zhao. The Fine-Grained Complexity of Boolean Conjunctive Queries and Sum-Product Problems. In 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 261, pp. 127:1-127:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{fan_et_al:LIPIcs.ICALP.2023.127,
  author =	{Fan, Austen Z. and Koutris, Paraschos and Zhao, Hangdong},
  title =	{{The Fine-Grained Complexity of Boolean Conjunctive Queries and Sum-Product Problems}},
  booktitle =	{50th International Colloquium on Automata, Languages, and Programming (ICALP 2023)},
  pages =	{127:1--127:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-278-5},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{261},
  editor =	{Etessami, Kousha and Feige, Uriel and Puppis, Gabriele},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2023.127},
  URN =		{urn:nbn:de:0030-drops-181791},
  doi =		{10.4230/LIPIcs.ICALP.2023.127},
  annote =	{Keywords: Fine-grained complexity, conjunctive queries, semiring-oblivious reduction}
}
Document
Computing Generalized Convolutions Faster Than Brute Force

Authors: Barış Can Esmer, Ariel Kulik, Dániel Marx, Philipp Schepper, and Karol Węgrzycki

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
In this paper, we consider a general notion of convolution. Let D be a finite domain and let Dⁿ be the set of n-length vectors (tuples) over D. Let f : D × D → D be a function and let ⊕_f be the coordinate-wise application of f. The f-Convolution of two functions g,h : Dⁿ → {-M,…,M} is (g ⊛_f h)(𝐯) := ∑_{𝐯_g,𝐯_h ∈ Dⁿ s.t. 𝐯 = 𝐯_g ⊕_f 𝐯_h} g(𝐯_g) ⋅ h(𝐯_h) for every 𝐯 ∈ Dⁿ. This problem generalizes many fundamental convolutions such as Subset Convolution, XOR Product, Covering Product or Packing Product. For an arbitrary function f and domain D we can compute the f-Convolution via brute-force enumeration in 𝒪̃(|D|^{2n} ⋅ polylog(M)) time. Our main result is an improvement over this naive algorithm. We show that the f-Convolution can be computed exactly in 𝒪̃((c ⋅ |D|²)ⁿ ⋅ polylog(M)) time for constant c := 5/6 when D has even cardinality. Our main observation is that a cyclic partition of a function f : D × D → D can be used to speed up the computation of the f-Convolution, and we show that an appropriate cyclic partition exists for every f. Furthermore, we demonstrate that a single entry of the f-Convolution can be computed more efficiently. In this variant, we are given two functions g,h : Dⁿ → {-M,…,M} along with a vector 𝐯 ∈ Dⁿ, and the task of the f-Query problem is to compute the integer (g ⊛_f h)(𝐯). This is a generalization of the well-known Orthogonal Vectors problem. We show that f-Query can be computed in 𝒪̃(|D|^{(ω/2)n} ⋅ polylog(M)) time, where ω ∈ [2,2.373) is the exponent of the currently fastest matrix multiplication algorithm.
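As a point of reference for the brute-force baseline mentioned above, here is a minimal Python sketch (not from the paper) that evaluates all |D|^{2n} pairs of vectors and accumulates the products. The choice f = OR, which yields the Covering Product, and the all-ones example functions are illustrative assumptions.

from itertools import product

def f_convolution(f, D, n, g, h):
    """Naive f-Convolution: for every v in D^n, sum g(v_g)*h(v_h) over all
    pairs with v = v_g (+)_f v_h, where (+)_f applies f coordinate-wise.
    Uses O(|D|^(2n)) evaluations -- the brute-force baseline."""
    result = {v: 0 for v in product(D, repeat=n)}
    for v_g in product(D, repeat=n):
        for v_h in product(D, repeat=n):
            v = tuple(f(a, b) for a, b in zip(v_g, v_h))
            result[v] += g[v_g] * h[v_h]
    return result

# Toy example (illustrative only): D = {0,1}, f = OR gives the Covering Product.
D, n = (0, 1), 2
g = {v: 1 for v in product(D, repeat=n)}   # g == h == all-ones functions
h = dict(g)
print(f_convolution(lambda a, b: a | b, D, n, g, h))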

Cite as

Barış Can Esmer, Ariel Kulik, Dániel Marx, Philipp Schepper, and Karol Węgrzycki. Computing Generalized Convolutions Faster Than Brute Force. In 17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 12:1-12:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{esmer_et_al:LIPIcs.IPEC.2022.12,
  author =	{Esmer, Bar{\i}\c{s} Can and Kulik, Ariel and Marx, D\'{a}niel and Schepper, Philipp and W\k{e}grzycki, Karol},
  title =	{{Computing Generalized Convolutions Faster Than Brute Force}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{12:1--12:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.12},
  URN =		{urn:nbn:de:0030-drops-173685},
  doi =		{10.4230/LIPIcs.IPEC.2022.12},
  annote =	{Keywords: Generalized Convolution, Fast Fourier Transform, Fast Subset Convolution}
}
Document
Domination and Cut Problems on Chordal Graphs with Bounded Leafage

Authors: Esther Galby, Dániel Marx, Philipp Schepper, Roohani Sharma, and Prafullkumar Tale

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
The leafage of a chordal graph G is the minimum integer 𝓁 such that G can be realized as an intersection graph of subtrees of a tree with 𝓁 leaves. We consider classical domination and cut problems on chordal graphs under structural parameterization by the leafage. Fomin, Golovach, and Raymond [ESA 2018, Algorithmica 2020] proved, among other things, that Dominating Set on chordal graphs admits an algorithm running in time 2^𝒪(𝓁²) ⋅ n^𝒪(1). We present a conceptually much simpler algorithm that runs in time 2^𝒪(𝓁) ⋅ n^𝒪(1). We extend our approach to obtain similar results for Connected Dominating Set and Steiner Tree. We then consider the two classical cut problems MultiCut with Undeletable Terminals and Multiway Cut with Undeletable Terminals. We prove that the former is W[1]-hard when parameterized by the leafage and complement this result by presenting a simple n^𝒪(𝓁)-time algorithm. In contrast, and to our surprise, we find that Multiway Cut with Undeletable Terminals on chordal graphs can be solved in n^𝒪(1) time.

Cite as

Esther Galby, Dániel Marx, Philipp Schepper, Roohani Sharma, and Prafullkumar Tale. Domination and Cut Problems on Chordal Graphs with Bounded Leafage. In 17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 14:1-14:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{galby_et_al:LIPIcs.IPEC.2022.14,
  author =	{Galby, Esther and Marx, D\'{a}niel and Schepper, Philipp and Sharma, Roohani and Tale, Prafullkumar},
  title =	{{Domination and Cut Problems on Chordal Graphs with Bounded Leafage}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{14:1--14:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.14},
  URN =		{urn:nbn:de:0030-drops-173704},
  doi =		{10.4230/LIPIcs.IPEC.2022.14},
  annote =	{Keywords: Chordal Graphs, Leafage, FPT Algorithms, Dominating Set, MultiCut with Undeletable Terminals, Multiway Cut with Undeletable Terminals}
}
Document
Anti-Factor Is FPT Parameterized by Treewidth and List Size (But Counting Is Hard)

Authors: Dániel Marx, Govind S. Sankar, and Philipp Schepper

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
In the general AntiFactor problem, a graph G and, for every vertex v of G, a set X_v ⊆ ℕ of forbidden degrees is given. The task is to find a set S of edges such that the degree of v in S is not in the set X_v. Standard techniques (dynamic programming plus fast convolution) can be used to show that if M is the largest forbidden degree, then the problem can be solved in time (M+2)^{tw}⋅n^{O(1)} if a tree decomposition of width tw is given. However, significantly faster algorithms are possible if the sets X_v are sparse: our main algorithmic result shows that if every vertex has at most x forbidden degrees (we call this special case AntiFactor_x), then the problem can be solved in time (x+1)^{O(tw)}⋅n^{O(1)}. That is, AntiFactor_x is fixed-parameter tractable parameterized by treewidth tw and the maximum number x of excluded degrees. Our algorithm uses the technique of representative sets, which can be generalized to the optimization version, but (as expected) not to the counting version of the problem. In fact, we show that #AntiFactor₁ is already #W[1]-hard parameterized by the width of the given decomposition. Moreover, we show that, unlike for the decision version, the standard dynamic programming algorithm is essentially optimal for the counting version. Formally, for a fixed nonempty set X, we denote by X-AntiFactor the special case where every vertex v has the same set X_v = X of forbidden degrees. We show the following lower bound for every fixed set X: if there is an ε > 0 such that #X-AntiFactor can be solved in time (max X+2-ε)^{tw}⋅n^{O(1)} given a tree decomposition of width tw, then the Counting Strong Exponential-Time Hypothesis (#SETH) fails.
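A toy Python sketch that makes the AntiFactor definition concrete by brute-forcing over all edge subsets. It is exponential in |E(G)| and unrelated to the treewidth-based and representative-set techniques of the paper; the instance and forbidden-degree sets are made up for illustration.

from itertools import combinations

def antifactor_brute_force(vertices, edges, X):
    """Find an edge set S such that deg_S(v) is not in the forbidden set X[v]
    for every vertex v.  Brute force over all 2^|E| subsets -- only meant to
    make the AntiFactor definition concrete, not to compete with the
    (x+1)^O(tw) algorithm from the paper."""
    for r in range(len(edges) + 1):
        for S in combinations(edges, r):
            deg = {v: 0 for v in vertices}
            for u, w in S:
                deg[u] += 1
                deg[w] += 1
            if all(deg[v] not in X[v] for v in vertices):
                return set(S)
    return None  # no feasible edge set exists

# Illustrative instance: a triangle where vertex 'a' must not have degree 1.
V = ['a', 'b', 'c']
E = [('a', 'b'), ('b', 'c'), ('a', 'c')]
X = {'a': {1}, 'b': set(), 'c': set()}
print(antifactor_brute_force(V, E, X))  # the empty edge set already works here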

Cite as

Dániel Marx, Govind S. Sankar, and Philipp Schepper. Anti-Factor Is FPT Parameterized by Treewidth and List Size (But Counting Is Hard). In 17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 22:1-22:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.IPEC.2022.22,
  author =	{Marx, D\'{a}niel and Sankar, Govind S. and Schepper, Philipp},
  title =	{{Anti-Factor Is FPT Parameterized by Treewidth and List Size (But Counting Is Hard)}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{22:1--22:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.22},
  URN =		{urn:nbn:de:0030-drops-173780},
  doi =		{10.4230/LIPIcs.IPEC.2022.22},
  annote =	{Keywords: Anti-Factor, General Factor, Treewidth, Representative Sets, SETH}
}
Document
The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 22201)

Authors: Martin Grohe, Venkatesan Guruswami, Dániel Marx, and Stanislav Živný

Published in: Dagstuhl Reports, Volume 12, Issue 5 (2022)


Abstract
Constraint satisfaction has always played a central role in computational complexity theory; appropriate versions of CSPs are classical complete problems for most standard complexity classes. CSPs constitute a very rich and yet sufficiently manageable class of problems to give a good perspective on general computational phenomena. For instance, they help to understand which mathematical properties make a computational problem tractable (in a wide sense, e.g., polynomial-time solvable, non-trivially approximable, fixed-parameter tractable, or definable in a weak logic). In the last 15 years, research activity in this area has significantly intensified and hugely impressive progress was made. The Dagstuhl Seminar 22201 "The Constraint Satisfaction Problem: Complexity and Approximability" was aimed at bringing together researchers using all the different techniques in the study of the CSP so that they can share their insights obtained during the past four years. This report documents the material presented during the course of the seminar.

Cite as

Martin Grohe, Venkatesan Guruswami, Dániel Marx, and Stanislav Živný. The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 22201). In Dagstuhl Reports, Volume 12, Issue 5, pp. 112-130, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@Article{grohe_et_al:DagRep.12.5.112,
  author =	{Grohe, Martin and Guruswami, Venkatesan and Marx, D\'{a}niel and \v{Z}ivn\'{y}, Stanislav},
  title =	{{The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 22201)}},
  pages =	{112--130},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2022},
  volume =	{12},
  number =	{5},
  editor =	{Grohe, Martin and Guruswami, Venkatesan and Marx, D\'{a}niel and \v{Z}ivn\'{y}, Stanislav},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.12.5.112},
  URN =		{urn:nbn:de:0030-drops-174453},
  doi =		{10.4230/DagRep.12.5.112},
  annote =	{Keywords: Constraint satisfaction problem (CSP); Computational complexity; Hardness of approximation; Universal algebra; Semidefinite programming}
}
Document
Faster Exponential-Time Approximation Algorithms Using Approximate Monotone Local Search

Authors: Barış Can Esmer, Ariel Kulik, Dániel Marx, Daniel Neuen, and Roohani Sharma

Published in: LIPIcs, Volume 244, 30th Annual European Symposium on Algorithms (ESA 2022)


Abstract
We generalize the monotone local search approach of Fomin, Gaspers, Lokshtanov and Saurabh [J. ACM 2019], by establishing a connection between parameterized approximation and exponential-time approximation algorithms for monotone subset minimization problems. In a monotone subset minimization problem the input implicitly describes a non-empty set family over a universe of size n which is closed under taking supersets. The task is to find a minimum cardinality set in this family. Broadly speaking, we use approximate monotone local search to show that a parameterized α-approximation algorithm that runs in c^k⋅n^𝒪(1) time, where k is the solution size, can be used to derive an α-approximation randomized algorithm that runs in dⁿ⋅n^𝒪(1) time, where d is the unique value in (1, 1+(c-1)/α) such that 𝒟(1/α ‖ (d-1)/(c-1)) = (ln c)/α and 𝒟(a‖b) is the Kullback-Leibler divergence. This running time matches that of Fomin et al. for α = 1, and is strictly better when α > 1, for any c > 1. Furthermore, we also show that this result can be derandomized at the expense of a sub-exponential multiplicative factor in the running time. We use an approximate variant of the exhaustive search as a benchmark for our algorithm. We show that the classic 2ⁿ⋅n^𝒪(1) exhaustive search can be adapted to an α-approximate exhaustive search that runs in time (1+exp(-α⋅ℋ(1/α)))ⁿ⋅n^𝒪(1), where ℋ is the entropy function. Furthermore, we provide a lower bound stating that the running time of this α-approximate exhaustive search is the best achievable running time in an oracle model. When compared to approximate exhaustive search, and to other techniques, the running times obtained by approximate monotone local search are strictly better for any α ≥ 1, c > 1. We demonstrate the potential of approximate monotone local search by deriving new and faster exponential approximation algorithms for Vertex Cover, 3-Hitting Set, Directed Feedback Vertex Set, Directed Subset Feedback Vertex Set, Directed Odd Cycle Transversal and Undirected Multicut. For instance, we get a 1.1-approximation algorithm for Vertex Cover with running time 1.114ⁿ⋅n^𝒪(1), improving upon the previously best known 1.1-approximation running in time 1.127ⁿ⋅n^𝒪(1) by Bourgeois et al. [DAM 2011].
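The defining equation for the base d can be evaluated numerically. The Python sketch below (not from the paper) recovers d by bisection, using the fact that the Bernoulli Kullback-Leibler divergence 𝒟(1/α ‖ (d-1)/(c-1)) decreases as d grows. For α = 1 the equation solves to d = 2 - 1/c, the monotone-local-search base of Fomin et al., which serves as a sanity check; the example values of c and α are illustrative.

import math

def kl(a, b):
    """Kullback-Leibler divergence D(a||b) between Bernoulli(a) and Bernoulli(b)."""
    def term(p, q):
        return 0.0 if p == 0 else p * math.log(p / q)
    return term(a, b) + term(1 - a, 1 - b)

def base_d(c, alpha, iters=200):
    """Base d of the d^n * poly(n) running time from the abstract: the unique
    d in (1, 1 + (c-1)/alpha) with D(1/alpha || (d-1)/(c-1)) = ln(c)/alpha.
    The divergence decreases as d grows, so plain bisection suffices."""
    target = math.log(c) / alpha
    lo, hi = 1.0, 1.0 + (c - 1.0) / alpha
    for _ in range(iters):
        mid = (lo + hi) / 2
        if kl(1.0 / alpha, (mid - 1.0) / (c - 1.0)) > target:
            lo = mid      # divergence still too large -> move right
        else:
            hi = mid
    return (lo + hi) / 2

# Sanity check against the exact alpha = 1 case, where d = 2 - 1/c (Fomin et al.).
print(base_d(c=2.0, alpha=1.0))   # ~1.5
print(base_d(c=2.0, alpha=1.1))   # strictly smaller than the alpha = 1 value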

Cite as

Barış Can Esmer, Ariel Kulik, Dániel Marx, Daniel Neuen, and Roohani Sharma. Faster Exponential-Time Approximation Algorithms Using Approximate Monotone Local Search. In 30th Annual European Symposium on Algorithms (ESA 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 244, pp. 50:1-50:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{esmer_et_al:LIPIcs.ESA.2022.50,
  author =	{Esmer, Bar{\i}\c{s} Can and Kulik, Ariel and Marx, D\'{a}niel and Neuen, Daniel and Sharma, Roohani},
  title =	{{Faster Exponential-Time Approximation Algorithms Using Approximate Monotone Local Search}},
  booktitle =	{30th Annual European Symposium on Algorithms (ESA 2022)},
  pages =	{50:1--50:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-247-1},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{244},
  editor =	{Chechik, Shiri and Navarro, Gonzalo and Rotenberg, Eva and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2022.50},
  URN =		{urn:nbn:de:0030-drops-169887},
  doi =		{10.4230/LIPIcs.ESA.2022.50},
  annote =	{Keywords: parameterized approximations, exponential approximations, monotone local search}
}
Document
True Contraction Decomposition and Almost ETH-Tight Bipartization for Unit-Disk Graphs

Authors: Sayan Bandyapadhyay, William Lochet, Daniel Lokshtanov, Saket Saurabh, and Jie Xue

Published in: LIPIcs, Volume 224, 38th International Symposium on Computational Geometry (SoCG 2022)


Abstract
We prove a structural theorem for unit-disk graphs, which (roughly) states that given a set 𝒟 of n unit disks inducing a unit-disk graph G_𝒟 and a number p ∈ [n], one can partition 𝒟 into p subsets 𝒟₁,…,𝒟_p such that for every i ∈ [p] and every 𝒟' ⊆ 𝒟_i, the graph obtained from G_𝒟 by contracting all edges between the vertices in 𝒟_i ∖ 𝒟' admits a tree decomposition in which each bag consists of O(p+|𝒟'|) cliques. Our theorem can be viewed as an analog for unit-disk graphs of the structural theorems for planar graphs and almost-embeddable graphs proved very recently by Marx et al. [SODA'22] and Bandyapadhyay et al. [SODA'22]. By applying our structural theorem, we give several new combinatorial and algorithmic results for unit-disk graphs. On the combinatorial side, we obtain the first Contraction Decomposition Theorem (CDT) for unit-disk graphs, resolving an open question from the work of Panolan et al. [SODA'19]. On the algorithmic side, we obtain a new FPT algorithm for bipartization (also known as odd cycle transversal) on unit-disk graphs, which runs in 2^{O(√k log k)} ⋅ n^{O(1)} time, where k denotes the solution size. Our algorithm significantly improves the previous slightly subexponential-time algorithm given by Lokshtanov et al. [SODA'22] (which works more generally for disk graphs) and is almost optimal, as the problem cannot be solved in 2^{o(√k)} ⋅ n^{O(1)} time assuming the ETH.

Cite as

Sayan Bandyapadhyay, William Lochet, Daniel Lokshtanov, Saket Saurabh, and Jie Xue. True Contraction Decomposition and Almost ETH-Tight Bipartization for Unit-Disk Graphs. In 38th International Symposium on Computational Geometry (SoCG 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 224, pp. 11:1-11:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{bandyapadhyay_et_al:LIPIcs.SoCG.2022.11,
  author =	{Bandyapadhyay, Sayan and Lochet, William and Lokshtanov, Daniel and Saurabh, Saket and Xue, Jie},
  title =	{{True Contraction Decomposition and Almost ETH-Tight Bipartization for Unit-Disk Graphs}},
  booktitle =	{38th International Symposium on Computational Geometry (SoCG 2022)},
  pages =	{11:1--11:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-227-3},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{224},
  editor =	{Goaoc, Xavier and Kerber, Michael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2022.11},
  URN =		{urn:nbn:de:0030-drops-160190},
  doi =		{10.4230/LIPIcs.SoCG.2022.11},
  annote =	{Keywords: unit-disk graphs, tree decomposition, contraction decomposition, bipartization}
}
Document
Dynamic Time Warping Under Translation: Approximation Guided by Space-Filling Curves

Authors: Karl Bringmann, Sándor Kisfaludi-Bak, Marvin Künnemann, Dániel Marx, and André Nusser

Published in: LIPIcs, Volume 224, 38th International Symposium on Computational Geometry (SoCG 2022)


Abstract
The Dynamic Time Warping (DTW) distance is a popular measure of similarity for a variety of sequence data. For comparing polygonal curves π, σ in ℝ^d, it provides a robust, outlier-insensitive alternative to the Fréchet distance. However, like the Fréchet distance, the DTW distance is not invariant under translations. Can we efficiently optimize the DTW distance of π and σ under arbitrary translations, to compare the curves' shape irrespective of their absolute location? There are surprisingly few works in this direction, which may be due to its computational intricacy: For the Euclidean norm, this problem contains as a special case the geometric median problem, which provably admits no exact algebraic algorithm (that is, no algorithm using only addition, multiplication, and k-th roots). We thus investigate exact algorithms for non-Euclidean norms as well as approximation algorithms for the Euclidean norm. For the L₁ norm in ℝ^d, we provide an 𝒪(n^{2(d+1)})-time algorithm, i.e., an exact polynomial-time algorithm for constant d. Here and below, n bounds the curves' complexities. For the Euclidean norm in ℝ², we show that a simple problem-specific insight leads to a (1+ε)-approximation in time 𝒪(n³/ε²). We then show how to obtain a subcubic 𝒪̃(n^{2.5}/ε²) time algorithm with significant new ideas; this time comes close to the well-known quadratic time barrier for computing DTW for fixed translations. Technically, the algorithm is obtained by speeding up repeated DTW distance estimations using a dynamic data structure for maintaining shortest paths in weighted planar digraphs. Crucially, we show how to traverse a candidate set of translations using space-filling curves in a way that incurs only few updates to the data structure. We hope that our results will facilitate the use of DTW under translation both in theory and practice, and inspire similar algorithmic approaches for related geometric optimization problems.
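For a fixed translation, the DTW distance referred to above is computed by the classic quadratic-time dynamic program. Below is a short Python sketch of one common sum-of-distances formulation (an assumption about the exact variant; the paper's contribution is optimizing this quantity over all translations, not this baseline DP). The example curves are illustrative.

import math

def dtw(P, Q, dist=lambda p, q: math.dist(p, q)):
    """Classic O(|P|*|Q|) dynamic program for the DTW distance between two
    point sequences for a *fixed* placement of the curves."""
    n, m = len(P), len(Q)
    INF = float("inf")
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(P[i - 1], Q[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j], dp[i][j - 1], dp[i - 1][j - 1])
    return dp[n][m]

# Illustrative curves in R^2; translating Q by (1, 0) aligns it with P exactly.
P = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
Q = [(-1.0, 0.0), (0.0, 0.0), (1.0, 0.0)]
print(dtw(P, Q))                                     # nonzero for the given placement
print(dtw(P, [(x + 1.0, y) for x, y in Q]))          # 0.0 after the translation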

Cite as

Karl Bringmann, Sándor Kisfaludi-Bak, Marvin Künnemann, Dániel Marx, and André Nusser. Dynamic Time Warping Under Translation: Approximation Guided by Space-Filling Curves. In 38th International Symposium on Computational Geometry (SoCG 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 224, pp. 20:1-20:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{bringmann_et_al:LIPIcs.SoCG.2022.20,
  author =	{Bringmann, Karl and Kisfaludi-Bak, S\'{a}ndor and K\"{u}nnemann, Marvin and Marx, D\'{a}niel and Nusser, Andr\'{e}},
  title =	{{Dynamic Time Warping Under Translation: Approximation Guided by Space-Filling Curves}},
  booktitle =	{38th International Symposium on Computational Geometry (SoCG 2022)},
  pages =	{20:1--20:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-227-3},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{224},
  editor =	{Goaoc, Xavier and Kerber, Michael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2022.20},
  URN =		{urn:nbn:de:0030-drops-160287},
  doi =		{10.4230/LIPIcs.SoCG.2022.20},
  annote =	{Keywords: Dynamic Time Warping, Sequence Similarity Measures}
}
Document
Track A: Algorithms, Complexity and Games
Degrees and Gaps: Tight Complexity Results of General Factor Problems Parameterized by Treewidth and Cutwidth

Authors: Dániel Marx, Govind S. Sankar, and Philipp Schepper

Published in: LIPIcs, Volume 198, 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)


Abstract
In the General Factor problem, we are given an undirected graph G and for each vertex v ∈ V(G) a finite set B_v of non-negative integers. The task is to decide if there is a subset S ⊆ E(G) such that deg_S(v) ∈ B_v for all vertices v of G. Define the max-gap of a finite integer set B to be the largest d ≥ 0 such that there is an a ≥ 0 with [a,a+d+1] ∩ B = {a,a+d+1}. Cornuéjols showed in 1988 that if the max-gap of all sets B_v is at most 1, then the decision version of General Factor is polynomial-time solvable. This result was extended in 2018 by Dudycz and Paluch to the optimization (i.e. minimization and maximization) versions. We present a general algorithm counting the number of solutions of a certain size in time (M+1)^{tw} ⋅ n^{𝒪(1)}, given a tree decomposition of width tw, where M is the maximum integer over all B_v. By using convolution techniques from van Rooij (2020), we improve upon the previous (M+1)^{3tw} ⋅ n^{𝒪(1)} time algorithm by Arulselvan et al. from 2018. We prove that this algorithm is essentially optimal for all cases that are not trivial or polynomial-time solvable for the decision, minimization or maximization versions. Our lower bounds show that such an improvement is not even possible for B-Factor, which is General Factor on graphs where all sets B_v agree with the fixed set B. We show that for every fixed B where the problem is NP-hard, our (max B+1)^{tw} ⋅ n^{𝒪(1)} algorithm cannot be significantly improved: assuming the Strong Exponential Time Hypothesis (SETH), no algorithm can solve B-Factor in time (max B+1-ε)^{tw} ⋅ n^{𝒪(1)} for any ε > 0. We extend this bound to the counting version of B-Factor for arbitrary, non-trivial sets B, assuming #SETH. We also investigate the parameterization of the problem by cutwidth. Unlike for treewidth, having a larger set B does not appear to make the problem harder: we give a 2^{cutw} ⋅ n^{𝒪(1)} algorithm for any B and provide a matching lower bound showing that this is optimal for the NP-hard cases.
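The max-gap defined above is simply the largest number of consecutive integers missing between two consecutive elements of B. A tiny Python sketch follows; treating sets with fewer than two elements as having max-gap 0 is our own convention, not something stated in the abstract.

def max_gap(B):
    """Max-gap of a finite set B of non-negative integers, as defined in the
    abstract: the largest d >= 0 such that some a has
    [a, a+d+1] ∩ B = {a, a+d+1}, i.e. two elements of B with exactly d
    missing integers between them.  (Sets with fewer than two elements have
    no such pair; we return 0 for them by convention.)"""
    s = sorted(B)
    if len(s) < 2:
        return 0
    return max(b - a - 1 for a, b in zip(s, s[1:]))

print(max_gap({0, 1, 2}))   # 0 -> max-gap at most 1, decision version polynomial (Cornuéjols)
print(max_gap({1, 4}))      # 2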

Cite as

Dániel Marx, Govind S. Sankar, and Philipp Schepper. Degrees and Gaps: Tight Complexity Results of General Factor Problems Parameterized by Treewidth and Cutwidth. In 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 198, pp. 95:1-95:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.ICALP.2021.95,
  author =	{Marx, D\'{a}niel and Sankar, Govind S. and Schepper, Philipp},
  title =	{{Degrees and Gaps: Tight Complexity Results of General Factor Problems Parameterized by Treewidth and Cutwidth}},
  booktitle =	{48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)},
  pages =	{95:1--95:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-195-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{198},
  editor =	{Bansal, Nikhil and Merelli, Emanuela and Worrell, James},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2021.95},
  URN =		{urn:nbn:de:0030-drops-141647},
  doi =		{10.4230/LIPIcs.ICALP.2021.95},
  annote =	{Keywords: General Factor, General Matching, Treewidth, Cutwidth}
}
Document
On the Computational Tractability of a Geographic Clustering Problem Arising in Redistricting

Authors: Vincent Cohen-Addad, Philip N. Klein, Dániel Marx, Archer Wheeler, and Christopher Wolfram

Published in: LIPIcs, Volume 192, 2nd Symposium on Foundations of Responsible Computing (FORC 2021)


Abstract
Redistricting is the problem of dividing up a state into a given number k of regions (called districts) where the voters in each district are to elect a representative. The three primary criteria are: that each district be connected, that the populations of the districts be equal (or nearly equal), and that the districts are "compact". There are multiple competing definitions of compactness, usually minimizing some quantity. One measure that has recently been used is the number of cut edges. In this formulation of redistricting, one is given atomic regions out of which each district must be built (e.g., in the U.S., census blocks). The populations of the atomic regions are given. Consider the graph with one vertex per atomic region and an edge between atomic regions with a shared boundary of positive length. Define the weight of a vertex to be the population of the corresponding region. A districting plan is a partition of vertices into k pieces so that the parts have nearly equal weights and each part is connected. The districts are considered compact to the extent that the plan minimizes the number of edges crossing between different parts. There are two natural computational problems: find the most compact districting plan, and sample districting plans (possibly under a compactness constraint) uniformly at random. Both problems are NP-hard, so we consider restricting the input graph to have branchwidth at most w. (A planar graph’s branchwidth is bounded, for example, by its diameter.) If both k and w are bounded by constants, the problems are solvable in polynomial time. In this paper, we give lower and upper bounds that characterize the complexity of these problems in terms of the parameters k and w. For simplicity of notation, assume that each vertex has unit weight. We would ideally like algorithms whose running times are of the form O(f(k,w) n^c) for some constant c independent of k and w (in which case the problems are said to be fixed-parameter tractable with respect to those parameters). We show that, under standard complexity-theoretic assumptions, no such algorithms exist. However, the problems are fixed-parameter tractable with respect to each of these parameters individually: there exist algorithms with running times of the form O(f(k) n^{O(w)}) and O(f(w) n^{k+1}). The first result was previously known. The new one, however, is more relevant to the application to redistricting, at least for coarse instances. Indeed, we have implemented a version of the algorithm and have used it to find optimally compact solutions to all redistricting instances for France (except Paris, which operates under different rules) under various population-balance constraints. For these instances, the values for w are modest and the values for k are very small.
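As a toy illustration of the objective (connected districts, nearly equal weights, minimize cut edges), the following Python sketch brute-forces all assignments of vertices to k districts on a tiny instance. It is purely illustrative and exponential; the paper's algorithms instead exploit bounded branchwidth. The instance, unit weights, and the slack parameter are assumptions made for the example.

from itertools import product

def is_connected(part, adj):
    part = set(part)
    if not part:
        return False
    seen, stack = set(), [next(iter(part))]
    while stack:
        v = stack.pop()
        if v in seen:
            continue
        seen.add(v)
        stack.extend(adj[v] & part)
    return seen == part

def best_plan(vertices, edges, weight, k, slack):
    """Brute force over all k^n assignments: keep plans whose districts are
    connected and whose weights differ from the average by at most `slack`,
    and among those minimize the number of cut edges."""
    adj = {v: set() for v in vertices}
    for u, w in edges:
        adj[u].add(w)
        adj[w].add(u)
    target = sum(weight[v] for v in vertices) / k
    best = None
    for assign in product(range(k), repeat=len(vertices)):
        parts = [{v for v, d in zip(vertices, assign) if d == i} for i in range(k)]
        if any(not is_connected(p, adj) for p in parts):
            continue
        if any(abs(sum(weight[v] for v in p) - target) > slack for p in parts):
            continue
        cut = sum(1 for u, w in edges if assign[vertices.index(u)] != assign[vertices.index(w)])
        if best is None or cut < best[0]:
            best = (cut, parts)
    return best

# Toy instance: a 2x2 grid (4-cycle) with unit weights, split into k = 2 districts.
V = ['a', 'b', 'c', 'd']
E = [('a', 'b'), ('b', 'd'), ('d', 'c'), ('c', 'a')]
print(best_plan(V, E, {v: 1 for v in V}, k=2, slack=0))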

Cite as

Vincent Cohen-Addad, Philip N. Klein, Dániel Marx, Archer Wheeler, and Christopher Wolfram. On the Computational Tractability of a Geographic Clustering Problem Arising in Redistricting. In 2nd Symposium on Foundations of Responsible Computing (FORC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 192, pp. 3:1-3:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{cohenaddad_et_al:LIPIcs.FORC.2021.3,
  author =	{Cohen-Addad, Vincent and Klein, Philip N. and Marx, D\'{a}niel and Wheeler, Archer and Wolfram, Christopher},
  title =	{{On the Computational Tractability of a Geographic Clustering Problem Arising in Redistricting}},
  booktitle =	{2nd Symposium on Foundations of Responsible Computing (FORC 2021)},
  pages =	{3:1--3:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-187-0},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{192},
  editor =	{Ligett, Katrina and Gupta, Swati},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FORC.2021.3},
  URN =		{urn:nbn:de:0030-drops-138718},
  doi =		{10.4230/LIPIcs.FORC.2021.3},
  annote =	{Keywords: redistricting, algorithms, planar graphs, lower bounds}
}
Document
Chordless Cycle Packing Is Fixed-Parameter Tractable

Authors: Dániel Marx

Published in: LIPIcs, Volume 173, 28th Annual European Symposium on Algorithms (ESA 2020)


Abstract
A chordless cycle or hole in a graph G is an induced cycle of length at least 4. In the Hole Packing problem, a graph G and an integer k are given, and the task is to find (if one exists) a set of k pairwise vertex-disjoint chordless cycles. Our main result is that Hole Packing is fixed-parameter tractable (FPT), that is, it can be solved in time f(k)n^O(1) for some function f depending only on k.
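To make the Hole Packing definition concrete, the following Python sketch enumerates the vertex subsets that induce a chordless cycle and then searches for k pairwise disjoint ones. This brute force is exponential and only illustrates the problem statement, in contrast to the f(k)n^O(1) algorithm of the paper; the example graph is made up.

from itertools import combinations

def induces_hole(adj, S):
    """S induces a chordless cycle iff the induced subgraph is 2-regular,
    connected, and has at least 4 vertices."""
    S = set(S)
    if len(S) < 4 or any(len(adj[v] & S) != 2 for v in S):
        return False
    seen, stack = set(), [next(iter(S))]
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            stack.extend(adj[v] & S)
    return seen == S

def hole_packing(vertices, edges, k):
    """Brute-force Hole Packing: look for k pairwise vertex-disjoint holes."""
    adj = {v: set() for v in vertices}
    for u, w in edges:
        adj[u].add(w)
        adj[w].add(u)
    holes = [frozenset(S) for r in range(4, len(vertices) + 1)
             for S in combinations(vertices, r) if induces_hole(adj, S)]

    def search(chosen, used, start):
        if len(chosen) == k:
            return chosen
        for i in range(start, len(holes)):
            if used.isdisjoint(holes[i]):
                found = search(chosen + [holes[i]], used | holes[i], i + 1)
                if found:
                    return found
        return None

    return search([], frozenset(), 0)

# A 4-cycle plus a disjoint triangle: exactly one hole, so k = 1 succeeds.
V = list(range(7))
E = [(0, 1), (1, 2), (2, 3), (3, 0), (4, 5), (5, 6), (6, 4)]
print(hole_packing(V, E, 1))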

Cite as

Dániel Marx. Chordless Cycle Packing Is Fixed-Parameter Tractable. In 28th Annual European Symposium on Algorithms (ESA 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 173, pp. 71:1-71:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{marx:LIPIcs.ESA.2020.71,
  author =	{Marx, D\'{a}niel},
  title =	{{Chordless Cycle Packing Is Fixed-Parameter Tractable}},
  booktitle =	{28th Annual European Symposium on Algorithms (ESA 2020)},
  pages =	{71:1--71:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-162-7},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{173},
  editor =	{Grandoni, Fabrizio and Herman, Grzegorz and Sanders, Peter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2020.71},
  URN =		{urn:nbn:de:0030-drops-129373},
  doi =		{10.4230/LIPIcs.ESA.2020.71},
  annote =	{Keywords: chordal graphs, packing, fixed-parameter tractability}
}
Document
Incompressibility of H-Free Edge Modification Problems: Towards a Dichotomy

Authors: Dániel Marx and R. B. Sandeep

Published in: LIPIcs, Volume 173, 28th Annual European Symposium on Algorithms (ESA 2020)


Abstract
Given a graph G and an integer k, the H-free Edge Editing problem asks whether there exist at most k pairs of vertices in G such that changing the adjacency of these pairs in G results in a graph without any induced copy of H. The existence of polynomial kernels for H-free Edge Editing (that is, whether it is possible to reduce the size of the instance to k^O(1) in polynomial time) received significant attention in the parameterized complexity literature. Nontrivial polynomial kernels are known to exist for some graphs H with at most 4 vertices (e.g., path on 3 or 4 vertices, diamond, paw), but starting from 5 vertices, polynomial kernels are known only if H is either complete or empty. This suggests the conjecture that there is no other H with at least 5 vertices where H-free Edge Editing admits a polynomial kernel. Towards this goal, we obtain a set ℋ of nine 5-vertex graphs such that if for every H ∈ ℋ, H-free Edge Editing is incompressible and the complexity assumption NP ⊈ coNP/poly holds, then H-free Edge Editing is incompressible for every graph H with at least five vertices that is neither complete nor empty. That is, proving incompressibility for these nine graphs would give a complete classification of the kernelization complexity of H-free Edge Editing for every H with at least 5 vertices. We obtain a similar result for H-free Edge Deletion. Here the picture is more complicated due to the existence of another infinite family of graphs H where the problem is trivial (graphs with exactly one edge). We obtain a larger set ℋ of nineteen graphs whose incompressibility would give a complete classification of the kernelization complexity of H-free Edge Deletion for every graph H with at least 5 vertices. Analogous results follow also for the H-free Edge Completion problem by simple complementation.
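A brute-force Python sketch of H-free Edge Editing on tiny instances, meant only to make the problem statement concrete (the paper is about kernelization lower bounds, not about solving the problem this way): it tries all sets of at most k vertex-pair toggles and tests H-freeness by exhaustive induced-subgraph search. The P₃ example is illustrative.

from itertools import combinations, permutations

def has_induced_copy(G_edges, G_vertices, H_edges, H_vertices):
    """Check whether G contains an induced copy of H (brute force)."""
    GE = {frozenset(e) for e in G_edges}
    HE = {frozenset(e) for e in H_edges}
    for subset in combinations(G_vertices, len(H_vertices)):
        for perm in permutations(subset):
            phi = dict(zip(H_vertices, perm))
            if all((frozenset((phi[a], phi[b])) in GE) == (frozenset((a, b)) in HE)
                   for a, b in combinations(H_vertices, 2)):
                return True
    return False

def h_free_edge_editing(G_vertices, G_edges, H_vertices, H_edges, k):
    """Try all <= k edits (each vertex pair toggled) and return an editing set
    that destroys every induced copy of H, or None if none exists."""
    pairs = list(combinations(G_vertices, 2))
    GE = {frozenset(e) for e in G_edges}
    for r in range(k + 1):
        for edits in combinations(pairs, r):
            edited = set(GE)
            for p in edits:
                edited ^= {frozenset(p)}   # toggle adjacency of the pair
            if not has_induced_copy(edited, G_vertices, H_edges, H_vertices):
                return edits
    return None

# P3-free editing on a path a-b-c: a single edit (add or delete) suffices.
print(h_free_edge_editing(['a', 'b', 'c'], [('a', 'b'), ('b', 'c')],
                          ['x', 'y', 'z'], [('x', 'y'), ('y', 'z')], k=1))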

Cite as

Dániel Marx and R. B. Sandeep. Incompressibility of H-Free Edge Modification Problems: Towards a Dichotomy. In 28th Annual European Symposium on Algorithms (ESA 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 173, pp. 72:1-72:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.ESA.2020.72,
  author =	{Marx, D\'{a}niel and Sandeep, R. B.},
  title =	{{Incompressibility of H-Free Edge Modification Problems: Towards a Dichotomy}},
  booktitle =	{28th Annual European Symposium on Algorithms (ESA 2020)},
  pages =	{72:1--72:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-162-7},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{173},
  editor =	{Grandoni, Fabrizio and Herman, Grzegorz and Sanders, Peter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2020.72},
  URN =		{urn:nbn:de:0030-drops-129383},
  doi =		{10.4230/LIPIcs.ESA.2020.72},
  annote =	{Keywords: incompressibility, edge modification problems, H-free graphs}
}
Document
Full Complexity Classification of the List Homomorphism Problem for Bounded-Treewidth Graphs

Authors: Karolina Okrasa, Marta Piecyk, and Paweł Rzążewski

Published in: LIPIcs, Volume 173, 28th Annual European Symposium on Algorithms (ESA 2020)


Abstract
A homomorphism from a graph G to a graph H is an edge-preserving mapping from V(G) to V(H). Let H be a fixed graph with possible loops. In the list homomorphism problem, denoted by LHom(H), we are given a graph G, in which every vertex v is assigned a list L(v) of vertices of H. We ask whether there exists a homomorphism h from G to H that respects the lists L, i.e., for every v ∈ V(G) it holds that h(v) ∈ L(v). The complexity dichotomy for LHom(H) was proven by Feder, Hell, and Huang [JGT 2003]. The authors showed that the problem is polynomial-time solvable if H belongs to the class called bi-arc graphs, and for all other graphs H it is NP-complete. We are interested in the complexity of the LHom(H) problem, parameterized by the treewidth of the input graph. This problem was investigated by Egri, Marx, and Rzążewski [STACS 2018], who obtained tight complexity bounds for the special case of reflexive graphs H, i.e., graphs in which every vertex has a loop. In this paper we extend and generalize their results to all relevant graphs H, i.e., those for which the LHom(H) problem is NP-hard. For every such H we find a constant k = k(H) such that the LHom(H) problem on instances G with n vertices and treewidth t (i) can be solved in time k^t ⋅ n^𝒪(1), provided that G is given along with a tree decomposition of width t, but (ii) cannot be solved in time (k-ε)^t ⋅ n^𝒪(1), for any ε > 0, unless the SETH fails. For some graphs H the value of k(H) is much smaller than the trivial upper bound, i.e., |V(H)|. Obtaining matching upper and lower bounds shows that the set of algorithmic tools that we have discovered cannot be extended in order to obtain faster algorithms for LHom(H) in bounded-treewidth graphs. Furthermore, neither the algorithm nor the proof of the lower bound is very specific to treewidth. We believe that they can be used for other variants of the LHom(H) problem, e.g. with different parameterizations.
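A brute-force Python sketch of the LHom(H) definition, trying every list-respecting assignment. It is exponential in |V(G)| and ignores treewidth entirely, whereas the paper pins down the optimal base k(H) of the treewidth dynamic program; the graphs and lists in the example are illustrative.

from itertools import product

def list_hom_exists(G_vertices, G_edges, H_edges, L):
    """Brute-force LHom(H): try every assignment h that respects the lists L
    and check that every edge of G maps to an edge of H (loops allowed in H)."""
    HE = {frozenset(e) for e in H_edges}          # a loop is frozenset({v})
    order = list(G_vertices)
    for choice in product(*(L[v] for v in order)):
        h = dict(zip(order, choice))
        if all(frozenset((h[u], h[w])) in HE for u, w in G_edges):
            return h
    return None

# Illustrative instance: H = K2 on {0, 1} (no loops), G = a path a-b-c,
# with lists forcing vertex 'a' to map to 0.
G_V, G_E = ['a', 'b', 'c'], [('a', 'b'), ('b', 'c')]
H_E = [(0, 1)]
L = {'a': [0], 'b': [0, 1], 'c': [0, 1]}
print(list_hom_exists(G_V, G_E, H_E, L))   # {'a': 0, 'b': 1, 'c': 0}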

Cite as

Karolina Okrasa, Marta Piecyk, and Paweł Rzążewski. Full Complexity Classification of the List Homomorphism Problem for Bounded-Treewidth Graphs. In 28th Annual European Symposium on Algorithms (ESA 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 173, pp. 74:1-74:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{okrasa_et_al:LIPIcs.ESA.2020.74,
  author =	{Okrasa, Karolina and Piecyk, Marta and Rz\k{a}\.{z}ewski, Pawe{\l}},
  title =	{{Full Complexity Classification of the List Homomorphism Problem for Bounded-Treewidth Graphs}},
  booktitle =	{28th Annual European Symposium on Algorithms (ESA 2020)},
  pages =	{74:1--74:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-162-7},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{173},
  editor =	{Grandoni, Fabrizio and Herman, Grzegorz and Sanders, Peter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2020.74},
  URN =		{urn:nbn:de:0030-drops-129402},
  doi =		{10.4230/LIPIcs.ESA.2020.74},
  annote =	{Keywords: list homomorphisms, fine-grained complexity, SETH, treewidth}
}
