Search Results

Documents authored by Can Esmer, Barış


Document
Generalized Graph Packing Problems Parameterized by Treewidth

Authors: Barış Can Esmer and Dániel Marx

Published in: LIPIcs, Volume 351, 33rd Annual European Symposium on Algorithms (ESA 2025)


Abstract
H-Packing is the problem of finding a maximum number of vertex-disjoint copies of H in a given graph G. H-Partition is the special case of finding a set of vertex-disjoint copies that cover each vertex of G exactly once. Our goal is to study these problems and some generalizations on bounded-treewidth graphs. The case of H being a triangle is well understood: given a tree decomposition of G having treewidth tw, the K₃-Packing problem can be solved in time 2^tw⋅ n^O(1), while Lokshtanov et al. [ACM Transactions on Algorithms 2018] showed, under the Strong Exponential-Time Hypothesis (SETH), that there is no (2-ε)^tw⋅ n^O(1) algorithm for any ε > 0 even for K₃-Partition. Similar results can be obtained for any other clique K_d with d ≥ 3. We provide generalizations in two directions:
- We consider a generalization of the problem where every vertex can be used at most c times for some c ≥ 1. When H is any clique K_d with d ≥ 3, we give upper and lower bounds showing that the optimal running time increases to (c+1)^tw⋅ n^O(1). We consider two variants, depending on whether a copy of H can be used multiple times in the packing.
- If H is not a clique, then the dependence of the running time on treewidth may not even be single exponential. Specifically, we show that if H is any fixed graph in which not every 2-connected component is a clique, then there is no 2^o(tw log tw)⋅ n^O(1) algorithm for H-Partition, assuming the Exponential-Time Hypothesis (ETH).
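As an editorial illustration of the capacitated variant in the first bullet above (each vertex usable at most c times, each copy of H used at most once), the following brute-force Python sketch spells out the problem statement; the function names are ours, and the paper's (c+1)^tw⋅ n^O(1) dynamic program over a tree decomposition is not reproduced here.

from itertools import combinations

# Brute-force reference for capacitated K_d-packing: choose as many distinct
# K_d-copies as possible so that every vertex lies in at most c chosen copies.
# Exponential time; intended only to illustrate the problem statement.

def find_kd_copies(adj, d):
    """All d-element vertex sets that induce a clique (adj: dict of neighbour sets)."""
    return [set(s) for s in combinations(sorted(adj), d)
            if all(v in adj[u] for u, v in combinations(s, 2))]

def max_capacitated_packing(adj, d, c):
    """Largest number of distinct K_d-copies with every vertex used at most c times."""
    copies = find_kd_copies(adj, d)
    for r in range(len(copies), 0, -1):          # try larger packings first
        for chosen in combinations(copies, r):
            load = {}
            for clique in chosen:
                for v in clique:
                    load[v] = load.get(v, 0) + 1
            if all(x <= c for x in load.values()):
                return r
    return 0

# Example: in K_4 only one triangle fits vertex-disjointly (c = 1),
# but two triangles fit once every vertex may be used twice (c = 2).
K4 = {1: {2, 3, 4}, 2: {1, 3, 4}, 3: {1, 2, 4}, 4: {1, 2, 3}}
assert max_capacitated_packing(K4, d=3, c=1) == 1
assert max_capacitated_packing(K4, d=3, c=2) == 2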

Cite as

Barış Can Esmer and Dániel Marx. Generalized Graph Packing Problems Parameterized by Treewidth. In 33rd Annual European Symposium on Algorithms (ESA 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 351, pp. 3:1-3:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{canesmer_et_al:LIPIcs.ESA.2025.3,
  author =	{Can Esmer, Bar{\i}\c{s} and Marx, D\'{a}niel},
  title =	{{Generalized Graph Packing Problems Parameterized by Treewidth}},
  booktitle =	{33rd Annual European Symposium on Algorithms (ESA 2025)},
  pages =	{3:1--3:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-395-9},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{351},
  editor =	{Benoit, Anne and Kaplan, Haim and Wild, Sebastian and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2025.3},
  URN =		{urn:nbn:de:0030-drops-244713},
  doi =		{10.4230/LIPIcs.ESA.2025.3},
  annote =	{Keywords: Graph Packing, Graph Partitioning, Parameterized Complexity, Treewidth, Pathwidth, pw-SETH, Single-Exponential Lower Bound, Slightly Superexponential Lower Bound}
}
Document
Track A: Algorithms, Complexity and Games
Sampling with a Black Box: Faster Parameterized Approximation Algorithms for Vertex Deletion Problems

Authors: Barış Can Esmer and Ariel Kulik

Published in: LIPIcs, Volume 334, 52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025)


Abstract
In this paper, we present Sampling with a Black Box, a unified framework for the design of parameterized approximation algorithms for vertex deletion problems (e.g., Vertex Cover, Feedback Vertex Set, etc.). The framework relies on two components:
- A Sampling Step: a polynomial-time randomized algorithm that, given a graph G, returns a random vertex v such that the optimum of G⧵{v} is smaller by 1 than the optimum of G, with some prescribed probability q. We show that such algorithms exist for multiple vertex deletion problems.
- A Black Box algorithm, which is either an exact parameterized algorithm, a polynomial-time approximation algorithm, or a parameterized approximation algorithm.
The framework combines these two components: the sampling step is applied iteratively to remove vertices from the input graph, and the solution is then extended using the black box algorithm. The process is repeated sufficiently many times so that the target approximation ratio is attained with constant probability. We use the technique to derive parameterized approximation algorithms for several vertex deletion problems, including Feedback Vertex Set, d-Hitting Set, and 𝓁-Path Vertex Cover. In particular, for every approximation ratio 1 < β < 2, we attain a parameterized β-approximation for Feedback Vertex Set that is faster than the parameterized β-approximation of [Jana, Lokshtanov, Mandal, Rai and Saurabh, MFCS 2023]. Furthermore, our algorithms are always faster than the algorithms obtained using Fidelity Preserving Transformations [Fellows, Kulik, Rosamond, and Shachnai, JCSS 2018].
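Here is a minimal editorial sketch of the iterate-then-extend loop described above, assuming a networkx-style graph object and two caller-supplied routines sample_step (the sampling step) and black_box (the black box algorithm); the names and the choice of the parameters k and num_rounds are illustrative placeholders, not the paper's API.

def sample_and_solve(G, k, sample_step, black_box, num_rounds):
    """Run the sampling step k times to peel off vertices, solve the remaining
    graph with the black box, and keep the smallest solution over all rounds."""
    best = None
    for _ in range(num_rounds):              # enough rounds for constant success probability
        H = G.copy()                         # assumes a networkx-style Graph
        removed = []
        for _ in range(k):
            v = sample_step(H)               # hits an optimal vertex with probability >= q
            removed.append(v)
            H.remove_node(v)
        candidate = removed + list(black_box(H))   # removed vertices plus a solution for H
        if best is None or len(candidate) < len(best):
            best = candidate
    return best

For vertex deletion problems the set removed ∪ black_box(H) is always feasible for G, since deleting it from G leaves the same graph as deleting black_box(H) from H; the repetition is only needed to boost the probability that enough sampling steps hit optimal vertices.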

Cite as

Barış Can Esmer and Ariel Kulik. Sampling with a Black Box: Faster Parameterized Approximation Algorithms for Vertex Deletion Problems. In 52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025). Leibniz International Proceedings in Informatics (LIPIcs), Volume 334, pp. 39:1-39:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2025)


BibTeX

@InProceedings{canesmer_et_al:LIPIcs.ICALP.2025.39,
  author =	{Can Esmer, Bar{\i}\c{s} and Kulik, Ariel},
  title =	{{Sampling with a Black Box: Faster Parameterized Approximation Algorithms for Vertex Deletion Problems}},
  booktitle =	{52nd International Colloquium on Automata, Languages, and Programming (ICALP 2025)},
  pages =	{39:1--39:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-372-0},
  ISSN =	{1868-8969},
  year =	{2025},
  volume =	{334},
  editor =	{Censor-Hillel, Keren and Grandoni, Fabrizio and Ouaknine, Jo\"{e}l and Puppis, Gabriele},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2025.39},
  URN =		{urn:nbn:de:0030-drops-234165},
  doi =		{10.4230/LIPIcs.ICALP.2025.39},
  annote =	{Keywords: Parameterized Approximation Algorithms, Random Sampling}
}
Document
List Homomorphisms by Deleting Edges and Vertices: Tight Complexity Bounds for Bounded-Treewidth Graphs

Authors: Barış Can Esmer, Jacob Focke, Dániel Marx, and Paweł Rzążewski

Published in: LIPIcs, Volume 308, 32nd Annual European Symposium on Algorithms (ESA 2024)


Abstract
The goal of this paper is to investigate a family of optimization problems arising from list homomorphisms, and to understand what the best possible algorithms are if we restrict the problem to bounded-treewidth graphs. Given graphs G, H, and lists L(v) ⊆ V(H) for every v ∈ V(G), a list homomorphism from (G,L) to H is a function f:V(G) → V(H) that preserves the edges (i.e., uv ∈ E(G) implies f(u)f(v) ∈ E(H)) and respects the lists (i.e., f(v) ∈ L(v)). The graph H may have loops. For a fixed H, the input of the optimization problem LHomVD(H) is a graph G with lists L(v), and the task is to find a set X of vertices having minimum size such that (G-X,L) has a list homomorphism to H. We define analogously the edge-deletion variant LHomED(H), where we have to delete as few edges as possible from G to obtain a graph that has a list homomorphism to H. This expressive family of problems includes members that are essentially equivalent to fundamental problems such as Vertex Cover, Max Cut, Odd Cycle Transversal, and Edge/Vertex Multiway Cut. For both variants, we first characterize those graphs H that make the problem polynomial-time solvable and show that the problem is NP-hard for every other fixed H. Second, as our main result, we determine, for every graph H for which the problem is NP-hard, the smallest possible constant c_H such that the problem can be solved in time c_H^t⋅ n^{𝒪(1)} if a tree decomposition of G having width t is given in the input. Let i(H) be the maximum size of a set of vertices in H that have pairwise incomparable neighborhoods. For the vertex-deletion variant LHomVD(H), we show that the smallest possible constant is i(H)+1 for every H:
- Given a tree decomposition of width t of G, LHomVD(H) can be solved in time (i(H)+1)^t⋅ n^{𝒪(1)}.
- For any ε > 0 and H, an (i(H)+1-ε)^t⋅ n^{𝒪(1)} algorithm would violate the Strong Exponential-Time Hypothesis (SETH).
The situation is more complex for the edge-deletion version. For every H, one can solve LHomED(H) in time i(H)^t⋅ n^{𝒪(1)} if a tree decomposition of width t is given. However, the existence of a specific type of decomposition of H shows that there are graphs H where LHomED(H) can be solved significantly more efficiently, and the best possible constant can be arbitrarily smaller than i(H). Nevertheless, we determine this best possible constant and (assuming the SETH) prove tight bounds for every fixed H.
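To make the two central definitions above concrete, here is a short editorial Python sketch (helper names are ours) that checks whether a map is a list homomorphism and computes i(H) by brute force; graphs are dicts mapping each vertex to its set of neighbours, with a loop at v encoded by v ∈ adj[v].

from itertools import combinations

def is_list_homomorphism(G, H, L, f):
    """f: V(G) -> V(H) respects the lists (f[v] in L[v]) and preserves edges
    (uv in E(G) implies f(u)f(v) in E(H))."""
    return (all(f[v] in L[v] for v in G)
            and all(f[v] in H[f[u]] for u in G for v in G[u]))

def i_value(H):
    """i(H): maximum size of a set of vertices of H whose neighbourhoods are
    pairwise incomparable (neither contained in the other). Brute force."""
    def incomparable(u, v):
        return not (H[u] <= H[v]) and not (H[v] <= H[u])
    best = 1 if H else 0
    for r in range(2, len(H) + 1):
        for S in combinations(H, r):
            if all(incomparable(u, v) for u, v in combinations(S, 2)):
                best = r
    return best

# Example: for H = K_2 without loops the neighbourhoods {b} and {a} are
# incomparable, so i(H) = 2.
K2 = {'a': {'b'}, 'b': {'a'}}
assert i_value(K2) == 2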

Cite as

Barış Can Esmer, Jacob Focke, Dániel Marx, and Paweł Rzążewski. List Homomorphisms by Deleting Edges and Vertices: Tight Complexity Bounds for Bounded-Treewidth Graphs. In 32nd Annual European Symposium on Algorithms (ESA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 308, pp. 39:1-39:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{canesmer_et_al:LIPIcs.ESA.2024.39,
  author =	{Can Esmer, Bar{\i}\c{s} and Focke, Jacob and Marx, D\'{a}niel and Rz\k{a}\.{z}ewski, Pawe{\l}},
  title =	{{List Homomorphisms by Deleting Edges and Vertices: Tight Complexity Bounds for Bounded-Treewidth Graphs}},
  booktitle =	{32nd Annual European Symposium on Algorithms (ESA 2024)},
  pages =	{39:1--39:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-338-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{308},
  editor =	{Chan, Timothy and Fischer, Johannes and Iacono, John and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2024.39},
  URN =		{urn:nbn:de:0030-drops-211103},
  doi =		{10.4230/LIPIcs.ESA.2024.39},
  annote =	{Keywords: Graph Homomorphism, List Homomorphism, Vertex Deletion, Edge Deletion, Multiway Cut, Parameterized Complexity, Tight Bounds, Treewidth, SETH}
}
Document
Track A: Algorithms, Complexity and Games
Fundamental Problems on Bounded-Treewidth Graphs: The Real Source of Hardness

Authors: Barış Can Esmer, Jacob Focke, Dániel Marx, and Paweł Rzążewski

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
It is known for many algorithmic problems that if a tree decomposition of width t is given in the input, then the problem can be solved with exponential dependence on t. A line of research initiated by Lokshtanov, Marx, and Saurabh [SODA 2011] produced lower bounds showing that in many cases known algorithms already achieve the best possible exponential dependence on t, assuming the Strong Exponential-Time Hypothesis (SETH). The main message of this paper is that the same lower bounds can already be obtained in a much more restricted setting: informally, a graph consisting of a block of t vertices connected to components of constant size already has the same hardness as a general tree decomposition of width t. Formally, a (σ,δ)-hub is a set Q of vertices such that every component of G-Q has size at most σ and is adjacent to at most δ vertices of Q. We explore whether the known tight lower bounds parameterized by the width of the given tree decomposition remain valid if we parameterize by the size of the given hub instead.
- For every ε > 0, there are σ,δ > 0 such that Independent Set (equivalently Vertex Cover) cannot be solved in time (2-ε)^p⋅ n, even if a (σ,δ)-hub of size p is given in the input, assuming the SETH. This matches the earlier tight lower bounds parameterized by the width of the tree decomposition. Similar tight bounds are obtained for Odd Cycle Transversal, Max Cut, q-Coloring, and the edge- and vertex-deletion versions of q-Coloring.
- For every ε > 0, there are σ,δ > 0 such that △-Partition cannot be solved in time (2-ε)^p⋅ n, even if a (σ,δ)-hub of size p is given in the input, assuming the Set Cover Conjecture (SCC). In fact, we prove that this statement is equivalent to the SCC, so it is unlikely that it could be proved assuming the SETH.
- For Dominating Set, we can prove a non-tight lower bound ruling out (2-ε)^p⋅ n^{𝒪(1)} algorithms, assuming either the SETH or the SCC, but this does not match the 3^p⋅ n^{𝒪(1)} upper bound.
Thus our results reveal that, for many problems, the research on lower bounds in terms of treewidth was never really about tree decompositions: the real source of hardness is a much simpler structure. Additionally, we study whether the same lower bounds can be obtained when σ and δ are fixed universal constants (not depending on ε). We show that lower bounds of this form are possible for Max Cut and the edge-deletion version of q-Coloring, under the Max 3-Sat Hypothesis (M3SH). However, no such lower bounds are possible for Independent Set, Odd Cycle Transversal, and the vertex-deletion version of q-Coloring: better-than-brute-force algorithms are possible for every fixed (σ,δ).
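As an editorial aid, the following short Python sketch (names ours) checks the (σ,δ)-hub condition stated above: every connected component of G-Q has at most σ vertices and is adjacent to at most δ vertices of Q.

def is_hub(G, Q, sigma, delta):
    """G: dict mapping each vertex to its set of neighbours; Q: set of vertices.
    Returns True iff every component of G - Q has size <= sigma and has at most
    delta neighbours inside Q."""
    Q = set(Q)
    remaining = set(G) - Q
    seen = set()
    for start in remaining:
        if start in seen:
            continue
        component, stack = set(), [start]        # DFS within G - Q
        while stack:
            v = stack.pop()
            if v in component:
                continue
            component.add(v)
            stack.extend(w for w in G[v] if w in remaining and w not in component)
        seen |= component
        boundary = {q for v in component for q in G[v] if q in Q}
        if len(component) > sigma or len(boundary) > delta:
            return False
    return True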

Cite as

Barış Can Esmer, Jacob Focke, Dániel Marx, and Paweł Rzążewski. Fundamental Problems on Bounded-Treewidth Graphs: The Real Source of Hardness. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 34:1-34:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{canesmer_et_al:LIPIcs.ICALP.2024.34,
  author =	{Can Esmer, Bar{\i}\c{s} and Focke, Jacob and Marx, D\'{a}niel and Rz\k{a}\.{z}ewski, Pawe{\l}},
  title =	{{Fundamental Problems on Bounded-Treewidth Graphs: The Real Source of Hardness}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{34:1--34:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.34},
  URN =		{urn:nbn:de:0030-drops-201772},
  doi =		{10.4230/LIPIcs.ICALP.2024.34},
  annote =	{Keywords: Parameterized Complexity, Tight Bounds, Hub, Treewidth, Strong Exponential Time Hypothesis, Vertex Coloring, Vertex Deletion, Edge Deletion, Triangle Packing, Triangle Partition, Set Cover Hypothesis, Dominating Set}
}