Search Results

Documents authored by Nichterlein, André


Document
Destroying Densest Subgraphs Is Hard

Authors: Cristina Bazgan, André Nichterlein, and Sofia Vazquez Alferez

Published in: LIPIcs, Volume 294, 19th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2024)


Abstract
We analyze the computational complexity of the following two problems, called Bounded-Density Edge Deletion and Bounded-Density Vertex Deletion: Given a graph G, a budget k, and a target density τ_ρ, are there k edges (respectively, k vertices) whose removal from G results in a graph where the densest subgraph has density at most τ_ρ? Here, the density of a graph is the number of its edges divided by the number of its vertices. We prove that both problems are polynomial-time solvable on trees and cliques but are NP-complete on planar bipartite graphs and split graphs. From a parameterized point of view, we show that both problems are fixed-parameter tractable with respect to the vertex cover number but W[1]-hard with respect to the solution size. Furthermore, we prove that Bounded-Density Edge Deletion is W[1]-hard with respect to the feedback edge number, demonstrating that the problem remains hard on very sparse graphs.
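
As an editorial illustration (not part of the paper), the problem statement can be checked directly on tiny instances. The following Python sketch brute-forces Bounded-Density Edge Deletion; since deleting edges never increases the density of any subgraph, trying removals of exactly min(k, |E|) edges suffices. All names are illustrative.

from itertools import combinations

def max_density(vertices, edges):
    # Density of a densest subgraph: maximum over all non-empty vertex
    # subsets of (number of induced edges) / (number of vertices).
    best = 0.0
    for r in range(1, len(vertices) + 1):
        for sub in combinations(vertices, r):
            s = set(sub)
            m = sum(1 for u, v in edges if u in s and v in s)
            best = max(best, m / r)
    return best

def bounded_density_edge_deletion(vertices, edges, k, tau):
    # Are there k edges whose removal leaves densest-subgraph density
    # at most tau? Exhaustive search; exponential, for intuition only.
    for removed in combinations(edges, min(k, len(edges))):
        rem = set(removed)
        if max_density(vertices, [e for e in edges if e not in rem]) <= tau:
            return True
    return False

For example, a triangle has density 1, and deleting any single edge lowers the densest-subgraph density to 2/3, so bounded_density_edge_deletion([1, 2, 3], [(1, 2), (2, 3), (1, 3)], 1, 0.7) returns True.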

Cite as

Cristina Bazgan, André Nichterlein, and Sofia Vazquez Alferez. Destroying Densest Subgraphs Is Hard. In 19th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 294, pp. 6:1-6:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{bazgan_et_al:LIPIcs.SWAT.2024.6,
  author =	{Bazgan, Cristina and Nichterlein, Andr\'{e} and Vazquez Alferez, Sofia},
  title =	{{Destroying Densest Subgraphs Is Hard}},
  booktitle =	{19th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2024)},
  pages =	{6:1--6:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-318-8},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{294},
  editor =	{Bodlaender, Hans L.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SWAT.2024.6},
  URN =		{urn:nbn:de:0030-drops-200461},
  doi =		{10.4230/LIPIcs.SWAT.2024.6},
  annote =	{Keywords: Graph modification problems, NP-hardness, fixed-parameter tractability, W-hardness, special graph classes}
}
Document
Graph Clustering Problems Under the Lens of Parameterized Local Search

Authors: Jaroslav Garvardt, Nils Morawietz, André Nichterlein, and Mathias Weller

Published in: LIPIcs, Volume 285, 18th International Symposium on Parameterized and Exact Computation (IPEC 2023)


Abstract
Cluster Editing is the problem of finding the minimum number of edge-modifications that transform a given graph G into a cluster graph G', that is, each connected component of G' is a clique. Similarly, in the Cluster Deletion problem, we further restrict the sought cluster graph G' to contain only edges that are also present in G. In this work, we consider the parameterized complexity of a local search variant for both problems: LS Cluster Deletion and LS Cluster Editing. Herein, the input also comprises an integer k and a partition 𝒞 of the vertex set of G that describes an initial cluster graph G^*, and we are to decide whether the "k-move-neighborhood" of G^* contains a cluster graph G' that is "better" (uses fewer modifications) than G^*. Roughly speaking, two cluster graphs G₁ and G₂ are k-move-neighbors if G₁ can be obtained from G₂ by moving at most k vertices to different connected components. We consider parameterizations by k + 𝓁 for some natural parameters 𝓁, such as the number of clusters in 𝒞, the size of a largest cluster in 𝒞, or the cluster-vertex-deletion number (cvd) of G. Our main lower-bound results are that LS Cluster Editing is W[1]-hard when parameterized by k even if 𝒞 has size two and that both LS Cluster Deletion and LS Cluster Editing are W[1]-hard when parameterized by k + 𝓁, where 𝓁 is the size of the largest cluster of 𝒞. On the positive side, we show that both problems admit an algorithm that runs in k^{𝒪(k)} ⋅ cvd^{3k} ⋅ n^{𝒪(1)} time and either finds a better cluster graph or correctly outputs that there is no better cluster graph in the k-move-neighborhood of the initial cluster graph. As an intermediate result, we also obtain an algorithm that solves Cluster Deletion in cvd^{cvd} ⋅ n^{𝒪(1)} time.
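
To make the objective concrete, the following sketch (an editorial illustration assuming a partition given as a list of vertex sets) computes the number of modifications that the cluster graph induced by a partition uses; a k-move neighbor of G^* is then obtained by re-assigning at most k vertices to other, possibly new, clusters and re-evaluating this cost. For LS Cluster Deletion the insertion count must additionally be zero.

from itertools import combinations

def editing_cost(edges, clustering):
    # Edits turning G into the cluster graph induced by the partition:
    # insert every missing intra-cluster edge, delete every edge that
    # runs between two different clusters.
    edge_set = {frozenset(e) for e in edges}
    cluster_of = {v: i for i, c in enumerate(clustering) for v in c}
    insertions = sum(1 for c in clustering
                     for u, v in combinations(sorted(c), 2)
                     if frozenset((u, v)) not in edge_set)
    deletions = sum(1 for e in edge_set
                    if len({cluster_of[v] for v in e}) == 2)
    return insertions + deletions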

Cite as

Jaroslav Garvardt, Nils Morawietz, André Nichterlein, and Mathias Weller. Graph Clustering Problems Under the Lens of Parameterized Local Search. In 18th International Symposium on Parameterized and Exact Computation (IPEC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 285, pp. 20:1-20:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{garvardt_et_al:LIPIcs.IPEC.2023.20,
  author =	{Garvardt, Jaroslav and Morawietz, Nils and Nichterlein, Andr\'{e} and Weller, Mathias},
  title =	{{Graph Clustering Problems Under the Lens of Parameterized Local Search}},
  booktitle =	{18th International Symposium on Parameterized and Exact Computation (IPEC 2023)},
  pages =	{20:1--20:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-305-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{285},
  editor =	{Misra, Neeldhara and Wahlstr\"{o}m, Magnus},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2023.20},
  URN =		{urn:nbn:de:0030-drops-194391},
  doi =		{10.4230/LIPIcs.IPEC.2023.20},
  annote =	{Keywords: parameterized local search, permissive local search, FPT, W\lbrack1\rbrack-hardness}
}
Document
Correlating Theory and Practice in Finding Clubs and Plexes

Authors: Aleksander Figiel, Tomohiro Koana, André Nichterlein, and Niklas Wünsche

Published in: LIPIcs, Volume 274, 31st Annual European Symposium on Algorithms (ESA 2023)


Abstract
For solving NP-hard problems there is often a huge gap between theoretical guarantees and observed running times on real-world instances. As a first step towards tackling this issue, we propose an approach to quantify the correlation between theoretical and observed running times. We use two NP-hard problems related to finding large "cliquish" subgraphs in a given graph to demonstrate this measure. More precisely, we focus on finding maximum s-clubs and s-plexes, i.e., graphs of diameter s and graphs where each vertex is adjacent to all but s vertices. Preprocessing based on Turing kernelization is a standard tool to tackle these problems, especially on sparse graphs. We provide a parameterized analysis of the Turing kernelizations and demonstrate their usefulness in practice. Moreover, we demonstrate that our measure indeed captures the correlation between these new theoretical running times and the observed ones.
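
For a given vertex set, both notions are easy to verify, which may help make the definitions concrete. A minimal Python sketch, using the common convention that in an s-plex every vertex has at least |S| - s neighbours inside the set (conventions differ slightly across the literature); adjacency is assumed to be a dict mapping each vertex to the set of its neighbours:

from collections import deque

def is_s_plex(adj, S, s):
    # Every vertex of S is adjacent to all but at most s vertices of S.
    return all(len(adj[v] & S) >= len(S) - s for v in S)

def is_s_club(adj, S, s):
    # G[S] has diameter at most s: BFS inside S from every vertex.
    for src in S:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for w in adj[u] & S:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        if len(dist) < len(S) or max(dist.values()) > s:
            return False
    return True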

Cite as

Aleksander Figiel, Tomohiro Koana, André Nichterlein, and Niklas Wünsche. Correlating Theory and Practice in Finding Clubs and Plexes. In 31st Annual European Symposium on Algorithms (ESA 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 274, pp. 47:1-47:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{figiel_et_al:LIPIcs.ESA.2023.47,
  author =	{Figiel, Aleksander and Koana, Tomohiro and Nichterlein, Andr\'{e} and W\"{u}nsche, Niklas},
  title =	{{Correlating Theory and Practice in Finding Clubs and Plexes}},
  booktitle =	{31st Annual European Symposium on Algorithms (ESA 2023)},
  pages =	{47:1--47:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-295-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{274},
  editor =	{G{\o}rtz, Inge Li and Farach-Colton, Martin and Puglisi, Simon J. and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2023.47},
  URN =		{urn:nbn:de:0030-drops-187000},
  doi =		{10.4230/LIPIcs.ESA.2023.47},
  annote =	{Keywords: Preprocessing, Turing kernelization, Pearson correlation coefficient}
}
Document
Parameterized Lower Bounds for Problems in P via Fine-Grained Cross-Compositions

Authors: Klaus Heeger, André Nichterlein, and Rolf Niedermeier

Published in: LIPIcs, Volume 254, 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)


Abstract
We provide a general framework to exclude parameterized running times of the form O(l^β + n^γ) for problems that have polynomial running time lower bounds under hypotheses from fine-grained complexity. Our framework is based on cross-compositions from parameterized complexity. We (conditionally) exclude running times of the form O(l^{γ/(γ-1) - ε} + n^γ) for any 1 < γ < 2 and ε > 0 for the following problems:
- Longest Common (Increasing) Subsequence: Given two length-n strings over an alphabet Σ (over ℕ) and l ∈ ℕ, is there a common (increasing) subsequence of length l in both strings?
- Discrete Fréchet Distance: Given two lists of n points each and k ∈ ℕ, is the Fréchet distance of the lists at most k? Here l is the maximum number of points by which one list is ahead of the other in an optimum traversal.
- Planar Motion Planning: Given a set of n non-intersecting axis-parallel line segment obstacles in the plane and a line segment robot (called rod), can the rod be moved to a specified target without touching any obstacles? Here l is the maximum number of segments any segment has in its vicinity.
Moreover, we exclude running times O(l^{2γ/(γ-1) - ε} + n^γ) for any 1 < γ < 3 and ε > 0 for:
- Negative Triangle: Given an edge-weighted graph with n vertices, is there a triangle whose sum of edge-weights is negative? Here l is the order of a maximum connected component.
- Triangle Collection: Given a vertex-colored graph with n vertices, is there for each triple of colors a triangle whose vertices have these three colors? Here l is the order of a maximum connected component.
- 2nd Shortest Path: Given an n-vertex edge-weighted digraph, vertices s and t, and k ∈ ℕ, has the second shortest s-t-path length at most k? Here l is the directed feedback vertex set number.
Except for 2nd Shortest Path all these running time bounds are tight, that is, algorithms with running time O(l^{γ/(γ-1)} + n^γ) for any 1 < γ < 2 and O(l^{2γ/(γ-1)} + n^γ) for any 1 < γ < 3, respectively, are known. Our running time lower bounds also imply lower bounds on kernelization algorithms for these problems.
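
To make the excluded trade-offs concrete (a direct instantiation of the bounds above, not an additional claim): for γ = 3/2 one has γ/(γ-1) = 3, so running times O(l^{3-ε} + n^{1.5}) are excluded for the first group of problems while O(l³ + n^{1.5})-time algorithms exist; for the second group, 2γ/(γ-1) = 6, excluding running times O(l^{6-ε} + n^{1.5}).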

Cite as

Klaus Heeger, André Nichterlein, and Rolf Niedermeier. Parameterized Lower Bounds for Problems in P via Fine-Grained Cross-Compositions. In 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 254, pp. 35:1-35:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{heeger_et_al:LIPIcs.STACS.2023.35,
  author =	{Heeger, Klaus and Nichterlein, Andr\'{e} and Niedermeier, Rolf},
  title =	{{Parameterized Lower Bounds for Problems in P via Fine-Grained Cross-Compositions}},
  booktitle =	{40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)},
  pages =	{35:1--35:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-266-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{254},
  editor =	{Berenbrink, Petra and Bouyer, Patricia and Dawar, Anuj and Kant\'{e}, Mamadou Moustapha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2023.35},
  URN =		{urn:nbn:de:0030-drops-176876},
  doi =		{10.4230/LIPIcs.STACS.2023.35},
  annote =	{Keywords: FPT in P, Kernelization, Decomposition}
}
Document
Applying a Cut-Based Data Reduction Rule for Weighted Cluster Editing in Polynomial Time

Authors: Hjalmar Schulz, André Nichterlein, Rolf Niedermeier, and Christopher Weyand

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
Given an undirected graph, the task in Cluster Editing is to insert and delete a minimum number of edges to obtain a cluster graph, that is, a disjoint union of cliques. In the weighted variant, each vertex pair comes with a weight and the edge modifications have to be of minimum overall weight. In this work, we provide the first polynomial-time algorithm to apply the following data reduction rule of Böcker et al. [Algorithmica, 2011] for Weighted Cluster Editing: For a graph G = (V,E), merge a vertex set S ⊆ V into a single vertex if the minimum cut of G[S] is at least the combined cost of inserting all missing edges within G[S] plus the cost of cutting all edges from S to the rest of the graph. Complementing our theoretical findings, we experimentally demonstrate the effectiveness of the data reduction rule, shrinking real-world test instances from the PACE Challenge 2021 by around 24%, while previous heuristic implementations of the data reduction rule achieve only 8%.
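
Checking the rule's condition for a given set S is conceptually simple (the paper's contribution is applying the rule, that is, finding suitable sets S, in polynomial time). A sketch using networkx, where insert_cost is an assumed helper returning the insertion cost of a missing vertex pair and deletion costs are stored in the edge attribute 'weight':

import networkx as nx
from itertools import combinations

def merge_rule_applies(G, S, insert_cost):
    # Merge condition: min cut of G[S] >= cost of inserting all missing
    # edges within G[S] + cost of cutting all edges from S to the rest.
    # S should be a set of vertices of the weighted nx.Graph G.
    H = G.subgraph(S)
    if len(S) < 2 or not nx.is_connected(H):
        return False
    cut_value, _ = nx.stoer_wagner(H)  # global minimum cut of G[S]
    missing = sum(insert_cost(u, v)
                  for u, v in combinations(sorted(S), 2)
                  if not G.has_edge(u, v))
    leaving = sum(d.get('weight', 1)
                  for u, v, d in G.edges(S, data=True)
                  if (u in S) != (v in S))
    return cut_value >= missing + leaving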

Cite as

Hjalmar Schulz, André Nichterlein, Rolf Niedermeier, and Christopher Weyand. Applying a Cut-Based Data Reduction Rule for Weighted Cluster Editing in Polynomial Time. In 17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 25:1-25:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{schulz_et_al:LIPIcs.IPEC.2022.25,
  author =	{Schulz, Hjalmar and Nichterlein, Andr\'{e} and Niedermeier, Rolf and Weyand, Christopher},
  title =	{{Applying a Cut-Based Data Reduction Rule for Weighted Cluster Editing in Polynomial Time}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{25:1--25:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.25},
  URN =		{urn:nbn:de:0030-drops-173816},
  doi =		{10.4230/LIPIcs.IPEC.2022.25},
  annote =	{Keywords: Correlation Clustering, Minimum Cut, Maximum s-t-Flow}
}
Document
There and Back Again: On Applying Data Reduction Rules by Undoing Others

Authors: Aleksander Figiel, Vincent Froese, André Nichterlein, and Rolf Niedermeier

Published in: LIPIcs, Volume 244, 30th Annual European Symposium on Algorithms (ESA 2022)


Abstract
Data reduction rules are an established method in the algorithmic toolbox for tackling computationally challenging problems. A data reduction rule is a polynomial-time algorithm that, given a problem instance as input, outputs an equivalent, typically smaller instance of the same problem. Applying data reduction rules during the preprocessing of problem instances often allows one to considerably shrink their size, or even to solve them directly. Commonly, these data reduction rules are applied exhaustively and in some fixed order to obtain irreducible instances. It has often been observed that, by changing the order of the rules, different irreducible instances can be obtained. We propose to "undo" data reduction rules on irreducible instances, making them larger, and then to apply data reduction rules again to shrink them. We show that this somewhat counter-intuitive approach can lead to significantly smaller irreducible instances. The process of undoing data reduction rules is not limited to "rolling back" data reduction rules applied to the instance during preprocessing. Instead, we formulate so-called backward rules, which essentially undo a data reduction rule without using any information about which data reduction rules were applied previously. In particular, using Vertex Cover as a running example, we propose two methods that apply backward rules to shrink the instances further. Our experiments show that, in this way, smaller irreducible instances can be computed for real-world graphs from the SNAP and DIMACS datasets.
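
As a simplified, hypothetical example of the kind of rule in play, consider the classic degree-one rule for Vertex Cover (if v has exactly one neighbour u, some optimal cover contains u). A backward rule would re-insert previously reduced structure, without knowing the reduction history, so that other rules can fire again; the sketch below shows only the forward direction:

def degree_one_rule(adj):
    # adj: dict mapping each vertex to the set of its neighbours.
    # Repeatedly take the neighbour of a degree-one vertex into the
    # cover and delete both vertices. (Isolated vertices could be
    # deleted as well; omitted for brevity.)
    adj = {v: set(ns) for v, ns in adj.items()}
    cover = set()
    changed = True
    while changed:
        changed = False
        for v in list(adj):
            if v in adj and len(adj[v]) == 1:
                (u,) = adj[v]
                cover.add(u)
                for w in adj.pop(u):
                    if w != v:
                        adj[w].discard(u)
                adj.pop(v, None)
                changed = True
    return adj, cover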

Cite as

Aleksander Figiel, Vincent Froese, André Nichterlein, and Rolf Niedermeier. There and Back Again: On Applying Data Reduction Rules by Undoing Others. In 30th Annual European Symposium on Algorithms (ESA 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 244, pp. 53:1-53:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{figiel_et_al:LIPIcs.ESA.2022.53,
  author =	{Figiel, Aleksander and Froese, Vincent and Nichterlein, Andr\'{e} and Niedermeier, Rolf},
  title =	{{There and Back Again: On Applying Data Reduction Rules by Undoing Others}},
  booktitle =	{30th Annual European Symposium on Algorithms (ESA 2022)},
  pages =	{53:1--53:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-247-1},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{244},
  editor =	{Chechik, Shiri and Navarro, Gonzalo and Rotenberg, Eva and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2022.53},
  URN =		{urn:nbn:de:0030-drops-169914},
  doi =		{10.4230/LIPIcs.ESA.2022.53},
  annote =	{Keywords: Kernelization, Preprocessing, Vertex Cover}
}
Document
Covering Many (Or Few) Edges with k Vertices in Sparse Graphs

Authors: Tomohiro Koana, Christian Komusiewicz, André Nichterlein, and Frank Sommer

Published in: LIPIcs, Volume 219, 39th International Symposium on Theoretical Aspects of Computer Science (STACS 2022)


Abstract
We study the following two fixed-cardinality optimization problems (a maximization and a minimization variant). For a fixed α between zero and one, we are given a graph and two numbers k ∈ ℕ and t ∈ ℚ. The task is to find a vertex subset S of exactly k vertices that has value at least (resp. at most for minimization) t. Here, the value of a vertex set S is α times the number of edges with exactly one endpoint in S plus 1-α times the number of edges with both endpoints in S. These two problems generalize many prominent graph problems, such as Densest k-Subgraph, Sparsest k-Subgraph, Partial Vertex Cover, and Max (k,n-k)-Cut. In this work, we complete the picture of their parameterized complexity on several types of sparse graphs that are described by structural parameters. In particular, we provide kernelization algorithms and kernel lower bounds for these problems. A somewhat surprising consequence of our kernelizations is that Partial Vertex Cover and Max (k,n-k)-Cut not only behave in the same way but that the kernels for both problems can be obtained by the same algorithms.
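
The value function is simple to state in code, which may clarify how the generalization captures the named problems: with α = 0 the value counts the edges inside S (Densest/Sparsest k-Subgraph), with α = 1 it counts the edges crossing the cut (Max (k,n-k)-Cut), and with α = 1/2 it is half the number of edges with at least one endpoint in S (Partial Vertex Cover). A minimal sketch:

def value(edges, S, alpha):
    # alpha * (#edges with exactly one endpoint in S)
    #   + (1 - alpha) * (#edges with both endpoints in S)
    cut = sum(1 for u, v in edges if (u in S) != (v in S))
    inside = sum(1 for u, v in edges if u in S and v in S)
    return alpha * cut + (1 - alpha) * inside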

Cite as

Tomohiro Koana, Christian Komusiewicz, André Nichterlein, and Frank Sommer. Covering Many (Or Few) Edges with k Vertices in Sparse Graphs. In 39th International Symposium on Theoretical Aspects of Computer Science (STACS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 219, pp. 42:1-42:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{koana_et_al:LIPIcs.STACS.2022.42,
  author =	{Koana, Tomohiro and Komusiewicz, Christian and Nichterlein, Andr\'{e} and Sommer, Frank},
  title =	{{Covering Many (Or Few) Edges with k Vertices in Sparse Graphs}},
  booktitle =	{39th International Symposium on Theoretical Aspects of Computer Science (STACS 2022)},
  pages =	{42:1--42:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-222-8},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{219},
  editor =	{Berenbrink, Petra and Monmege, Benjamin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2022.42},
  URN =		{urn:nbn:de:0030-drops-158525},
  doi =		{10.4230/LIPIcs.STACS.2022.42},
  annote =	{Keywords: Parameterized Complexity, Kernelization, Partial Vertex Cover, Densest k-Subgraph, Max (k,n-k)-Cut, Degeneracy}
}
Document
The PACE 2021 Parameterized Algorithms and Computational Experiments Challenge: Cluster Editing

Authors: Leon Kellerhals, Tomohiro Koana, André Nichterlein, and Philipp Zschoche

Published in: LIPIcs, Volume 214, 16th International Symposium on Parameterized and Exact Computation (IPEC 2021)


Abstract
The Parameterized Algorithms and Computational Experiments challenge (PACE) 2021 was devoted to engineering algorithms for the NP-hard Cluster Editing problem, also known as Correlation Clustering: Given an undirected graph, the task is to insert or remove a minimum number of edges such that the resulting graph is a cluster graph, that is, a graph in which each connected component is a clique. Altogether 67 participants from 21 teams, 11 countries, and 3 continents submitted their implementations to the competition. In this report, we describe the setup of the challenge, the selection of benchmark instances, and the ranking of the participating teams. We also briefly discuss the approaches used in the submitted solvers.

Cite as

Leon Kellerhals, Tomohiro Koana, André Nichterlein, and Philipp Zschoche. The PACE 2021 Parameterized Algorithms and Computational Experiments Challenge: Cluster Editing. In 16th International Symposium on Parameterized and Exact Computation (IPEC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 214, pp. 26:1-26:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{kellerhals_et_al:LIPIcs.IPEC.2021.26,
  author =	{Kellerhals, Leon and Koana, Tomohiro and Nichterlein, Andr\'{e} and Zschoche, Philipp},
  title =	{{The PACE 2021 Parameterized Algorithms and Computational Experiments Challenge: Cluster Editing}},
  booktitle =	{16th International Symposium on Parameterized and Exact Computation (IPEC 2021)},
  pages =	{26:1--26:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-216-7},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{214},
  editor =	{Golovach, Petr A. and Zehavi, Meirav},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2021.26},
  URN =		{urn:nbn:de:0030-drops-154096},
  doi =		{10.4230/LIPIcs.IPEC.2021.26},
  annote =	{Keywords: Correlation Clustering, Cluster Editing, Algorithm Engineering, FPT, Kernelization, Heuristics}
}
Document
Track A: Algorithms, Complexity and Games
Using a Geometric Lens to Find k Disjoint Shortest Paths

Authors: Matthias Bentert, André Nichterlein, Malte Renken, and Philipp Zschoche

Published in: LIPIcs, Volume 198, 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)


Abstract
Given an undirected n-vertex graph and k pairs of terminal vertices (s_1,t_1), …, (s_k,t_k), the k-Disjoint Shortest Paths (k-DSP) problem asks whether there are k pairwise vertex-disjoint paths P_1, …, P_k such that P_i is a shortest s_i-t_i-path for each i ∈ [k]. Recently, Lochet [SODA '21] provided an algorithm that solves k-DSP in n^{O(k^{5^k})} time, answering a 20-year-old question about the computational complexity of k-DSP for constant k. On the one hand, we present an improved O(kn^{16k ⋅ k! + k + 1})-time algorithm based on a novel geometric view on this problem. For the special case k = 2, we show that the running time can be further reduced to O(nm) by small modifications of the algorithm and a further refined analysis. On the other hand, we show that k-DSP is W[1]-hard with respect to k, showing that the dependency of the degree of the polynomial running time on the parameter k is presumably unavoidable.
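
While finding the paths is the hard part addressed by the paper, verifying a candidate solution is straightforward and pins down the problem definition. A sketch for unweighted undirected graphs (adjacency as a dict of neighbour sets; illustrative names):

from collections import deque

def bfs_dist(adj, s):
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return dist

def is_dsp_solution(adj, pairs, paths):
    # Each P_i must be a shortest s_i-t_i path in G, and the paths
    # must be pairwise vertex-disjoint.
    seen = set()
    for (s, t), path in zip(pairs, paths):
        vs = set(path)
        if path[0] != s or path[-1] != t:
            return False
        if len(vs) != len(path) or seen & vs:
            return False                      # not vertex-disjoint
        if any(w not in adj[u] for u, w in zip(path, path[1:])):
            return False                      # not a path in G
        if len(path) - 1 != bfs_dist(adj, s).get(t, float('inf')):
            return False                      # not a shortest path
        seen |= vs
    return True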

Cite as

Matthias Bentert, André Nichterlein, Malte Renken, and Philipp Zschoche. Using a Geometric Lens to Find k Disjoint Shortest Paths. In 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 198, pp. 26:1-26:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{bentert_et_al:LIPIcs.ICALP.2021.26,
  author =	{Bentert, Matthias and Nichterlein, Andr\'{e} and Renken, Malte and Zschoche, Philipp},
  title =	{{Using a Geometric Lens to Find k Disjoint Shortest Paths}},
  booktitle =	{48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)},
  pages =	{26:1--26:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-195-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{198},
  editor =	{Bansal, Nikhil and Merelli, Emanuela and Worrell, James},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2021.26},
  URN =		{urn:nbn:de:0030-drops-140954},
  doi =		{10.4230/LIPIcs.ICALP.2021.26},
  annote =	{Keywords: graph algorithms, dynamic programming, W\lbrack1\rbrack-hardness, geometry}
}
Document
An Adaptive Version of Brandes' Algorithm for Betweenness Centrality

Authors: Matthias Bentert, Alexander Dittmann, Leon Kellerhals, André Nichterlein, and Rolf Niedermeier

Published in: LIPIcs, Volume 123, 29th International Symposium on Algorithms and Computation (ISAAC 2018)


Abstract
Betweenness centrality - measuring how many shortest paths pass through a vertex - is one of the most important network analysis concepts for assessing the relative importance of a vertex. The well-known algorithm of Brandes [2001] computes, on an n-vertex and m-edge graph, the betweenness centrality of all vertices in O(nm) worst-case time. In follow-up work, significant empirical speedups were achieved by preprocessing degree-one vertices and by graph partitioning based on cut vertices. We further contribute an algorithmic treatment of degree-two vertices, which turns out to be much richer in mathematical structure than the case of degree-one vertices. Based on these three algorithmic ingredients, we provide a strengthened worst-case running time analysis for betweenness centrality algorithms. More specifically, we prove an adaptive running time bound O(kn), where k < m is the size of a minimum feedback edge set of the input graph.
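
For reference, a compact sketch of Brandes' algorithm for unweighted graphs, the O(nm)-time baseline that the preprocessing described above accelerates (one BFS per source, followed by back-propagation of pair dependencies; for undirected graphs, halve the final values):

from collections import deque

def betweenness(adj):
    bc = {v: 0.0 for v in adj}
    for s in adj:
        dist = {s: 0}
        sigma = {v: 0 for v in adj}
        sigma[s] = 1
        preds = {v: [] for v in adj}
        order = []
        queue = deque([s])
        while queue:                      # BFS counting shortest paths
            u = queue.popleft()
            order.append(u)
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
                if dist[w] == dist[u] + 1:
                    sigma[w] += sigma[u]
                    preds[w].append(u)
        delta = {v: 0.0 for v in adj}
        for w in reversed(order):         # dependency accumulation
            for u in preds[w]:
                delta[u] += sigma[u] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc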

Cite as

Matthias Bentert, Alexander Dittmann, Leon Kellerhals, André Nichterlein, and Rolf Niedermeier. An Adaptive Version of Brandes' Algorithm for Betweenness Centrality. In 29th International Symposium on Algorithms and Computation (ISAAC 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 123, pp. 36:1-36:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{bentert_et_al:LIPIcs.ISAAC.2018.36,
  author =	{Bentert, Matthias and Dittmann, Alexander and Kellerhals, Leon and Nichterlein, Andr\'{e} and Niedermeier, Rolf},
  title =	{{An Adaptive Version of Brandes' Algorithm for Betweenness Centrality}},
  booktitle =	{29th International Symposium on Algorithms and Computation (ISAAC 2018)},
  pages =	{36:1--36:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-094-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{123},
  editor =	{Hsu, Wen-Lian and Lee, Der-Tsai and Liao, Chung-Shou},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2018.36},
  URN =		{urn:nbn:de:0030-drops-99846},
  doi =		{10.4230/LIPIcs.ISAAC.2018.36},
  annote =	{Keywords: network science, social network analysis, centrality measures, shortest paths, tree-like graphs, efficient pre- and postprocessing, FPT in P}
}
Document
Parameterized Dynamic Cluster Editing

Authors: Junjie Luo, Hendrik Molter, André Nichterlein, and Rolf Niedermeier

Published in: LIPIcs, Volume 122, 38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018)


Abstract
We introduce a dynamic version of the NP-hard Cluster Editing problem. The essential point here is to take into account dynamically evolving input graphs: Having a cluster graph (that is, a disjoint union of cliques) that represents a solution for a first input graph, can we cost-efficiently transform it into a "similar" cluster graph that is a solution for a second ("subsequent") input graph? This model is motivated by several application scenarios, including incremental clustering, the search for compromise clusterings, and local search in graph-based data clustering. We thoroughly study six problem variants (edge editing, edge deletion, edge insertion; each combined with two distance measures between cluster graphs). We obtain both fixed-parameter tractability and parameterized hardness results, thus (except for two open questions) providing a fairly complete picture of the parameterized computational complexity landscape under the perhaps two most natural parameterizations: the distance of the new "similar" cluster graph to (i) the second input graph and to (ii) the input cluster graph.
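
One natural way to measure the distance between two cluster graphs, given as partitions, is the size of the symmetric difference of their edge sets; note this is an illustrative choice and not necessarily one of the paper's two measures. A sketch:

from itertools import combinations

def cluster_graph_distance(clustering1, clustering2):
    # Number of vertex pairs that are co-clustered in exactly one of
    # the two partitions (edge symmetric difference of the two
    # induced cluster graphs).
    label1 = {v: i for i, c in enumerate(clustering1) for v in c}
    label2 = {v: i for i, c in enumerate(clustering2) for v in c}
    return sum(1 for u, v in combinations(sorted(label1), 2)
               if (label1[u] == label1[v]) != (label2[u] == label2[v]))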

Cite as

Junjie Luo, Hendrik Molter, André Nichterlein, and Rolf Niedermeier. Parameterized Dynamic Cluster Editing. In 38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 122, pp. 46:1-46:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{luo_et_al:LIPIcs.FSTTCS.2018.46,
  author =	{Luo, Junjie and Molter, Hendrik and Nichterlein, Andr\'{e} and Niedermeier, Rolf},
  title =	{{Parameterized Dynamic Cluster Editing}},
  booktitle =	{38th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2018)},
  pages =	{46:1--46:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-093-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{122},
  editor =	{Ganguly, Sumit and Pandya, Paritosh},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2018.46},
  URN =		{urn:nbn:de:0030-drops-99450},
  doi =		{10.4230/LIPIcs.FSTTCS.2018.46},
  annote =	{Keywords: graph-based data clustering, goal-oriented clustering, compromise clustering, NP-hard problems, fixed-parameter tractability, parameterized hardness}
}
Document
Data Reduction for Maximum Matching on Real-World Graphs: Theory and Experiments

Authors: Viatcheslav Korenwein, André Nichterlein, Rolf Niedermeier, and Philipp Zschoche

Published in: LIPIcs, Volume 112, 26th Annual European Symposium on Algorithms (ESA 2018)


Abstract
Finding a maximum-cardinality or maximum-weight matching in (edge-weighted) undirected graphs is among the most prominent problems of algorithmic graph theory. For n-vertex and m-edge graphs, the best known algorithms run in Õ(m√n) time. We build on recent theoretical work focusing on linear-time data reduction rules for finding maximum-cardinality matchings and complement the theoretical results by presenting and analyzing (employing the kernelization methodology of parameterized complexity analysis) linear-time data reduction rules for the positive-integer-weighted case. Moreover, we experimentally demonstrate that these data reduction rules provide significant speedups of the state-of-the-art implementation for computing matchings in real-world graphs: the average speedup is 3800% in the unweighted case and "just" 30% in the weighted case.
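
One of the simplest rules in this line of work is the folklore degree-one rule for the unweighted case: a degree-one vertex can always be matched to its sole neighbour in some maximum matching. A sketch (the papers' rule sets are considerably richer, e.g., they also handle weights):

def degree_one_matching_rule(adj):
    # adj: dict mapping each vertex to the set of its neighbours.
    adj = {v: set(ns) for v, ns in adj.items()}
    matched = []
    stack = [v for v in adj if len(adj[v]) == 1]
    while stack:
        v = stack.pop()
        if v not in adj or len(adj[v]) != 1:
            continue
        (u,) = adj[v]
        matched.append((v, u))            # match v with u, delete both
        for w in adj.pop(u):
            if w != v:
                adj[w].discard(u)
                if len(adj[w]) == 1:      # new degree-one vertices
                    stack.append(w)
        adj.pop(v)
    return adj, matched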

Cite as

Viatcheslav Korenwein, André Nichterlein, Rolf Niedermeier, and Philipp Zschoche. Data Reduction for Maximum Matching on Real-World Graphs: Theory and Experiments. In 26th Annual European Symposium on Algorithms (ESA 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 112, pp. 53:1-53:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{korenwein_et_al:LIPIcs.ESA.2018.53,
  author =	{Korenwein, Viatcheslav and Nichterlein, Andr\'{e} and Niedermeier, Rolf and Zschoche, Philipp},
  title =	{{Data Reduction for Maximum Matching on Real-World Graphs: Theory and Experiments}},
  booktitle =	{26th Annual European Symposium on Algorithms (ESA 2018)},
  pages =	{53:1--53:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-081-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{112},
  editor =	{Azar, Yossi and Bast, Hannah and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2018.53},
  URN =		{urn:nbn:de:0030-drops-95169},
  doi =		{10.4230/LIPIcs.ESA.2018.53},
  annote =	{Keywords: Maximum-cardinality matching, maximum-weight matching, linear-time algorithms, preprocessing, kernelization, parameterized complexity analysis}
}
Document
The Power of Linear-Time Data Reduction for Maximum Matching

Authors: George B. Mertzios, André Nichterlein, and Rolf Niedermeier

Published in: LIPIcs, Volume 83, 42nd International Symposium on Mathematical Foundations of Computer Science (MFCS 2017)


Abstract
Finding maximum-cardinality matchings in undirected graphs is arguably one of the most central graph primitives. For m-edge and n-vertex graphs, it is well-known to be solvable in O(m√n) time; however, for several applications this running time is still too slow. We investigate how linear-time (and almost linear-time) data reduction (used as preprocessing) can alleviate the situation. More specifically, we focus on linear-time kernelization. We initiate a deeper and more systematic study both for general graphs and for bipartite graphs. Our data reduction algorithms easily comply (in the form of preprocessing) with every solution strategy (exact, approximate, heuristic), thus making them attractive in various settings.

Cite as

George B. Mertzios, André Nichterlein, and Rolf Niedermeier. The Power of Linear-Time Data Reduction for Maximum Matching. In 42nd International Symposium on Mathematical Foundations of Computer Science (MFCS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 83, pp. 46:1-46:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{mertzios_et_al:LIPIcs.MFCS.2017.46,
  author =	{Mertzios, George B. and Nichterlein, Andr\'{e} and Niedermeier, Rolf},
  title =	{{The Power of Linear-Time Data Reduction for Maximum Matching}},
  booktitle =	{42nd International Symposium on Mathematical Foundations of Computer Science (MFCS 2017)},
  pages =	{46:1--46:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-046-0},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{83},
  editor =	{Larsen, Kim G. and Bodlaender, Hans L. and Raskin, Jean-Francois},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2017.46},
  URN =		{urn:nbn:de:0030-drops-81166},
  doi =		{10.4230/LIPIcs.MFCS.2017.46},
  annote =	{Keywords: Maximum-cardinality matching, bipartite graphs, linear-time algorithms, kernelization, parameterized complexity analysis, FPT in P}
}
Document
A Parameterized Algorithmics Framework for Degree Sequence Completion Problems in Directed Graphs

Authors: Robert Bredereck, Vincent Froese, Marcel Koseler, Marcelo Garlet Millani, André Nichterlein, and Rolf Niedermeier

Published in: LIPIcs, Volume 63, 11th International Symposium on Parameterized and Exact Computation (IPEC 2016)


Abstract
There has been intensive work on the parameterized complexity of the typically NP-hard task to edit undirected graphs into graphs fulfilling certain given vertex degree constraints. In this work, we lift the investigations to the case of directed graphs; herein, we focus on arc insertions. To this end, our general two-stage framework consists of efficiently solving a problem-specific number problem and transferring its solution to a solution for the graph problem by applying flow computations. In this way, we obtain fixed-parameter tractability and polynomial kernelizability results, with the central parameter being the maximum vertex in- or outdegree of the output digraph. Although there are certain similarities with the much better studied undirected case, the flow computation used in the directed case seems not to work for the undirected case, while f-factor computations as used in the undirected case seem not to work for the directed case.
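
To illustrate the flavour of the flow computations in the second stage (an assumption based on the classical bipartite-flow construction for digraph realization, not necessarily the paper's exact network), the following sketch decides whether a simple digraph with given out- and in-degree sequences exists:

import networkx as nx

def digraph_sequence_realizable(outdeg, indeg):
    if sum(outdeg) != sum(indeg):
        return False
    n = len(outdeg)
    F = nx.DiGraph()
    for v in range(n):
        F.add_edge('s', ('out', v), capacity=outdeg[v])
        F.add_edge(('in', v), 't', capacity=indeg[v])
        for w in range(n):
            if v != w:                    # no self-loops
                F.add_edge(('out', v), ('in', w), capacity=1)
    # Realizable iff a maximum flow saturates all out-degree demands.
    return nx.maximum_flow_value(F, 's', 't') == sum(outdeg)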

Cite as

Robert Bredereck, Vincent Froese, Marcel Koseler, Marcelo Garlet Millani, André Nichterlein, and Rolf Niedermeier. A Parameterized Algorithmics Framework for Degree Sequence Completion Problems in Directed Graphs. In 11th International Symposium on Parameterized and Exact Computation (IPEC 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 63, pp. 10:1-10:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{bredereck_et_al:LIPIcs.IPEC.2016.10,
  author =	{Bredereck, Robert and Froese, Vincent and Koseler, Marcel and Millani, Marcelo Garlet and Nichterlein, Andr\'{e} and Niedermeier, Rolf},
  title =	{{A Parameterized Algorithmics Framework for Degree Sequence Completion Problems in Directed Graphs}},
  booktitle =	{11th International Symposium on Parameterized and Exact Computation (IPEC 2016)},
  pages =	{10:1--10:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-023-1},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{63},
  editor =	{Guo, Jiong and Hermelin, Danny},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2016.10},
  URN =		{urn:nbn:de:0030-drops-69353},
  doi =		{10.4230/LIPIcs.IPEC.2016.10},
  annote =	{Keywords: NP-hard graph problem, graph realizability, graph modification, arc insertion, fixed-parameter tractability, kernelization}
}
Document
Fractals for Kernelization Lower Bounds, With an Application to Length-Bounded Cut Problems

Authors: Till Fluschnik, Danny Hermelin, André Nichterlein, and Rolf Niedermeier

Published in: LIPIcs, Volume 55, 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)


Abstract
Bodlaender et al.'s [Bodlaender/Jansen/Kratsch, 2014] cross-composition technique is a popular method for excluding polynomial-size problem kernels for NP-hard parameterized problems. We present a new technique exploiting triangle-based fractal structures for extending the range of applicability of cross-compositions. Our technique makes it possible to prove new no-polynomial-kernel results for a number of problems dealing with length-bounded cuts. Roughly speaking, our new technique combines the advantages of serial and parallel composition. In particular, answering an open question of Golovach and Thilikos [Golovach/Thilikos, 2011], we show that, unless NP ⊆ coNP/poly, the NP-hard Length-Bounded Edge-Cut problem (delete at most k edges such that the resulting graph has no s-t path of length shorter than l) parameterized by the combination of k and l has no polynomial-size problem kernel. Our framework applies to planar as well as directed variants of the basic problems and to both edge and vertex deletion problems.
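
For intuition about the problem itself (the kernelization lower bound is the paper's point; the sketch below merely states the decision problem), Length-Bounded Edge-Cut can be brute-forced on tiny instances:

from collections import deque
from itertools import combinations

def st_distance(adj, s, t, deleted):
    # BFS distance from s to t, ignoring the deleted edges
    # (given as a set of frozensets).
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            return dist[u]
        for w in adj[u]:
            if frozenset((u, w)) not in deleted and w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return float('inf')

def length_bounded_edge_cut(adj, edges, s, t, k, l):
    # Is there a set of at most k edges whose deletion leaves no
    # s-t path of length shorter than l?
    for r in range(k + 1):
        for removed in combinations(edges, r):
            if st_distance(adj, s, t, {frozenset(e) for e in removed}) >= l:
                return True
    return False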

Cite as

Till Fluschnik, Danny Hermelin, André Nichterlein, and Rolf Niedermeier. Fractals for Kernelization Lower Bounds, With an Application to Length-Bounded Cut Problems. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 55, pp. 25:1-25:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{fluschnik_et_al:LIPIcs.ICALP.2016.25,
  author =	{Fluschnik, Till and Hermelin, Danny and Nichterlein, Andr\'{e} and Niedermeier, Rolf},
  title =	{{Fractals for Kernelization Lower Bounds, With an Application to Length-Bounded Cut Problems}},
  booktitle =	{43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)},
  pages =	{25:1--25:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-013-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{55},
  editor =	{Chatzigiannakis, Ioannis and Mitzenmacher, Michael and Rabani, Yuval and Sangiorgi, Davide},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2016.25},
  URN =		{urn:nbn:de:0030-drops-63049},
  doi =		{10.4230/LIPIcs.ICALP.2016.25},
  annote =	{Keywords: Parameterized complexity, polynomial-time data reduction, cross-compositions, lower bounds, graph modification problems, interdiction problems}
}