23 Search Results for "Bart, Hans-Jörg"


Document
Single-Exponential FPT Algorithms for Enumerating Secluded ℱ-Free Subgraphs and Deleting to Scattered Graph Classes

Authors: Bart M. P. Jansen, Jari J. H. de Kroon, and Michał Włodarczyk

Published in: LIPIcs, Volume 283, 34th International Symposium on Algorithms and Computation (ISAAC 2023)


Abstract
The celebrated notion of important separators bounds the number of small (S,T)-separators in a graph which are "farthest from S" in a technical sense. In this paper, we introduce a generalization of this powerful algorithmic primitive, tailored to undirected graphs, that is phrased in terms of k-secluded vertex sets: sets with an open neighborhood of size at most k. In this terminology, the bound on important separators says that there are at most 4^k maximal k-secluded connected vertex sets C containing S but disjoint from T. We generalize this statement significantly: even when we demand that G[C] avoids a finite set ℱ of forbidden induced subgraphs, the number of such maximal subgraphs is 2^𝒪(k) and they can be enumerated efficiently. This enumeration algorithm allows us to make significant improvements for two problems from the literature. Our first application concerns the Connected k-Secluded ℱ-free subgraph problem, where ℱ is a finite set of forbidden induced subgraphs. Given a graph in which each vertex has a positive integer weight, the problem asks to find a maximum-weight connected k-secluded vertex set C ⊆ V(G) such that G[C] does not contain an induced subgraph isomorphic to any F ∈ ℱ. The parameterization by k is known to be solvable in triple-exponential time via the technique of recursive understanding, which we improve to single-exponential. Our second application concerns the deletion problem to scattered graph classes. A scattered graph class is defined by demanding that every connected component is contained in at least one of the prescribed graph classes Π_1, …, Π_d. The deletion problem to a scattered graph class is to find a vertex set of size at most k whose removal yields a graph from the class. We obtain a single-exponential algorithm whenever each class Π_i is characterized by a finite number of forbidden induced subgraphs. This generalizes and improves upon earlier results in the literature.
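
To make the central definition concrete, here is a small brute-force sketch (not the paper's single-exponential enumeration algorithm) that checks whether a given vertex set C in a graph G is k-secluded and whether G[C] is ℱ-free. The use of networkx and the helper names are illustrative assumptions.

from itertools import combinations
import networkx as nx

def open_neighborhood(G, C):
    """Vertices outside C that have at least one neighbor inside C."""
    C = set(C)
    return {u for v in C for u in G.neighbors(v)} - C

def is_k_secluded_F_free(G, C, k, forbidden):
    """Check the two properties from the abstract for a candidate set C."""
    C = set(C)
    if len(open_neighborhood(G, C)) > k:       # seclusion: |N(C)| <= k
        return False
    H = G.subgraph(C)                          # induced subgraph G[C]
    for F in forbidden:                        # forbidden induced subgraphs
        for nodes in combinations(H.nodes, F.number_of_nodes()):
            if nx.is_isomorphic(H.subgraph(nodes), F):
                return False
    return True

# In a path on 5 vertices, C = {1, 2, 3} has open neighborhood {0, 4} and contains no triangle.
G = nx.path_graph(5)
print(is_k_secluded_F_free(G, {1, 2, 3}, k=2, forbidden=[nx.complete_graph(3)]))  # True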

Cite as

Bart M. P. Jansen, Jari J. H. de Kroon, and Michał Włodarczyk. Single-Exponential FPT Algorithms for Enumerating Secluded ℱ-Free Subgraphs and Deleting to Scattered Graph Classes. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 42:1-42:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)



@InProceedings{jansen_et_al:LIPIcs.ISAAC.2023.42,
  author =	{Jansen, Bart M. P. and de Kroon, Jari J. H. and W{\l}odarczyk, Micha{\l}},
  title =	{{Single-Exponential FPT Algorithms for Enumerating Secluded ℱ-Free Subgraphs and Deleting to Scattered Graph Classes}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{42:1--42:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.42},
  URN =		{urn:nbn:de:0030-drops-193440},
  doi =		{10.4230/LIPIcs.ISAAC.2023.42},
  annote =	{Keywords: fixed-parameter tractability, important separators, secluded subgraphs}
}
Document
5-Approximation for ℋ-Treewidth Essentially as Fast as ℋ-Deletion Parameterized by Solution Size

Authors: Bart M. P. Jansen, Jari J. H. de Kroon, and Michał Włodarczyk

Published in: LIPIcs, Volume 274, 31st Annual European Symposium on Algorithms (ESA 2023)


Abstract
The notion of ℋ-treewidth, where ℋ is a hereditary graph class, was recently introduced as a generalization of the treewidth of an undirected graph. Roughly speaking, a graph of ℋ-treewidth at most k can be decomposed into (arbitrarily large) ℋ-subgraphs which interact only through vertex sets of size 𝒪(k) which can be organized in a tree-like fashion. ℋ-treewidth can be used as a hybrid parameterization to develop fixed-parameter tractable algorithms for ℋ-deletion problems, which ask to find a minimum vertex set whose removal from a given graph G turns it into a member of ℋ. The bottleneck in the current parameterized algorithms lies in the computation of suitable tree ℋ-decompositions. We present FPT-approximation algorithms to compute tree ℋ-decompositions for hereditary and union-closed graph classes ℋ. Given a graph of ℋ-treewidth k, we can compute a 5-approximate tree ℋ-decomposition in time f(𝒪(k)) ⋅ n^𝒪(1) whenever ℋ-deletion parameterized by solution size can be solved in time f(k) ⋅ n^𝒪(1) for some function f(k) ≥ 2^k. The current-best algorithms either achieve an approximation factor of k^𝒪(1) or construct optimal decompositions while suffering from non-uniformity with unknown parameter dependence. Using these decompositions, we obtain algorithms solving Odd Cycle Transversal in time 2^𝒪(k) ⋅ n^𝒪(1) parameterized by bipartite-treewidth and Vertex Planarization in time 2^𝒪(k log k) ⋅ n^𝒪(1) parameterized by planar-treewidth, showing that these can be as fast as the solution-size parameterizations and giving the first ETH-tight algorithms for parameterizations by hybrid width measures.

Cite as

Bart M. P. Jansen, Jari J. H. de Kroon, and Michał Włodarczyk. 5-Approximation for ℋ-Treewidth Essentially as Fast as ℋ-Deletion Parameterized by Solution Size. In 31st Annual European Symposium on Algorithms (ESA 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 274, pp. 66:1-66:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)



@InProceedings{jansen_et_al:LIPIcs.ESA.2023.66,
  author =	{Jansen, Bart M. P. and de Kroon, Jari J. H. and W{\l}odarczyk, Micha{\l}},
  title =	{{5-Approximation for ℋ-Treewidth Essentially as Fast as ℋ-Deletion Parameterized by Solution Size}},
  booktitle =	{31st Annual European Symposium on Algorithms (ESA 2023)},
  pages =	{66:1--66:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-295-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{274},
  editor =	{G{\o}rtz, Inge Li and Farach-Colton, Martin and Puglisi, Simon J. and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2023.66},
  URN =		{urn:nbn:de:0030-drops-187195},
  doi =		{10.4230/LIPIcs.ESA.2023.66},
  annote =	{Keywords: fixed-parameter tractability, treewidth, graph decompositions}
}
Document
Search-Space Reduction via Essential Vertices

Authors: Benjamin Merlin Bumpus, Bart M. P. Jansen, and Jari J. H. de Kroon

Published in: LIPIcs, Volume 244, 30th Annual European Symposium on Algorithms (ESA 2022)


Abstract
We investigate preprocessing for vertex-subset problems on graphs. While the notion of kernelization, originating in parameterized complexity theory, is a formalization of provably effective preprocessing aimed at reducing the total instance size, our focus is on finding a non-empty vertex set that belongs to an optimal solution. This decreases the size of the remaining part of the solution which still has to be found, and therefore shrinks the search space of fixed-parameter tractable algorithms for parameterizations based on the solution size. We introduce the notion of a c-essential vertex as one that is contained in all c-approximate solutions. For several classic combinatorial problems such as Odd Cycle Transversal and Directed Feedback Vertex Set, we show that under mild conditions a polynomial-time preprocessing algorithm can find a subset of an optimal solution that contains all 2-essential vertices, by exploiting packing/covering duality. This leads to FPT algorithms to solve these problems where the exponential term in the running time depends only on the number of non-essential vertices in the solution.
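
As an illustration of the definition only (the paper's results concern Odd Cycle Transversal and Directed Feedback Vertex Set and use polynomial-time packing/covering arguments, not enumeration), the following exponential-time sketch computes the c-essential vertices for Vertex Cover on a tiny graph; networkx and the helper names are assumptions.

from itertools import combinations
import networkx as nx

def is_vertex_cover(G, S):
    S = set(S)
    return all(u in S or v in S for u, v in G.edges)

def c_essential_vertices(G, c):
    """Vertices contained in every vertex cover of size at most c times the optimum."""
    n = G.number_of_nodes()
    opt = next(size for size in range(n + 1)
               if any(is_vertex_cover(G, S) for S in combinations(G.nodes, size)))
    essential = set(G.nodes)
    for size in range(opt, int(c * opt) + 1):
        for S in combinations(G.nodes, size):
            if is_vertex_cover(G, S):
                essential &= set(S)
    return essential

# In the star K_{1,4}, the center lies in every 2-approximate vertex cover.
print(c_essential_vertices(nx.star_graph(4), c=2))  # {0}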

Cite as

Benjamin Merlin Bumpus, Bart M. P. Jansen, and Jari J. H. de Kroon. Search-Space Reduction via Essential Vertices. In 30th Annual European Symposium on Algorithms (ESA 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 244, pp. 30:1-30:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)



@InProceedings{bumpus_et_al:LIPIcs.ESA.2022.30,
  author =	{Bumpus, Benjamin Merlin and Jansen, Bart M. P. and de Kroon, Jari J. H.},
  title =	{{Search-Space Reduction via Essential Vertices}},
  booktitle =	{30th Annual European Symposium on Algorithms (ESA 2022)},
  pages =	{30:1--30:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-247-1},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{244},
  editor =	{Chechik, Shiri and Navarro, Gonzalo and Rotenberg, Eva and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2022.30},
  URN =		{urn:nbn:de:0030-drops-169687},
  doi =		{10.4230/LIPIcs.ESA.2022.30},
  annote =	{Keywords: fixed-parameter tractability, essential vertices, covering versus packing}
}
Document
On the Hardness of Compressing Weights

Authors: Bart M. P. Jansen, Shivesh K. Roy, and Michał Włodarczyk

Published in: LIPIcs, Volume 202, 46th International Symposium on Mathematical Foundations of Computer Science (MFCS 2021)


Abstract
We investigate computational problems involving large weights through the lens of kernelization, which is a framework of polynomial-time preprocessing aimed at compressing the instance size. Our main focus is the weighted Clique problem, where we are given an edge-weighted graph and the goal is to detect a clique of total weight equal to a prescribed value. We show that the weighted variant, parameterized by the number of vertices n, is significantly harder than the unweighted problem by presenting an 𝒪(n^{3 - ε}) lower bound on the size of the kernel, under the assumption that NP ⊈ coNP/poly. This lower bound is essentially tight: we show that we can reduce the problem to the case with weights bounded by 2^𝒪(n), which yields a randomized kernel of 𝒪(n³) bits. We generalize these results to the weighted d-Uniform Hyperclique problem, Subset Sum, and weighted variants of Boolean Constraint Satisfaction Problems (CSPs). We also study weighted minimization problems and show that weight compression is easier when we only want to preserve the collection of optimal solutions. Namely, we show that for node-weighted Vertex Cover on bipartite graphs it is possible to maintain the set of optimal solutions using integer weights from the range [1, n], but if we want to maintain the ordering of the weights of all inclusion-minimal solutions, then weights as large as 2^Ω(n) are necessary.
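
For intuition, a standard counting argument (not spelled out in the abstract itself): once every weight is bounded by 2^𝒪(n), each of the at most n(n-1)/2 edge weights fits in 𝒪(n) bits, so the whole instance can be encoded in 𝒪(n²) ⋅ 𝒪(n) = 𝒪(n³) bits, matching the kernel size stated above.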

Cite as

Bart M. P. Jansen, Shivesh K. Roy, and Michał Włodarczyk. On the Hardness of Compressing Weights. In 46th International Symposium on Mathematical Foundations of Computer Science (MFCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 202, pp. 64:1-64:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)



@InProceedings{jansen_et_al:LIPIcs.MFCS.2021.64,
  author =	{Jansen, Bart M. P. and Roy, Shivesh K. and W{\l}odarczyk, Micha{\l}},
  title =	{{On the Hardness of Compressing Weights}},
  booktitle =	{46th International Symposium on Mathematical Foundations of Computer Science (MFCS 2021)},
  pages =	{64:1--64:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-201-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{202},
  editor =	{Bonchi, Filippo and Puglisi, Simon J.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2021.64},
  URN =		{urn:nbn:de:0030-drops-145049},
  doi =		{10.4230/LIPIcs.MFCS.2021.64},
  annote =	{Keywords: kernelization, compression, edge-weighted clique, constraint satisfaction problems}
}
Document
Preprocessing Vertex-Deletion Problems: Characterizing Graph Properties by Low-Rank Adjacencies

Authors: Bart M. P. Jansen and Jari J. H. de Kroon

Published in: LIPIcs, Volume 162, 17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020)


Abstract
We consider the Π-free Deletion problem parameterized by the size of a vertex cover, for a range of graph properties Π. Given an input graph G, this problem asks whether there is a subset of at most k vertices whose removal ensures the resulting graph does not contain a graph from Π as induced subgraph. Many vertex-deletion problems such as Perfect Deletion, Wheel-free Deletion, and Interval Deletion fit into this framework. We introduce the concept of characterizing a graph property Π by low-rank adjacencies, and use it as the cornerstone of a general kernelization theorem for Π-Free Deletion parameterized by the size of a vertex cover. The resulting framework captures problems such as AT-Free Deletion, Wheel-free Deletion, and Interval Deletion. Moreover, our new framework shows that the vertex-deletion problem to perfect graphs has a polynomial kernel when parameterized by vertex cover, thereby resolving an open question by Fomin et al. [JCSS 2014]. Our main technical contribution shows how linear-algebraic dependence of suitably defined vectors over 𝔽₂ implies graph-theoretic statements about the presence of forbidden induced subgraphs.

Cite as

Bart M. P. Jansen and Jari J. H. de Kroon. Preprocessing Vertex-Deletion Problems: Characterizing Graph Properties by Low-Rank Adjacencies. In 17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 162, pp. 27:1-27:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)



@InProceedings{jansen_et_al:LIPIcs.SWAT.2020.27,
  author =	{Jansen, Bart M. P. and de Kroon, Jari J. H.},
  title =	{{Preprocessing Vertex-Deletion Problems: Characterizing Graph Properties by Low-Rank Adjacencies}},
  booktitle =	{17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020)},
  pages =	{27:1--27:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-150-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{162},
  editor =	{Albers, Susanne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SWAT.2020.27},
  URN =		{urn:nbn:de:0030-drops-122748},
  doi =		{10.4230/LIPIcs.SWAT.2020.27},
  annote =	{Keywords: kernelization, vertex-deletion, graph modification, structural parameterization}
}
Document
Executable First-Order Queries in the Logic of Information Flows

Authors: Heba Aamer, Bart Bogaerts, Dimitri Surinx, Eugenia Ternovska, and Jan Van den Bussche

Published in: LIPIcs, Volume 155, 23rd International Conference on Database Theory (ICDT 2020)


Abstract
The logic of information flows (LIF) has recently been proposed as a general framework in the field of knowledge representation. In this framework, tasks of a procedural nature can still be modeled in a declarative, logic-based fashion. In this paper, we focus on the task of query processing under limited access patterns, a well-studied problem in the database literature. We show that LIF is well-suited for modeling this task. Toward this goal, we introduce a variant of LIF called "forward" LIF, in a first-order setting. We define FLIF^io, a syntactical fragment of forward LIF, and show that it corresponds exactly to the "executable" fragment of first-order logic defined by Nash and Ludäscher. The definition of FLIF^io involves a classification of the free variables of an expression into "input" and "output" variables. Our result hinges on inertia and determinacy laws for forward LIF expressions, which are interesting in their own right. These laws are formulated in terms of the input and output variables.

Cite as

Heba Aamer, Bart Bogaerts, Dimitri Surinx, Eugenia Ternovska, and Jan Van den Bussche. Executable First-Order Queries in the Logic of Information Flows. In 23rd International Conference on Database Theory (ICDT 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 155, pp. 4:1-4:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)



@InProceedings{aamer_et_al:LIPIcs.ICDT.2020.4,
  author =	{Aamer, Heba and Bogaerts, Bart and Surinx, Dimitri and Ternovska, Eugenia and Van den Bussche, Jan},
  title =	{{Executable First-Order Queries in the Logic of Information Flows}},
  booktitle =	{23rd International Conference on Database Theory (ICDT 2020)},
  pages =	{4:1--4:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-139-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{155},
  editor =	{Lutz, Carsten and Jung, Jean Christoph},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICDT.2020.4},
  URN =		{urn:nbn:de:0030-drops-119284},
  doi =		{10.4230/LIPIcs.ICDT.2020.4},
  annote =	{Keywords: Logic of Information Flows, Limited access pattern, Executable first-order logic}
}
Document
Metric Dimension Parameterized by Treewidth

Authors: Édouard Bonnet and Nidhi Purohit

Published in: LIPIcs, Volume 148, 14th International Symposium on Parameterized and Exact Computation (IPEC 2019)


Abstract
A resolving set S of a graph G is a subset of its vertices such that no two vertices of G have the same distance vector to S. The Metric Dimension problem asks for a resolving set of minimum size, and in its decision form, a resolving set of size at most some specified integer. This problem is NP-complete, and remains so in very restricted classes of graphs. It is also W[2]-complete with respect to the size of the solution. Metric Dimension has proven elusive on graphs of bounded treewidth. On the algorithmic side, a polytime algorithm is known for trees, and even for outerplanar graphs, but the general case of treewidth at most two is open. On the complexity side, no parameterized hardness is known. This has led several papers on the topic to ask for the parameterized complexity of Metric Dimension with respect to treewidth. We provide a first answer to the question. We show that Metric Dimension parameterized by the treewidth of the input graph is W[1]-hard. More refinedly we prove that, unless the Exponential Time Hypothesis fails, there is no algorithm solving Metric Dimension in time f(pw)n^{o(pw)} on n-vertex graphs of constant degree, with pw the pathwidth of the input graph, and f any computable function. This is in stark contrast with an FPT algorithm of Belmonte et al. [SIAM J. Discrete Math. '17] with respect to the combined parameter tl+Delta, where tl is the tree-length and Delta the maximum-degree of the input graph.
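
A brute-force sketch of the definitions in the first two sentences (unrelated to the hardness construction in the paper): distance vectors are computed with shortest paths and a smallest resolving set is found by exhaustive search. networkx is assumed, and the code is meant only for tiny connected graphs.

from itertools import combinations
import networkx as nx

def is_resolving(G, S):
    """No two vertices of G share the same vector of distances to S (connected G assumed)."""
    dist = dict(nx.all_pairs_shortest_path_length(G))
    vectors = {v: tuple(dist[v][s] for s in S) for v in G.nodes}
    return len(set(vectors.values())) == G.number_of_nodes()

def metric_dimension(G):
    """Size of a smallest resolving set, together with one witness."""
    for size in range(1, G.number_of_nodes() + 1):
        for S in combinations(G.nodes, size):
            if is_resolving(G, S):
                return size, S
    return 0, ()

# A path is resolved by a single endpoint; a cycle needs two landmarks.
print(metric_dimension(nx.path_graph(5)))   # (1, (0,))
print(metric_dimension(nx.cycle_graph(5)))  # (2, (0, 1))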

Cite as

Édouard Bonnet and Nidhi Purohit. Metric Dimension Parameterized by Treewidth. In 14th International Symposium on Parameterized and Exact Computation (IPEC 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 148, pp. 5:1-5:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)



@InProceedings{bonnet_et_al:LIPIcs.IPEC.2019.5,
  author =	{Bonnet, \'{E}douard and Purohit, Nidhi},
  title =	{{Metric Dimension Parameterized by Treewidth}},
  booktitle =	{14th International Symposium on Parameterized and Exact Computation (IPEC 2019)},
  pages =	{5:1--5:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-129-0},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{148},
  editor =	{Jansen, Bart M. P. and Telle, Jan Arne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2019.5},
  URN =		{urn:nbn:de:0030-drops-114668},
  doi =		{10.4230/LIPIcs.IPEC.2019.5},
  annote =	{Keywords: Metric Dimension, Treewidth, Parameterized Hardness}
}
Document
On the Fine-Grained Complexity of Least Weight Subsequence in Multitrees and Bounded Treewidth DAGs

Authors: Jiawei Gao

Published in: LIPIcs, Volume 148, 14th International Symposium on Parameterized and Exact Computation (IPEC 2019)


Abstract
This paper introduces a new technique that generalizes previously known fine-grained reductions from linear structures to graphs. Least Weight Subsequence (LWS) [Hirschberg and Larmore, 1987] is a class of highly sequential optimization problems with form F(j) = min_{i < j} [F(i) + c_{i,j}] . They can be solved in quadratic time using dynamic programming, but it is not known whether these problems can be solved faster than n^{2-o(1)} time. Surprisingly, each such problem is subquadratic time reducible to a highly parallel, non-dynamic programming problem [Marvin Künnemann et al., 2017]. In other words, if a "static" problem is faster than quadratic time, so is an LWS problem. For many instances of LWS, the sequential versions are equivalent to their static versions by subquadratic time reductions. The previous result applies to LWS on linear structures, and this paper extends this result to LWS on paths in sparse graphs, the Least Weight Subpath (LWSP) problems. When the graph is a multitree (i.e. a DAG where any pair of vertices can have at most one path) or when the graph is a DAG whose underlying undirected graph has constant treewidth, we show that LWSP on this graph is still subquadratically reducible to their corresponding static problems. For many instances, the graph versions are still equivalent to their static versions. Moreover, this paper shows that if we can decide a property of form Exists x Exists y P(x,y) in subquadratic time, where P is a quickly checkable property on a pair of elements, then on these classes of graphs, we can also in subquadratic time decide whether there exists a pair x,y in the transitive closure of the graph that also satisfy P(x,y).
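
The LWS recurrence quoted above translates directly into the quadratic-time dynamic program below. The cost function and the toy example are illustrative assumptions, and the sketch makes no attempt at the subquadratic reductions discussed in the paper.

def least_weight_subsequence(n, cost):
    """F[j] = min over i < j of F[i] + cost(i, j), with F[0] = 0; returns the table F[0..n]."""
    INF = float("inf")
    F = [0.0] + [INF] * n
    for j in range(1, n + 1):
        F[j] = min(F[i] + cost(i, j) for i in range(j))
    return F

# Toy cost: jumping from position i to j costs (j - i) ** 2, so the optimum steps one at a time.
print(least_weight_subsequence(5, lambda i, j: (j - i) ** 2))  # [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]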

Cite as

Jiawei Gao. On the Fine-Grained Complexity of Least Weight Subsequence in Multitrees and Bounded Treewidth DAGs. In 14th International Symposium on Parameterized and Exact Computation (IPEC 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 148, pp. 16:1-16:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)



@InProceedings{gao:LIPIcs.IPEC.2019.16,
  author =	{Gao, Jiawei},
  title =	{{On the Fine-Grained Complexity of Least Weight Subsequence in Multitrees and Bounded Treewidth DAGs}},
  booktitle =	{14th International Symposium on Parameterized and Exact Computation (IPEC 2019)},
  pages =	{16:1--16:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-129-0},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{148},
  editor =	{Jansen, Bart M. P. and Telle, Jan Arne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2019.16},
  URN =		{urn:nbn:de:0030-drops-114778},
  doi =		{10.4230/LIPIcs.IPEC.2019.16},
  annote =	{Keywords: fine-grained complexity, dynamic programming, graph reachability}
}
Document
A Formal Semantics of Influence in Bayesian Reasoning

Authors: Bart Jacobs and Fabio Zanasi

Published in: LIPIcs, Volume 83, 42nd International Symposium on Mathematical Foundations of Computer Science (MFCS 2017)


Abstract
This paper proposes a formal definition of influence in Bayesian reasoning, based on the notions of state (as probability distribution), predicate, validity and conditioning. Our approach highlights how conditioning a joint entwined/entangled state with a predicate on one of its components has 'crossover' influence on the other components. We use the total variation metric on probability distributions to quantitatively measure such influence. These insights are applied to give a rigorous explanation of the fundamental concept of d-separation in Bayesian networks.
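
A toy numeric sketch of the ingredients named in the abstract, under assumed helper names: a joint state represented as a dictionary of probabilities, conditioning on a predicate over the first component, and the total variation distance between the prior and posterior marginals of the second component as a measure of "crossover" influence.

def marginal_y(joint):
    """Marginal of the second component of a joint distribution {(x, y): p}."""
    out = {}
    for (x, y), p in joint.items():
        out[y] = out.get(y, 0.0) + p
    return out

def condition_on_x(joint, predicate):
    """Condition the joint state on a predicate over the first component."""
    mass = sum(p for (x, y), p in joint.items() if predicate(x))
    return {(x, y): p / mass for (x, y), p in joint.items() if predicate(x)}

def total_variation(p, q):
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

# An entwined joint state over (x, y): observing x = 1 shifts the distribution of y.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
prior = marginal_y(joint)
posterior = marginal_y(condition_on_x(joint, lambda x: x == 1))
print(total_variation(prior, posterior))  # ~0.3 for this example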

Cite as

Bart Jacobs and Fabio Zanasi. A Formal Semantics of Influence in Bayesian Reasoning. In 42nd International Symposium on Mathematical Foundations of Computer Science (MFCS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 83, pp. 21:1-21:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)



@InProceedings{jacobs_et_al:LIPIcs.MFCS.2017.21,
  author =	{Jacobs, Bart and Zanasi, Fabio},
  title =	{{A Formal Semantics of Influence in Bayesian Reasoning}},
  booktitle =	{42nd International Symposium on Mathematical Foundations of Computer Science (MFCS 2017)},
  pages =	{21:1--21:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-046-0},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{83},
  editor =	{Larsen, Kim G. and Bodlaender, Hans L. and Raskin, Jean-Francois},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2017.21},
  URN =		{urn:nbn:de:0030-drops-80896},
  doi =		{10.4230/LIPIcs.MFCS.2017.21},
  annote =	{Keywords: probability distribution, Bayesian network, influence}
}
Document
Lower Bounds for Protrusion Replacement by Counting Equivalence Classes

Authors: Bart M. P. Jansen and Jules J. H. M. Wulms

Published in: LIPIcs, Volume 63, 11th International Symposium on Parameterized and Exact Computation (IPEC 2016)


Abstract
Garnero et al. [SIAM J. Discrete Math. 2015, 29(4):1864-1894] recently introduced a framework based on dynamic programming to make applications of the protrusion replacement technique constructive and to obtain explicit upper bounds on the involved constants. They show that for several graph problems, for every boundary size t one can find an explicit set R_t of representatives. Any subgraph H with a boundary of size t can be replaced with a representative H' in R_t such that the effect of this replacement on the optimum can be deduced from H and H' alone. Their upper bounds on the size of the graphs in R_t grow triple-exponentially with t. In this paper we complement their results by lower bounds on the sizes of representatives, in terms of the boundary size t. For example, we show that each set of planar representatives R_t for the Independent Set problem contains a graph with Omega(2^t / sqrt{4t}) vertices. This lower bound even holds for sets that only represent the planar subgraphs of bounded pathwidth. To obtain our results we provide a lower bound on the number of equivalence classes of the canonical equivalence relation for Independent Set on t-boundaried graphs. We also find an elegant characterization of the number of equivalence classes in general graphs, in terms of the number of monotone functions of a certain kind. Our results show that the number of equivalence classes is at most 2^{2^t}, improving on earlier bounds of the form (t+1)^{2^t}.

Cite as

Bart M. P. Jansen and Jules J. H. M. Wulms. Lower Bounds for Protrusion Replacement by Counting Equivalence Classes. In 11th International Symposium on Parameterized and Exact Computation (IPEC 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 63, pp. 17:1-17:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)



@InProceedings{jansen_et_al:LIPIcs.IPEC.2016.17,
  author =	{Jansen, Bart M. P. and Wulms, Jules J. H. M.},
  title =	{{Lower Bounds for Protrusion Replacement by Counting Equivalence Classes}},
  booktitle =	{11th International Symposium on Parameterized and Exact Computation (IPEC 2016)},
  pages =	{17:1--17:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-023-1},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{63},
  editor =	{Guo, Jiong and Hermelin, Danny},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2016.17},
  URN =		{urn:nbn:de:0030-drops-69275},
  doi =		{10.4230/LIPIcs.IPEC.2016.17},
  annote =	{Keywords: protrusions, boundaried graphs, independent set, equivalence classes, finite integer index}
}
Document
CFD Simulation of Liquid-Liquid Extraction Columns and Visualization of Eulerian Datasets

Authors: Mark W. Hlawitschka, Fang Chen, Hans-Jörg Bart, and Bernd Hamann

Published in: OASIcs, Volume 27, Visualization of Large and Unstructured Data Sets: Applications in Geospatial Planning, Modeling and Engineering - Proceedings of IRTG 1131 Workshop 2011


Abstract
In this joint work, a complete framework for modeling, simulating and visualizing multiphase fluid flow within an extraction column is presented. We first present a volume-of-fluid simulation, which is able to predict the surface of the droplets during coalescence. However, a fast and efficient model is needed for the simulation of a liquid-liquid extraction column due to the high number of occurring droplets. To simulate the velocity and droplet size in a DN32 extraction column, a coupled computational fluid dynamic-population balance model solver is used. The simulation is analyzed using path-line based visualization techniques. A novel semi-automatic re-seeding technique for droplet path-line integration is proposed. With our technique, path-lines of fluid droplets can be re-initialized after contact with the stirring devices. The droplet breakage is captured, allowing the engineer to improve the design of liquid-liquid column layouts.
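
For readers unfamiliar with the visualization primitive, here is a generic forward-Euler path-line integrator through a velocity field. It is a simplified stand-in (numpy assumed, toy rotational field) and does not reproduce the re-seeding logic around the stirring devices described above.

import numpy as np

def trace_pathline(velocity, seed, dt=0.01, steps=500):
    """Integrate dx/dt = velocity(x, t) from a seed point with forward Euler."""
    x = np.asarray(seed, dtype=float)
    path = [x.copy()]
    for step in range(steps):
        x = x + dt * np.asarray(velocity(x, step * dt))
        path.append(x.copy())
    return np.array(path)

# Toy solid-body rotation field: particles move on circles around the origin.
rotation = lambda x, t: np.array([-x[1], x[0]])
path = trace_pathline(rotation, seed=(1.0, 0.0), dt=0.01, steps=628)
print(path[0], path[-1])  # starts at (1, 0) and comes back near it after time ~2*pi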

Cite as

Mark W. Hlawitschka, Fang Chen, Hans-Jörg Bart, and Bernd Hamann. CFD Simulation of Liquid-Liquid Extraction Columns and Visualization of Eulerian Datasets. In Visualization of Large and Unstructured Data Sets: Applications in Geospatial Planning, Modeling and Engineering - Proceedings of IRTG 1131 Workshop 2011. Open Access Series in Informatics (OASIcs), Volume 27, pp. 59-70, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)



@InProceedings{hlawitschka_et_al:OASIcs.VLUDS.2011.59,
  author =	{Hlawitschka, Mark W. and Chen, Fang and Bart, Hans-J\"{o}rg and Hamann, Bernd},
  title =	{{CFD Simulation of Liquid-Liquid Extraction Columns and Visualization of Eulerian Datasets}},
  booktitle =	{Visualization of Large and Unstructured Data Sets: Applications in Geospatial Planning, Modeling and Engineering - Proceedings of IRTG 1131 Workshop 2011},
  pages =	{59--70},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-46-0},
  ISSN =	{2190-6807},
  year =	{2012},
  volume =	{27},
  editor =	{Garth, Christoph and Middel, Ariane and Hagen, Hans},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.VLUDS.2011.59},
  URN =		{urn:nbn:de:0030-drops-37410},
  doi =		{10.4230/OASIcs.VLUDS.2011.59},
  annote =	{Keywords: computational fluid dynamics, multiphase fluid, droplet collision, Eulerian, path-line}
}
Document
Cross-Composition: A New Technique for Kernelization Lower Bounds

Authors: Hans L. Bodlaender, Bart M. P. Jansen, and Stefan Kratsch

Published in: LIPIcs, Volume 9, 28th International Symposium on Theoretical Aspects of Computer Science (STACS 2011)


Abstract
We introduce a new technique for proving kernelization lower bounds, called cross-composition. A classical problem L cross-composes into a parameterized problem Q if an instance of Q with polynomially bounded parameter value can express the logical OR of a sequence of instances of L. Building on work by Bodlaender et al. (ICALP 2008) and using a result by Fortnow and Santhanam (STOC 2008) we show that if an NP-hard problem cross-composes into a parameterized problem Q then Q does not admit a polynomial kernel unless the polynomial hierarchy collapses. Our technique generalizes and strengthens the recent techniques of using OR-composition algorithms and of transferring the lower bounds via polynomial parameter transformations. We show its applicability by proving kernelization lower bounds for a number of important graph problems with structural (non-standard) parameterizations, e.g., Chromatic Number, Clique, and Weighted Feedback Vertex Set do not admit polynomial kernels with respect to the vertex cover number of the input graphs unless the polynomial hierarchy collapses, contrasting the fact that these problems are trivially fixed-parameter tractable for this parameter. We have similar lower bounds for Feedback Vertex Set.

Cite as

Hans L. Bodlaender, Bart M. P. Jansen, and Stefan Kratsch. Cross-Composition: A New Technique for Kernelization Lower Bounds. In 28th International Symposium on Theoretical Aspects of Computer Science (STACS 2011). Leibniz International Proceedings in Informatics (LIPIcs), Volume 9, pp. 165-176, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2011)



@InProceedings{bodlaender_et_al:LIPIcs.STACS.2011.165,
  author =	{Bodlaender, Hans L. and Jansen, Bart M. P. and Kratsch, Stefan},
  title =	{{Cross-Composition: A New Technique for Kernelization Lower Bounds}},
  booktitle =	{28th International Symposium on Theoretical Aspects of Computer Science (STACS 2011)},
  pages =	{165--176},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-25-5},
  ISSN =	{1868-8969},
  year =	{2011},
  volume =	{9},
  editor =	{Schwentick, Thomas and D\"{u}rr, Christoph},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2011.165},
  URN =		{urn:nbn:de:0030-drops-30082},
  doi =		{10.4230/LIPIcs.STACS.2011.165},
  annote =	{Keywords: kernelization, lower bounds, parameterized complexity}
}
Document
Vertex Cover Kernelization Revisited: Upper and Lower Bounds for a Refined Parameter

Authors: Bart M. P. Jansen and Hans L. Bodlaender

Published in: LIPIcs, Volume 9, 28th International Symposium on Theoretical Aspects of Computer Science (STACS 2011)


Abstract
Kernelization is a concept that enables the formal mathematical analysis of data reduction through the framework of parameterized complexity. Intensive research into the Vertex Cover problem has shown that there is a preprocessing algorithm which, given an instance (G,k) of Vertex Cover, outputs an equivalent instance (G',k') in polynomial time with the guarantee that G' has at most 2k' vertices (and thus O((k')^2) edges) with k' <= k. Using the terminology of parameterized complexity we say that k-Vertex Cover has a kernel with 2k vertices. There is complexity-theoretic evidence that both 2k vertices and Theta(k^2) edges are optimal for the kernel size. In this paper we consider the Vertex Cover problem with a different parameter, the size fvs(G) of a minimum feedback vertex set for G. This refined parameter is structurally smaller than the parameter k associated to the vertex covering number VC(G) since fvs(G) <= VC(G) and the difference can be arbitrarily large. We give a kernel for Vertex Cover with a number of vertices that is cubic in fvs(G): an instance (G,X,k) of Vertex Cover, where X is a feedback vertex set for G, can be transformed in polynomial time into an equivalent instance (G',X',k') such that k' <= k, |X'| <= |X| and most importantly |V(G')| <= 2k and |V(G')| in O(|X'|^3). A similar result holds when the feedback vertex set X is not given along with the input. In sharp contrast we show that the Weighted Vertex Cover problem does not have a polynomial kernel when parameterized by fvs(G) unless the polynomial hierarchy collapses to the third level (PH=Sigma_3^p). Our work is one of the first examples of research in kernelization using a non-standard parameter, and shows that this approach can yield interesting computational insights. To obtain our results we make extensive use of the combinatorial structure of independent sets in forests.
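
A quick sanity check of the parameter relation fvs(G) <= VC(G) used above: deleting a vertex cover leaves an independent set, which contains no cycles, so every vertex cover is in particular a feedback vertex set. The brute-force snippet below verifies this on a small example; networkx and the helper name are assumptions.

from itertools import combinations
import networkx as nx

def minimum_vertex_cover(G):
    """Smallest vertex cover by brute force (exponential time, tiny graphs only)."""
    for size in range(G.number_of_nodes() + 1):
        for S in combinations(G.nodes, size):
            if all(u in S or v in S for u, v in G.edges):
                return set(S)

G = nx.complete_graph(4)                 # VC(K_4) = 3 while fvs(K_4) = 2
cover = minimum_vertex_cover(G)
rest = G.subgraph(set(G.nodes) - cover)  # what remains is an independent set
print(nx.is_forest(rest))                # True: the cover is also a feedback vertex set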

Cite as

Bart M. P. Jansen and Hans L. Bodlaender. Vertex Cover Kernelization Revisited: Upper and Lower Bounds for a Refined Parameter. In 28th International Symposium on Theoretical Aspects of Computer Science (STACS 2011). Leibniz International Proceedings in Informatics (LIPIcs), Volume 9, pp. 177-188, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2011)



@InProceedings{jansen_et_al:LIPIcs.STACS.2011.177,
  author =	{Jansen, Bart M. P. and Bodlaender, Hans L.},
  title =	{{Vertex Cover Kernelization Revisited: Upper and Lower Bounds for a Refined Parameter}},
  booktitle =	{28th International Symposium on Theoretical Aspects of Computer Science (STACS 2011)},
  pages =	{177--188},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-25-5},
  ISSN =	{1868-8969},
  year =	{2011},
  volume =	{9},
  editor =	{Schwentick, Thomas and D\"{u}rr, Christoph},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2011.177},
  URN =		{urn:nbn:de:0030-drops-30097},
  doi =		{10.4230/LIPIcs.STACS.2011.177},
  annote =	{Keywords: kernelization, lower bounds, parameterized complexity}
}
Document
Propagating and measuring anchor uncertainty in space-time prisms on road networks

Authors: Bart Kuijpers, Harvey J. Miller, Tijs Neutens, and Walied Othman

Published in: Dagstuhl Seminar Proceedings, Volume 8471, Geographic Privacy-Aware Knowledge Discovery and Delivery (2009)


Abstract
Space-time prisms capture all possible spatio-temporal locations of a moving object between sample points given speed limit constraints on its movement. These sample points are usually considered to be perfect measurements. In this paper we restrict ourselves to a road network and extend the notion of sample points to sample regions, which are bounded, sometimes disconnected, subsets of space-time wherein each point is a possible location, with its respective probability, where a moving object could have originated from or arrived in. This model allows us to capture measurement errors, multiple possible simultaneous locations and even flexibility of a moving object. We develop an algorithm that computes the envelope of all space-time prisms that have an anchor in these sample regions, and an algorithm that computes for any spatio-temporal point the probability with which a space-time prism, with anchors in these sample regions, contains that point. We implemented these algorithms in Mathematica to visualise all these newly-introduced concepts.
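
A simplified one-dimensional sketch of the prism definition in the first sentence (the paper works on road networks with probabilistic anchor regions; this toy keeps a single exact anchor pair on a line): a point (x, t) lies in the prism if it is reachable from the origin anchor and the destination anchor is still reachable from it under the speed limit.

def in_space_time_prism(x, t, origin, destination, vmax):
    """(x, t) lies in the 1D prism of anchors (x1, t1) -> (x2, t2) under speed limit vmax."""
    (x1, t1), (x2, t2) = origin, destination
    reachable_from_origin = abs(x - x1) <= vmax * (t - t1)
    can_still_reach_destination = abs(x2 - x) <= vmax * (t2 - t)
    return t1 <= t <= t2 and reachable_from_origin and can_still_reach_destination

origin, destination, vmax = (0.0, 0.0), (10.0, 60.0), 0.5
print(in_space_time_prism(5.0, 30.0, origin, destination, vmax))   # True: on the direct route
print(in_space_time_prism(-6.0, 30.0, origin, destination, vmax))  # False: cannot reach the destination in time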

Cite as

Bart Kuijpers, Harvey J. Miller, Tijs Neutens, and Walied Othman. Propagating and measuring anchor uncertainty in space-time prisms on road networks. In Geographic Privacy-Aware Knowledge Discovery and Delivery. Dagstuhl Seminar Proceedings, Volume 8471, pp. 1-35, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)



@InProceedings{kuijpers_et_al:DagSemProc.08471.2,
  author =	{Kuijpers, Bart and Miller, Harvey J. and Neutens, Tijs and Othman, Walied},
  title =	{{Propagating and measuring anchor uncertainty in space-time prisms on road networks}},
  booktitle =	{Geographic Privacy-Aware Knowledge Discovery and Delivery},
  pages =	{1--35},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2009},
  volume =	{8471},
  editor =	{Bart Kuijpers and Dino Pedreschi and Yucel Saygin and Stefano Spaccapietra},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08471.2},
  URN =		{urn:nbn:de:0030-drops-20072},
  doi =		{10.4230/DagSemProc.08471.2},
  annote =	{Keywords: Space-time prisms, beads, prisms, uncertainty, flexibility, time-geography}
}
Document
The Road from Panama to Keccak via RadioGatún

Authors: Guido Bertoni, Joan Daemen, Michaël Peeters, and Gilles Van Assche

Published in: Dagstuhl Seminar Proceedings, Volume 9031, Symmetric Cryptography (2009)


Abstract
In this presentation, we explain the design choices of Panama [1] and RadioGatun [2], which lead to Keccak [3]. After a brief recall of Panama, RadioGatun and the trail backtracking cost, we focus on three important aspects. First, we explain the role of the belt in the light of differential trails. Second, we discuss the relative advantages of a block mode hash function compared to a stream mode one. Finally, we point out why Panama and RadioGatun are not sponge functions and why their design philosophy differs from that of Keccak. [1] J. Daemen and C. S. K. Clapp, FSE 1998 [2] G. Bertoni et al., NIST Hash Workshop 2006 [3] G. Bertoni et al., SHA-3 submission, 2008

Cite as

Guido Bertoni, Joan Daemen, Michaël Peeters, and Gilles Van Assche. The Road from Panama to Keccak via RadioGatún. In Symmetric Cryptography. Dagstuhl Seminar Proceedings, Volume 9031, pp. 1-9, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)


Copy BibTex To Clipboard

@InProceedings{bertoni_et_al:DagSemProc.09031.17,
  author =	{Bertoni, Guido and Daemen, Joan and Peeters, Micha\"{e}l and Van Assche, Gilles},
  title =	{{The Road from Panama to Keccak via RadioGat\'{u}n}},
  booktitle =	{Symmetric Cryptography},
  pages =	{1--9},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2009},
  volume =	{9031},
  editor =	{Helena Handschuh and Stefan Lucks and Bart Preneel and Phillip Rogaway},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.09031.17},
  URN =		{urn:nbn:de:0030-drops-19587},
  doi =		{10.4230/DagSemProc.09031.17},
  annote =	{Keywords: Hash function, cryptography}
}