Search Results

Documents authored by Wlodarczyk, Michal


Document
On Problems Equivalent to (min,+)-Convolution

Authors: Marek Cygan, Marcin Mucha, Karol Wegrzycki, and Michal Wlodarczyk

Published in: LIPIcs, Volume 80, 44th International Colloquium on Automata, Languages, and Programming (ICALP 2017)


Abstract
In recent years, significant progress has been made in explaining the apparent hardness of improving over naive solutions for many fundamental polynomially solvable problems. This progress has come in the form of conditional lower bounds -- reductions from problems assumed to be hard, including 3SUM, All-Pairs Shortest Paths, SAT, and Orthogonal Vectors. In the (min,+)-convolution problem, the goal is to compute a sequence c, where c[k] = min_i a[i] + b[k-i], given sequences a and b. This can easily be done in O(n^2) time, but no O(n^{2-eps}) algorithm is known for any eps > 0. In this paper we undertake a systematic study of the (min,+)-convolution problem as a hardness assumption. As a first step, we establish the equivalence of this problem to a group of other problems, including variants of the classic knapsack problem and problems related to subadditive sequences. (min,+)-convolution has been used as a building block in algorithms for many problems, notably in stringology, and it has already appeared as an ad hoc hardness assumption. We investigate some of these connections and provide new reductions and other results.
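
For concreteness, the following is a minimal Python sketch of the naive quadratic algorithm mentioned in the abstract; the function name and indexing conventions are our own, not taken from the paper.

def min_plus_convolution(a, b):
    """Naive O(n^2) (min,+)-convolution: c[k] = min over i of a[i] + b[k-i].

    Assumes a and b have the same length n; the output c has length 2n - 1.
    """
    n = len(a)
    c = [float("inf")] * (2 * n - 1)
    for i in range(n):
        for j in range(n):
            # Entry k = i + j takes the minimum of a[i] + b[j] over all splits of k.
            c[i + j] = min(c[i + j], a[i] + b[j])
    return c

# Example: min_plus_convolution([0, 2, 5], [1, 1, 4]) returns [1, 1, 3, 6, 9].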

Cite as

Marek Cygan, Marcin Mucha, Karol Wegrzycki, and Michal Wlodarczyk. On Problems Equivalent to (min,+)-Convolution. In 44th International Colloquium on Automata, Languages, and Programming (ICALP 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 80, pp. 22:1-22:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{cygan_et_al:LIPIcs.ICALP.2017.22,
  author =	{Cygan, Marek and Mucha, Marcin and Wegrzycki, Karol and Wlodarczyk, Michal},
  title =	{{On Problems Equivalent to (min,+)-Convolution}},
  booktitle =	{44th International Colloquium on Automata, Languages, and Programming (ICALP 2017)},
  pages =	{22:1--22:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-041-5},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{80},
  editor =	{Chatzigiannakis, Ioannis and Indyk, Piotr and Kuhn, Fabian and Muscholl, Anca},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2017.22},
  URN =		{urn:nbn:de:0030-drops-74216},
  doi =		{10.4230/LIPIcs.ICALP.2017.22},
  annote =	{Keywords: fine-grained complexity, knapsack, conditional lower bounds, (min,+)-convolution, subquadratic equivalence}
}
Document
When the Optimum is also Blind: a New Perspective on Universal Optimization

Authors: Marek Adamczyk, Fabrizio Grandoni, Stefano Leonardi, and Michal Wlodarczyk

Published in: LIPIcs, Volume 80, 44th International Colloquium on Automata, Languages, and Programming (ICALP 2017)


Abstract
Consider the following variant of the set cover problem. We are given a universe U = {1, ..., n} and a collection of subsets C = {S_1, ..., S_m}, where each S_i is a subset of U. For every element u in U we need to choose a set phi(u) in C such that u belongs to phi(u). Once the mapping phi from U to C is constructed and fixed, a subset X of the universe U is revealed, and we must cover all elements of X with exactly phi(X), that is, {phi(u)}_{u in X}. The goal is to find a mapping such that the cover phi(X) is as cheap as possible.

This is an example of a universal problem, where the solution has to be created before the actual instance is revealed. Such problems appear naturally in settings where we need to optimize under uncertainty and it may be too expensive to start computing a good solution only once the input is revealed.

A rich body of work has been devoted to investigating such problems under worst-case analysis, i.e., measuring the quality of a solution by the worst-case ratio of the universal solution for a given instance to the optimum solution for the same instance. As the universal solution is significantly more constrained, this worst-case ratio is typically quite large. One way to make the problem less vulnerable to such extreme worst cases is to assume that the instance for which we will have to create a solution is drawn from some probability distribution; in this case one wants to minimize the expected ratio of the universal solution to the optimum solution. The bounds obtained here are indeed smaller than in the worst-case model, but we still compare apples to oranges, as no universal solution can match the optimum solution on every possible instance.

What if we instead compared our approximate universal solution against an optimal universal solution that obeys the same rules as we do? We show that under this viewpoint, still in the stochastic variant, we can indeed obtain better bounds than in the expected-ratio model. For example, for the set cover problem we obtain an H_n approximation, which matches the approximation ratio from the classic deterministic setting. Moreover, we show this for all probability distributions over U with polynomially large support, whereas all previous results pertained to a model in which elements were sampled independently. Our result is based on rounding a configuration IP that captures the optimal universal solution and on tools from submodular optimization. The same basic approach leads to improved approximation algorithms for other related problems, including Vertex Cover, Edge Cover, Directed Steiner Tree, Multicut, and Facility Location.
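
To make the model concrete, here is a minimal Python sketch that evaluates the cost of a fixed universal mapping phi on a revealed subset X; the data representation (dicts keyed by elements and set indices) is our own assumption and does not come from the paper.

def universal_cover_cost(phi, costs, X):
    """Cost of covering a revealed subset X with a fixed universal mapping.

    phi   : dict mapping each element u to the index of a set S_i containing u
    costs : dict mapping each set index to its cost
    X     : iterable of revealed elements

    The cover used is phi(X) = {phi(u) : u in X}; each chosen set is paid once.
    """
    chosen = {phi[u] for u in X}  # distinct sets actually used by the mapping
    return sum(costs[i] for i in chosen)

# Example: phi = {1: 0, 2: 0, 3: 1} and costs = {0: 5, 1: 3}.
# universal_cover_cost(phi, costs, {1, 2}) == 5 (set 0 is paid only once),
# universal_cover_cost(phi, costs, {2, 3}) == 8.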

Cite as

Marek Adamczyk, Fabrizio Grandoni, Stefano Leonardi, and Michal Wlodarczyk. When the Optimum is also Blind: a New Perspective on Universal Optimization. In 44th International Colloquium on Automata, Languages, and Programming (ICALP 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 80, pp. 35:1-35:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{adamczyk_et_al:LIPIcs.ICALP.2017.35,
  author =	{Adamczyk, Marek and Grandoni, Fabrizio and Leonardi, Stefano and Wlodarczyk, Michal},
  title =	{{When the Optimum is also Blind: a New Perspective on Universal Optimization}},
  booktitle =	{44th International Colloquium on Automata, Languages, and Programming (ICALP 2017)},
  pages =	{35:1--35:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-041-5},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{80},
  editor =	{Chatzigiannakis, Ioannis and Indyk, Piotr and Kuhn, Fabian and Muscholl, Anca},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2017.35},
  URN =		{urn:nbn:de:0030-drops-74436},
  doi =		{10.4230/LIPIcs.ICALP.2017.35},
  annote =	{Keywords: approximation algorithms, stochastic optimization, submodularity}
}
Document
Clifford Algebras Meet Tree Decompositions

Authors: Michal Wlodarczyk

Published in: LIPIcs, Volume 63, 11th International Symposium on Parameterized and Exact Computation (IPEC 2016)


Abstract
We introduce the Non-commutative Subset Convolution -- a convolution of functions useful when working with determinant-based algorithms. In order to compute it efficiently, we take advantage of Clifford algebras, a generalization of quaternions used mainly in quantum field theory. We apply this tool to speed up algorithms that count subgraphs parameterized by the treewidth of a graph. We present an O^*((2^omega + 1)^{tw})-time algorithm for counting Steiner trees and an O^*((2^omega + 2)^{tw})-time algorithm for counting Hamiltonian cycles, both of which improve the previously known upper bounds. The result for Steiner Tree also translates into a deterministic algorithm for Feedback Vertex Set. All of these constitute the best known running times of deterministic algorithms for the decision versions of these problems, and they match the best running times obtained for the pathwidth parameterization under the assumption omega = 2.
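
For background, the sketch below is the textbook O(3^n) evaluation of the classical (commutative) subset convolution; it illustrates what a subset convolution computes, but it is not the paper's non-commutative variant nor its fast Clifford-algebra evaluation.

def subset_convolution(f, g, n):
    """Naive subset convolution over subsets of {0, ..., n-1}:
        h[S] = sum over all T contained in S of f[T] * g[S - T] (set difference).

    f, g : lists of length 2^n indexed by subset bitmasks.
    Runs in O(3^n) time by enumerating all submasks T of each S.
    """
    h = [0] * (1 << n)
    for S in range(1 << n):
        T = S
        while True:
            h[S] += f[T] * g[S ^ T]  # S ^ T is the complement of T within S
            if T == 0:
                break
            T = (T - 1) & S  # standard submask enumeration trick
    return h

# Example with n = 2: f = g = [1, 1, 1, 1] gives h = [1, 2, 2, 4],
# i.e. h[S] counts the 2^|S| ordered splits (T, S - T) of S.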

Cite as

Michal Wlodarczyk. Clifford Algebras Meet Tree Decompositions. In 11th International Symposium on Parameterized and Exact Computation (IPEC 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 63, pp. 29:1-29:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{wlodarczyk:LIPIcs.IPEC.2016.29,
  author =	{Wlodarczyk, Michal},
  title =	{{Clifford Algebras Meet Tree Decompositions}},
  booktitle =	{11th International Symposium on Parameterized and Exact Computation (IPEC 2016)},
  pages =	{29:1--29:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-023-1},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{63},
  editor =	{Guo, Jiong and Hermelin, Danny},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2016.29},
  URN =		{urn:nbn:de:0030-drops-69260},
  doi =		{10.4230/LIPIcs.IPEC.2016.29},
  annote =	{Keywords: fixed-parameter tractability, treewidth, Clifford algebra, algebra isomorphism}
}