9 Search Results for "Koivisto, Mikko"


Document
Finding Optimal Triangulations Parameterized by Edge Clique Cover

Authors: Tuukka Korhonen

Published in: LIPIcs, Volume 180, 15th International Symposium on Parameterized and Exact Computation (IPEC 2020)


Abstract
Many graph problems can be formulated as the task of finding an optimal triangulation of a given graph with respect to some notion of optimality. In this paper we give algorithms for such problems parameterized by the size of a minimum edge clique cover (cc) of the graph. The parameter cc is both natural and well-motivated in many problems in this setting. For example, in the perfect phylogeny problem cc is at most the number of taxa, in fractional hypertreewidth cc is at most the number of hyperedges, and in treewidth of Bayesian networks cc is at most the number of non-root nodes of the Bayesian network. Our results are based on the framework of potential maximal cliques. We show that the number of minimal separators of graphs is at most 2^cc and the number of potential maximal cliques is at most 3^cc. Furthermore, these objects can be listed in times O^*(2^cc) and O^*(3^cc), respectively, even when no edge clique cover is given as input; the O^*(⋅) notation omits factors polynomial in the input size. Using these enumeration algorithms we obtain O^*(3^cc) time algorithms for problems in the potential maximal clique framework, including, for example, treewidth, minimum fill-in, and feedback vertex set. We also obtain an O^*(3^m) time algorithm for fractional hypertreewidth, where m is the number of hyperedges. In the case when an edge clique cover of size cc' is given as input we further improve the time complexity to O^*(2^cc') for treewidth, minimum fill-in, and chordal sandwich. This implies an O^*(2^n) time algorithm for perfect phylogeny, where n is the number of taxa. We also give polynomial-space algorithms with time complexities O^*(9^cc') and O^*(9^(cc + O(log^2 cc))) for problems in this framework.
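
For intuition on the Bayesian-network example above, the following minimal Python sketch (illustrative only; the function names are ours, and this is not an algorithm from the paper) verifies on a toy network that the families {v} ∪ parents(v) of the non-root nodes form an edge clique cover of the moral graph, which is why cc is at most the number of non-root nodes.

from itertools import combinations

def moral_graph_edges(parents):
    """parents: dict mapping node -> set of parents. Returns the moral graph's edges."""
    edges = set()
    for v, ps in parents.items():
        for p in ps:                                  # edge between a node and each parent
            edges.add(frozenset((v, p)))
        for p, q in combinations(sorted(ps), 2):      # "marry" the parents of v
            edges.add(frozenset((p, q)))
    return edges

def family_cliques(parents):
    """One clique {v} ∪ parents(v) per non-root node v."""
    return [frozenset({v} | ps) for v, ps in parents.items() if ps]

if __name__ == "__main__":
    # hypothetical toy network: a -> c, b -> c, c -> d
    parents = {"a": set(), "b": set(), "c": {"a", "b"}, "d": {"c"}}
    covered = set()
    for clique in family_cliques(parents):
        covered |= {frozenset(e) for e in combinations(sorted(clique), 2)}
    assert moral_graph_edges(parents) <= covered
    print(len(family_cliques(parents)), "cliques cover all moral-graph edges")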

Cite as

Tuukka Korhonen. Finding Optimal Triangulations Parameterized by Edge Clique Cover. In 15th International Symposium on Parameterized and Exact Computation (IPEC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 180, pp. 22:1-22:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{korhonen:LIPIcs.IPEC.2020.22,
  author =	{Korhonen, Tuukka},
  title =	{{Finding Optimal Triangulations Parameterized by Edge Clique Cover}},
  booktitle =	{15th International Symposium on Parameterized and Exact Computation (IPEC 2020)},
  pages =	{22:1--22:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-172-6},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{180},
  editor =	{Cao, Yixin and Pilipczuk, Marcin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2020.22},
  URN =		{urn:nbn:de:0030-drops-133253},
  doi =		{10.4230/LIPIcs.IPEC.2020.22},
  annote =	{Keywords: Treewidth, Minimum fill-in, Perfect phylogeny, Fractional hypertreewidth, Potential maximal cliques, Edge clique cover}
}
Document
Fast Multi-Subset Transform and Weighted Sums over Acyclic Digraphs

Authors: Mikko Koivisto and Antti Röyskö

Published in: LIPIcs, Volume 162, 17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020)


Abstract
The zeta and Moebius transforms over the subset lattice of n elements and the so-called subset convolution are examples of unary and binary operations on set functions. While their direct computation requires O(3ⁿ) arithmetic operations, less naive algorithms use only 2ⁿ poly(n) operations, nearly linear in the input size. Here, we investigate a related n-ary operation that takes n set functions as input and maps them to a new set function. This operation, which we call the multi-subset transform, is the core ingredient in the known inclusion-exclusion recurrence for weighted sums over acyclic digraphs, which extends Robinson's recurrence for the number of labelled acyclic digraphs. Prior to this work, the best known complexity bound for computing the multi-subset transform was the direct O(3ⁿ). By reducing the task to rectangular matrix multiplication, we improve the complexity to O(2.985ⁿ).
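
As background for the 2ⁿ poly(n) bound quoted above, here is a minimal Python sketch of the standard fast zeta transform over the subset lattice (a Yates-style sweep, one ground element at a time). It is not the paper's multi-subset transform, nor its matrix-multiplication reduction; the function name is ours.

def fast_zeta_transform(f, n):
    """Zeta transform on the subset lattice of {0, ..., n-1}: returns g with
    g[S] = sum of f[T] over all T ⊆ S, in O(2^n * n) additions instead of O(3^n)."""
    g = list(f)
    for i in range(n):                    # sweep over ground elements
        bit = 1 << i
        for S in range(1 << n):
            if S & bit:
                g[S] += g[S ^ bit]        # add the contribution of S without element i
    return g

if __name__ == "__main__":
    n = 3
    f = [1, 2, 3, 4, 5, 6, 7, 8]          # arbitrary set function, indexed by bitmask
    g = fast_zeta_transform(f, n)
    for S in range(1 << n):               # brute-force check against the definition
        assert g[S] == sum(f[T] for T in range(1 << n) if (T & S) == T)
    print(g)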

Cite as

Mikko Koivisto and Antti Röyskö. Fast Multi-Subset Transform and Weighted Sums over Acyclic Digraphs. In 17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 162, pp. 29:1-29:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{koivisto_et_al:LIPIcs.SWAT.2020.29,
  author =	{Koivisto, Mikko and R\"{o}ysk\"{o}, Antti},
  title =	{{Fast Multi-Subset Transform and Weighted Sums over Acyclic Digraphs}},
  booktitle =	{17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020)},
  pages =	{29:1--29:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-150-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{162},
  editor =	{Albers, Susanne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SWAT.2020.29},
  URN =		{urn:nbn:de:0030-drops-122768},
  doi =		{10.4230/LIPIcs.SWAT.2020.29},
  annote =	{Keywords: Bayesian networks, Moebius transform, Rectangular matrix multiplication, Subset convolution, Weighted counting of acyclic digraphs, Zeta transform}
}
Document
Patching Colors with Tensors

Authors: Cornelius Brand

Published in: LIPIcs, Volume 144, 27th Annual European Symposium on Algorithms (ESA 2019)


Abstract
We describe a generic way of exponentially speeding up algorithms which rely on Color-Coding by using the recently introduced technique of Extensor-Coding (Brand, Dell and Husfeldt, STOC 2018). To demonstrate the usefulness of this "patching" of Color-Coding algorithms, we apply it ad hoc to the exponential-space algorithms given in Gutin et al. (Journal Comp. Sys. Sci. 2018) and obtain the fastest known deterministic algorithms for, among others, the k-internal out-branching and k-internal spanning tree problems. To realize these technical advances, we make qualitative progress in a special case of the detection of multilinear monomials in multivariate polynomials: We give the first deterministic fixed-parameter tractable algorithm for the k-multilinear detection problem on a class of arithmetic circuits that may involve cancellations, as long as the computed polynomial is promised to satisfy a certain natural condition. Furthermore, we explore the limitations of using this very approach to speed up algorithms by determining exactly the dimension of a crucial subalgebra of extensors that arises naturally in the instantiation of the technique: It is equal to F_{2k+1}, the kth odd term in the Fibonacci sequence. To determine this dimension, we use tools from the theory of Gröbner bases, and the studied algebraic object may be of independent interest. We note that the asymptotic bound of F_{2k+1} ≈ phi^(2k) = O(2.619^k) curiously coincides with the running time bound on one of the fastest algorithms for the k-path problem based on representative sets due to Fomin et al. (JACM 2016). Here, phi is the golden ratio.
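
For context, the following hedged Python sketch shows the classic Color-Coding routine for k-path detection (Alon, Yuster, and Zwick) that the paper's Extensor-Coding patching speeds up; it is the textbook randomized method, not the paper's deterministic algorithm, and the function names are ours.

import random

def colorful_path_exists(adj, coloring, k):
    """Dynamic program over (vertex, set of colors): does the graph contain a simple
    path on k vertices whose colors are pairwise distinct under `coloring`?"""
    n = len(adj)
    # color subsets (as bitmasks) realizable by colorful paths ending at each vertex
    reachable = [{1 << coloring[v]} for v in range(n)]
    for _ in range(k - 1):
        nxt = [set() for _ in range(n)]
        for u in range(n):
            for S in reachable[u]:
                for v in adj[u]:
                    c = 1 << coloring[v]
                    if not S & c:
                        nxt[v].add(S | c)
        reachable = nxt
    full = (1 << k) - 1
    return any(full in r for r in reachable)

def k_path_exists(adj, k, trials=300):
    """Classic Color-Coding: color vertices uniformly at random with k colors; a fixed
    k-path becomes colorful with probability k!/k^k, so enough trials find it w.h.p."""
    n = len(adj)
    return any(colorful_path_exists(adj, [random.randrange(k) for _ in range(n)], k)
               for _ in range(trials))

if __name__ == "__main__":
    # a path 0-1-2-3 plus a disjoint edge 4-5
    adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2], 4: [5], 5: [4]}
    print(k_path_exists(adj, 4))   # True with high probability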

Cite as

Cornelius Brand. Patching Colors with Tensors. In 27th Annual European Symposium on Algorithms (ESA 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 144, pp. 25:1-25:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{brand:LIPIcs.ESA.2019.25,
  author =	{Brand, Cornelius},
  title =	{{Patching Colors with Tensors}},
  booktitle =	{27th Annual European Symposium on Algorithms (ESA 2019)},
  pages =	{25:1--25:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-124-5},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{144},
  editor =	{Bender, Michael A. and Svensson, Ola and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2019.25},
  URN =		{urn:nbn:de:0030-drops-111467},
  doi =		{10.4230/LIPIcs.ESA.2019.25},
  annote =	{Keywords: Color-Coding, Extensor-Coding, internal out-branching, colorful problems, algebraic algorithms, multilinear detection, deterministic algorithms, exterior algebra}
}
Document
A Faster Tree-Decomposition Based Algorithm for Counting Linear Extensions

Authors: Kustaa Kangas, Mikko Koivisto, and Sami Salonen

Published in: LIPIcs, Volume 115, 13th International Symposium on Parameterized and Exact Computation (IPEC 2018)


Abstract
We consider the problem of counting the linear extensions of an n-element poset whose cover graph has treewidth at most t. We show that the problem can be solved in time O~(n^{t+3}), where O~ suppresses logarithmic factors. Our algorithm is based on fast multiplication of multivariate polynomials, and so differs radically from a previous O~(n^{t+4})-time inclusion-exclusion algorithm. We also investigate the algorithm from a practical point of view. We observe that the running time is not well characterized by the parameters n and t alone: fixing them still leaves large variance in running times due to uncontrolled features of the selected optimal-width tree decomposition. For selecting an efficient tree decomposition we adopt the method of empirical hardness models, and show that it typically enables picking a tree decomposition that is significantly more efficient than a random optimal-width tree decomposition.
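
For readers unfamiliar with the problem, here is a minimal Python baseline that counts linear extensions by dynamic programming over the downsets of the poset in O(2^n * n) time. It is only an illustration of the counting task, not the paper's O~(n^{t+3}) tree-decomposition-based algorithm, and the names are ours.

from functools import lru_cache

def count_linear_extensions(n, less_than):
    """Count linear extensions of a poset on elements 0..n-1 by a DP over downsets:
    a state is the set of elements already placed in the prefix of the linear order.
    `less_than` is a set of pairs (a, b) meaning a < b in the poset."""
    preds = [0] * n
    for a, b in less_than:
        preds[b] |= 1 << a                          # elements that must precede b

    @lru_cache(maxsize=None)
    def ways(placed):
        if placed == (1 << n) - 1:
            return 1
        total = 0
        for x in range(n):
            if not (placed >> x & 1) and (preds[x] & ~placed) == 0:
                total += ways(placed | 1 << x)      # x is a minimal remaining element
        return total

    return ways(0)

if __name__ == "__main__":
    # the "N"-shaped poset: 0 < 2, 1 < 2, 1 < 3
    print(count_linear_extensions(4, {(0, 2), (1, 2), (1, 3)}))   # 5 linear extensions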

Cite as

Kustaa Kangas, Mikko Koivisto, and Sami Salonen. A Faster Tree-Decomposition Based Algorithm for Counting Linear Extensions. In 13th International Symposium on Parameterized and Exact Computation (IPEC 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 115, pp. 5:1-5:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{kangas_et_al:LIPIcs.IPEC.2018.5,
  author =	{Kangas, Kustaa and Koivisto, Mikko and Salonen, Sami},
  title =	{{A Faster Tree-Decomposition Based Algorithm for Counting Linear Extensions}},
  booktitle =	{13th International Symposium on Parameterized and Exact Computation (IPEC 2018)},
  pages =	{5:1--5:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-084-2},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{115},
  editor =	{Paul, Christophe and Pilipczuk, Michal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2018.5},
  URN =		{urn:nbn:de:0030-drops-102062},
  doi =		{10.4230/LIPIcs.IPEC.2018.5},
  annote =	{Keywords: Algorithm selection, empirical hardness, linear extension, multiplication of polynomials, tree decomposition}
}
Document
Counting Connected Subgraphs with Maximum-Degree-Aware Sieving

Authors: Andreas Björklund, Thore Husfeldt, Petteri Kaski, and Mikko Koivisto

Published in: LIPIcs, Volume 123, 29th International Symposium on Algorithms and Computation (ISAAC 2018)


Abstract
We study the problem of counting the isomorphic occurrences of a k-vertex pattern graph P as a subgraph in an n-vertex host graph G. Our specific interest is in algorithms for subgraph counting that are sensitive to the maximum degree Delta of the host graph. Assuming that the pattern graph P is connected and admits a vertex balancer of size b, we present an algorithm that counts the occurrences of P in G in O((2 Delta - 2)^{(k+b)/2} 2^{-b} (n/Delta) k^2 log n) time. We define a balancer as a vertex separator of P that can be represented as an intersection of two equal-size vertex subsets, the union of which is the vertex set of P, and both of which induce connected subgraphs of P. A corollary of our main result is that we can count the number of k-vertex paths in an n-vertex graph in O((2 Delta - 2)^{floor(k/2)} n k^2 log n) time, which for all moderately dense graphs with Delta <= n^{1/3} improves on the recent breakthrough work of Curticapean, Dell, and Marx [STOC 2017], who show how to count the isomorphic occurrences of a q-edge pattern graph as a subgraph in an n-vertex host graph in time O(q^q n^{0.17q}) for all large enough q. Another recent result of Brand, Dell, and Husfeldt [STOC 2018] shows that k-vertex paths in a bounded-degree graph can be approximately counted in O(4^k n) time. Our result shows that the exact count can be recovered at least as fast for Delta < 10. Our algorithm is based on the principle of inclusion and exclusion, and can be viewed as a sparsity-sensitive version of the "counting in halves" approach explored by Björklund, Husfeldt, Kaski, and Koivisto [ESA 2009].
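
To illustrate why sensitivity to the maximum degree Delta matters, here is a naive Python baseline that counts k-vertex paths by depth-first branching, whose cost is roughly n * Delta^(k-1); it is emphatically not the paper's sieving algorithm, and the names are ours.

def count_k_vertex_paths(adj, k):
    """Brute-force baseline: count simple paths on k vertices (k >= 2) by depth-first
    branching. Each extension step branches into at most Delta neighbours; every
    undirected path is found once from each of its two ends, hence the division by 2."""
    def extend(v, visited, remaining):
        if remaining == 0:
            return 1
        return sum(extend(w, visited | {w}, remaining - 1)
                   for w in adj[v] if w not in visited)

    directed = sum(extend(v, {v}, k - 1) for v in adj)
    return directed // 2

if __name__ == "__main__":
    # 4-cycle 0-1-2-3-0 contains exactly four paths on 3 vertices
    adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
    print(count_k_vertex_paths(adj, 3))   # 4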

Cite as

Andreas Björklund, Thore Husfeldt, Petteri Kaski, and Mikko Koivisto. Counting Connected Subgraphs with Maximum-Degree-Aware Sieving. In 29th International Symposium on Algorithms and Computation (ISAAC 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 123, pp. 17:1-17:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{bjorklund_et_al:LIPIcs.ISAAC.2018.17,
  author =	{Bj\"{o}rklund, Andreas and Husfeldt, Thore and Kaski, Petteri and Koivisto, Mikko},
  title =	{{Counting Connected Subgraphs with Maximum-Degree-Aware Sieving}},
  booktitle =	{29th International Symposium on Algorithms and Computation (ISAAC 2018)},
  pages =	{17:1--17:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-094-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{123},
  editor =	{Hsu, Wen-Lian and Lee, Der-Tsai and Liao, Chung-Shou},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2018.17},
  URN =		{urn:nbn:de:0030-drops-99655},
  doi =		{10.4230/LIPIcs.ISAAC.2018.17},
  annote =	{Keywords: graph embedding, k-path, subgraph counting, maximum degree}
}
Document
Dense Subset Sum May Be the Hardest

Authors: Per Austrin, Petteri Kaski, Mikko Koivisto, and Jesper Nederlof

Published in: LIPIcs, Volume 47, 33rd Symposium on Theoretical Aspects of Computer Science (STACS 2016)


Abstract
The SUBSET SUM problem asks whether a given set of n positive integers contains a subset of elements that sum up to a given target t. It is an outstanding open question whether the O^*(2^{n/2})-time algorithm for SUBSET SUM by Horowitz and Sahni [J. ACM 1974] can be beaten in the worst-case setting by a "truly faster", O^*(2^{(0.5-delta)*n})-time algorithm, with some constant delta > 0. Continuing an earlier work [STACS 2015], we study SUBSET SUM parameterized by the maximum bin size beta, defined as the largest number of subsets of the n input integers that yield the same sum. For every epsilon > 0 we give a truly faster algorithm for instances with beta <= 2^{(0.5-epsilon)*n}, as well as instances with beta >= 2^{0.661n}. Consequently, we also obtain a characterization in terms of the popular density parameter n/log_2(t): if all instances of density at least 1.003 admit a truly faster algorithm, then so does every instance. This goes against the current intuition that instances of density 1 are the hardest, and therefore is a step toward answering the open question in the affirmative. Our results stem from a novel combinatorial analysis of mixings of earlier algorithms for SUBSET SUM and a study of an extremal question in additive combinatorics connected to the problem of Uniquely Decodable Code Pairs in information theory.
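
The Horowitz-Sahni meet-in-the-middle algorithm referenced above can be sketched in a few lines of Python; this decision-only version uses a hash set where the classical algorithm scans two sorted lists, and the function names are ours.

def subset_sums(items):
    """All 2^len(items) subset sums of the given list."""
    sums = [0]
    for x in items:
        sums += [s + x for s in sums]
    return sums

def subset_sum_mitm(values, t):
    """Horowitz-Sahni meet-in-the-middle, O*(2^(n/2)): split the input in half,
    enumerate all subset sums of each half, and look for a pair adding up to t."""
    half = len(values) // 2
    right_sums = set(subset_sums(values[half:]))
    return any(t - s in right_sums for s in subset_sums(values[:half]))

if __name__ == "__main__":
    print(subset_sum_mitm([3, 34, 4, 12, 5, 2], 9))    # True: 4 + 5 = 9
    print(subset_sum_mitm([3, 34, 4, 12, 5, 2], 30))   # False: no subset sums to 30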

Cite as

Per Austrin, Petteri Kaski, Mikko Koivisto, and Jesper Nederlof. Dense Subset Sum May Be the Hardest. In 33rd Symposium on Theoretical Aspects of Computer Science (STACS 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 47, pp. 13:1-13:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{austrin_et_al:LIPIcs.STACS.2016.13,
  author =	{Austrin, Per and Kaski, Petteri and Koivisto, Mikko and Nederlof, Jesper},
  title =	{{Dense Subset Sum May Be the Hardest}},
  booktitle =	{33rd Symposium on Theoretical Aspects of Computer Science (STACS 2016)},
  pages =	{13:1--13:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-001-9},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{47},
  editor =	{Ollinger, Nicolas and Vollmer, Heribert},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2016.13},
  URN =		{urn:nbn:de:0030-drops-57143},
  doi =		{10.4230/LIPIcs.STACS.2016.13},
  annote =	{Keywords: subset sum, additive combinatorics, exponential-time algorithm, homomorphic hashing, Littlewood--Offord problem}
}
Document
Subset Sum in the Absence of Concentration

Authors: Per Austrin, Petteri Kaski, Mikko Koivisto, and Jesper Nederlof

Published in: LIPIcs, Volume 30, 32nd International Symposium on Theoretical Aspects of Computer Science (STACS 2015)


Abstract
We study the exact time complexity of the Subset Sum problem. Our focus is on instances that lack additive structure in the sense that the sums one can form from the subsets of the given integers are not strongly concentrated on any particular integer value. We present a randomized algorithm that runs in O(2^{0.3399n} B^4) time on instances with the property that no value can arise as a sum of more than B different subsets of the n given integers.
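
The concentration parameter B can be computed by brute force on tiny instances, as in the following Python sketch (exponential time and memory; illustrative only, not part of the paper's algorithm, and the names are ours).

from collections import Counter

def max_bin_size(values):
    """Largest number of distinct subsets of `values` that yield the same sum
    (the parameter B above)."""
    counts = Counter([0])                       # the empty subset has sum 0
    for x in values:
        with_x = Counter({s + x: c for s, c in counts.items()})
        counts = counts + with_x                # each old subset: exclude or include x
    return max(counts.values())

if __name__ == "__main__":
    print(max_bin_size([1, 2, 3]))              # 2: both {3} and {1, 2} sum to 3
    print(max_bin_size([1, 2, 4, 8]))           # 1: all 16 subset sums are distinct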

Cite as

Per Austrin, Petteri Kaski, Mikko Koivisto, and Jesper Nederlof. Subset Sum in the Absence of Concentration. In 32nd International Symposium on Theoretical Aspects of Computer Science (STACS 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 30, pp. 48-61, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{austrin_et_al:LIPIcs.STACS.2015.48,
  author =	{Austrin, Per and Kaski, Petteri and Koivisto, Mikko and Nederlof, Jesper},
  title =	{{Subset Sum in the Absence of Concentration}},
  booktitle =	{32nd International Symposium on Theoretical Aspects of Computer Science (STACS 2015)},
  pages =	{48--61},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-78-1},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{30},
  editor =	{Mayr, Ernst W. and Ollinger, Nicolas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2015.48},
  URN =		{urn:nbn:de:0030-drops-49034},
  doi =		{10.4230/LIPIcs.STACS.2015.48},
  annote =	{Keywords: subset sum, additive combinatorics, exponential-time algorithm, homomorphic hashing, Littlewood--Offord problem}
}
Document
08431 Open Problems – Moderately Exponential Time Algorithms

Authors: Fedor V. Fomin, Kazuo Iwama, Dieter Kratsch, Petteri Kaski, Mikko Koivisto, Lukasz Kowalik, Yoshio Okamoto, Johan van Rooij, and Ryan Williams

Published in: Dagstuhl Seminar Proceedings, Volume 8431, Moderately Exponential Time Algorithms (2008)


Abstract
Two problem sessions were part of the seminar on Moderately Exponential Time Algorithms. Some of the open problems presented at those sessions have been collected.

Cite as

Fedor V. Fomin, Kazuo Iwama, Dieter Kratsch, Petteri Kaski, Mikko Koivisto, Lukasz Kowalik, Yoshio Okamoto, Johan van Rooij, and Ryan Williams. 08431 Open Problems – Moderately Exponential Time Algorithms. In Moderately Exponential Time Algorithms. Dagstuhl Seminar Proceedings, Volume 8431, pp. 1-8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{fomin_et_al:DagSemProc.08431.3,
  author =	{Fomin, Fedor V. and Iwama, Kazuo and Kratsch, Dieter and Kaski, Petteri and Koivisto, Mikko and Kowalik, Lukasz and Okamoto, Yoshio and van Rooij, Johan and Williams, Ryan},
  title =	{{08431 Open Problems – Moderately Exponential Time Algorithms}},
  booktitle =	{Moderately Exponential Time Algorithms},
  pages =	{1--8},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2008},
  volume =	{8431},
  editor =	{Fedor V. Fomin and Kazuo Iwama and Dieter Kratsch},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.08431.3},
  URN =		{urn:nbn:de:0030-drops-17986},
  doi =		{10.4230/DagSemProc.08431.3},
  annote =	{Keywords: Algorithms, NP-hard problems, Moderately Exponential Time Algorithms}
}
Document
Trimmed Moebius Inversion and Graphs of Bounded Degree

Authors: Andreas Björklund, Thore Husfeldt, Petteri Kaski, and Mikko Koivisto

Published in: LIPIcs, Volume 1, 25th International Symposium on Theoretical Aspects of Computer Science (2008)


Abstract
We study ways to expedite Yates's algorithm for computing the zeta and Moebius transforms of a function defined on the subset lattice. We develop a trimmed variant of Moebius inversion that proceeds point by point, finishing the calculation at a subset before considering its supersets. For an n-element universe U and a family F of its subsets, trimmed Moebius inversion allows us to compute the number of packings, coverings, and partitions of U with k sets from F in time within a polynomial factor (in n) of the number of supersets of the members of F. Relying on an intersection theorem of Chung et al. (1986) to bound the sizes of set families, we apply these ideas to well-studied combinatorial optimisation problems on graphs of maximum degree Delta. In particular, we show how to compute the Domatic Number in time within a polynomial factor of (2^{Delta+1} - 2)^{n/(Delta+1)} and the Chromatic Number in time within a polynomial factor of (2^{Delta+1} - Delta - 1)^{n/(Delta+1)}. For any constant Delta, these bounds are O((2 - epsilon)^n) for epsilon > 0 independent of the number of vertices n.
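
As background, the untrimmed inclusion-exclusion baseline that the trimming improves on can be sketched as follows in Python; here "covers" are read as ordered k-tuples of (possibly repeated) family members whose union is the universe, which is one standard way such counts are set up, and the function name is ours.

def count_ordered_k_covers(n, family, k):
    """Number of ordered k-tuples (S_1, ..., S_k) of members of `family` (bitmasks over
    {0, ..., n-1}, repetition allowed) whose union is the whole universe, computed by
    inclusion-exclusion: sum over X ⊆ U of (-1)^{|U \\ X|} * a(X)^k, where a(X) is the
    number of family members contained in X. Untrimmed: visits all 2^n subsets."""
    U = (1 << n) - 1
    total = 0
    for X in range(1 << n):
        a = sum(1 for S in family if (S & ~X) == 0)   # members contained in X
        sign = -1 if bin(U & ~X).count("1") % 2 else 1
        total += sign * a ** k
    return total

if __name__ == "__main__":
    # universe {0, 1, 2}; family members as bitmasks: {0,1}, {1,2}, {2}
    family = [0b011, 0b110, 0b100]
    print(count_ordered_k_covers(3, family, 2))       # 4 ordered pairs cover {0,1,2}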

Cite as

Andreas Björklund, Thore Husfeldt, Petteri Kaski, and Mikko Koivisto. Trimmed Moebius Inversion and Graphs of Bounded Degree. In 25th International Symposium on Theoretical Aspects of Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 1, pp. 85-96, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{bjorklund_et_al:LIPIcs.STACS.2008.1336,
  author =	{Bj\"{o}rklund, Andreas and Husfeldt, Thore and Kaski, Petteri and Koivisto, Mikko},
  title =	{{Trimmed Moebius Inversion and Graphs of Bounded Degree}},
  booktitle =	{25th International Symposium on Theoretical Aspects of Computer Science},
  pages =	{85--96},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-06-4},
  ISSN =	{1868-8969},
  year =	{2008},
  volume =	{1},
  editor =	{Albers, Susanne and Weil, Pascal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2008.1336},
  URN =		{urn:nbn:de:0030-drops-13369},
  doi =		{10.4230/LIPIcs.STACS.2008.1336},
  annote =	{Keywords: }
}
  • Refine by Author
  • 7 Koivisto, Mikko
  • 5 Kaski, Petteri
  • 2 Austrin, Per
  • 2 Björklund, Andreas
  • 2 Husfeldt, Thore
  • Show More...

  • Refine by Classification
  • 3 Theory of computation → Parameterized complexity and exact algorithms
  • 2 Theory of computation → Graph algorithms analysis
  • 1 Computing methodologies → Algebraic algorithms
  • 1 Mathematics of computing → Graph algorithms
  • 1 Theory of computation → Algorithm design techniques
  • Show More...

  • Refine by Keyword
  • 2 additive combinatorics
  • 2 exponential-time algorithm
  • 2 subset sum
  • 1 Algorithm selection
  • 1 Algorithms
  • Show More...

  • Refine by Type
  • 9 document

  • Refine by Publication Year
  • 2 2008
  • 2 2019
  • 2 2020
  • 1 2015
  • 1 2016
  • Show More...
