Search Results

Documents authored by Rizzi, Romeo


Document
Generalizing Roberts' Characterization of Unit Interval Graphs

Authors: Virginia Ardévol Martínez, Romeo Rizzi, Abdallah Saffidine, Florian Sikora, and Stéphane Vialette

Published in: LIPIcs, Volume 306, 49th International Symposium on Mathematical Foundations of Computer Science (MFCS 2024)


Abstract
For any natural number d, a graph G is a (disjoint) d-interval graph if it is the intersection graph of (disjoint) d-intervals, a d-interval being the union of d (disjoint) intervals on the real line. Two important subclasses of d-interval graphs are unit and balanced d-interval graphs (where every interval has unit length, or all the intervals associated with the same vertex have the same length, respectively). A celebrated result by Roberts characterizes unit interval graphs as exactly the claw-free interval graphs. Here, we study the generalization of this characterization to d-interval graphs. In particular, we prove that for any d ⩾ 2, if G is a K_{1,2d+1}-free interval graph, then G is a unit d-interval graph. However, somewhat surprisingly, under the same assumptions, G is not always a disjoint unit d-interval graph. This implies that the class of disjoint unit d-interval graphs is strictly included in the class of unit d-interval graphs. Finally, we study the relationships between the classes obtained under disjoint and non-disjoint d-intervals in the balanced case and show that the classes of disjoint balanced 2-intervals and balanced 2-intervals coincide, but this is no longer true for d > 2.
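The claw-freeness in Roberts' characterization, and the K_{1,2d+1}-freeness studied here, can be tested directly from the definition. The sketch below is a brute-force illustration of that structural condition only (not a recognition algorithm from the paper); the function name and the adjacency-set encoding of the graph are our own assumptions.

from itertools import combinations

def has_induced_star(adj, s):
    """Return True if the graph contains an induced K_{1,s}: a centre vertex
    together with s pairwise non-adjacent neighbours.
    adj: dict mapping each vertex to the set of its neighbours."""
    for centre, neighbours in adj.items():
        for leaves in combinations(neighbours, s):
            # The s chosen neighbours must be pairwise non-adjacent.
            if all(v not in adj[u] for u, v in combinations(leaves, 2)):
                return True
    return False

# A claw (K_{1,3}) is found for s = 3, so this interval graph is not a unit
# interval graph by Roberts' characterization.
claw = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
print(has_induced_star(claw, 3))   # True
print(has_induced_star(claw, 4))   # False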

Cite as

Virginia Ardévol Martínez, Romeo Rizzi, Abdallah Saffidine, Florian Sikora, and Stéphane Vialette. Generalizing Roberts' Characterization of Unit Interval Graphs. In 49th International Symposium on Mathematical Foundations of Computer Science (MFCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 306, pp. 12:1-12:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{ardevolmartinez_et_al:LIPIcs.MFCS.2024.12,
  author =	{Ard\'{e}vol Mart{\'\i}nez, Virginia and Rizzi, Romeo and Saffidine, Abdallah and Sikora, Florian and Vialette, St\'{e}phane},
  title =	{{Generalizing Roberts' Characterization of Unit Interval Graphs}},
  booktitle =	{49th International Symposium on Mathematical Foundations of Computer Science (MFCS 2024)},
  pages =	{12:1--12:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-335-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{306},
  editor =	{Kr\'{a}lovi\v{c}, Rastislav and Ku\v{c}era, Anton{\'\i}n},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2024.12},
  URN =		{urn:nbn:de:0030-drops-205687},
  doi =		{10.4230/LIPIcs.MFCS.2024.12},
  annote =	{Keywords: Interval graphs, Multiple Interval Graphs, Unit Interval Graphs, Characterization}
}
Document
Accelerating ILP Solvers for Minimum Flow Decompositions Through Search Space and Dimensionality Reductions

Authors: Andreas Grigorjew, Fernando H. C. Dias, Andrea Cracco, Romeo Rizzi, and Alexandru I. Tomescu

Published in: LIPIcs, Volume 301, 22nd International Symposium on Experimental Algorithms (SEA 2024)


Abstract
Given a flow network, the Minimum Flow Decomposition (MFD) problem asks for the smallest possible set of weighted paths whose superposition equals the flow. It is a classical, strongly NP-hard problem that has proven useful in RNA transcript assembly and in applications outside of Bioinformatics. We improve an existing ILP (Integer Linear Programming) model by Dias et al. [RECOMB 2022] for DAGs by decreasing the solver's search space using solution safety and several other optimizations. This results in a significant speedup compared to the original ILP, of up to 34× on average on the hardest instances. Moreover, we show that our optimizations also apply to MFD problem variants, resulting in speedups of up to 219× on the hardest instances. We also developed an ILP model of reduced dimensionality for an MFD variant in which the solution path weights are restricted to a given set. This model can find an optimal MFD solution for most instances, and overall, its accuracy significantly outperforms that of previous greedy algorithms while being up to an order of magnitude faster than our optimized ILP.
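For contrast with the exact ILP approach, one common greedy baseline for flow decomposition repeatedly removes the source-to-sink path of largest bottleneck value. The sketch below illustrates that baseline on a DAG (it is not the authors' ILP model; the dictionary encoding of the flow, the function names, and the assumption of flow conservation are ours).

from math import inf

def greedy_decomposition(flow, s, t):
    """Greedy flow decomposition of a DAG: repeatedly remove the s-t path of
    maximum bottleneck weight.  flow: {u: {v: value}} with non-negative
    integers obeying flow conservation; returns (path, weight) pairs whose
    superposition equals the flow (the number of paths is heuristic)."""
    nodes = set(flow) | {v for u in flow for v in flow[u]}
    succ = {u: dict(flow.get(u, {})) for u in nodes}
    # Topological order via Kahn's algorithm.
    indeg = {u: 0 for u in nodes}
    for u in nodes:
        for v in succ[u]:
            indeg[v] += 1
    order, queue = [], [u for u in nodes if indeg[u] == 0]
    while queue:
        u = queue.pop()
        order.append(u)
        for v in succ[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)

    paths = []
    while any(w > 0 for nb in succ.values() for w in nb.values()):
        # DP over the topological order: best[v] = largest bottleneck of any
        # s-v path using only edges with remaining positive flow.
        best = {u: 0 for u in nodes}
        best[s] = inf
        parent = {}
        for u in order:
            for v, w in succ[u].items():
                if w > 0 and min(best[u], w) > best[v]:
                    best[v] = min(best[u], w)
                    parent[v] = u
        weight = best[t]
        path, v = [t], t
        while v != s:
            v = parent[v]
            path.append(v)
        path.reverse()
        for u, v in zip(path, path[1:]):
            succ[u][v] -= weight
        paths.append((path, weight))
    return paths

# Example: a small DAG whose flow decomposes into two weighted paths.
f = {"s": {"a": 3, "b": 2}, "a": {"t": 3}, "b": {"t": 2}, "t": {}}
print(greedy_decomposition(f, "s", "t"))
# [(['s', 'a', 't'], 3), (['s', 'b', 't'], 2)]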

Cite as

Andreas Grigorjew, Fernando H. C. Dias, Andrea Cracco, Romeo Rizzi, and Alexandru I. Tomescu. Accelerating ILP Solvers for Minimum Flow Decompositions Through Search Space and Dimensionality Reductions. In 22nd International Symposium on Experimental Algorithms (SEA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 301, pp. 14:1-14:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{grigorjew_et_al:LIPIcs.SEA.2024.14,
  author =	{Grigorjew, Andreas and Dias, Fernando H. C. and Cracco, Andrea and Rizzi, Romeo and Tomescu, Alexandru I.},
  title =	{{Accelerating ILP Solvers for Minimum Flow Decompositions Through Search Space and Dimensionality Reductions}},
  booktitle =	{22nd International Symposium on Experimental Algorithms (SEA 2024)},
  pages =	{14:1--14:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-325-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{301},
  editor =	{Liberti, Leo},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SEA.2024.14},
  URN =		{urn:nbn:de:0030-drops-203792},
  doi =		{10.4230/LIPIcs.SEA.2024.14},
  annote =	{Keywords: Flow decomposition, Integer Linear Programming, Safety, RNA-seq, RNA transcript assembly, isoform}
}
Document
Recognizing Unit Multiple Intervals Is Hard

Authors: Virginia Ardévol Martínez, Romeo Rizzi, Florian Sikora, and Stéphane Vialette

Published in: LIPIcs, Volume 283, 34th International Symposium on Algorithms and Computation (ISAAC 2023)


Abstract
Multiple interval graphs are a well-known generalization of interval graphs introduced in the 1970s to deal with situations arising naturally in scheduling and allocation. A d-interval is the union of d intervals on the real line, and a graph is a d-interval graph if it is the intersection graph of d-intervals. In particular, it is a unit d-interval graph if it admits a d-interval representation in which every interval has unit length. Whereas it has been known for a long time that recognizing 2-interval graphs and other related classes such as 2-track interval graphs is NP-complete, the complexity of recognizing unit 2-interval graphs remained open. Here, we settle this question by proving that the recognition of unit 2-interval graphs is also NP-complete. Our proof technique uses a completely different approach from the other hardness results for recognizing related classes. Furthermore, we extend the result to unit d-interval graphs for any d ⩾ 2, which does not follow directly, as is often the case for graph recognition problems: as an example, it took almost 20 years to close the gap between d = 2 and d > 2 for the recognition of d-track interval graphs. Our result has several implications, including that recognizing (x, …, x) d-interval graphs and depth r unit 2-interval graphs is NP-complete for every x ⩾ 11 and every r ⩾ 4.
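Membership in NP rests on the fact that a proposed representation is easy to verify: given d unit intervals per vertex, compare the resulting intersection graph with the input graph. The sketch below is a minimal certificate check built from the definitions in the abstract (the encodings, with intervals given by their left endpoints, and the function name are our own assumptions, not the paper's reduction).

from itertools import combinations

def is_unit_d_interval_representation(adj, rep, d):
    """Check that rep is a unit d-interval representation of the graph adj.
    adj: {vertex: set of neighbours}; rep: {vertex: list of d left endpoints},
    each vertex being the union of the closed unit intervals [x, x+1]."""
    def intersect(u, v):
        # Two unit intervals [x, x+1] and [y, y+1] meet iff |x - y| <= 1.
        return any(abs(x - y) <= 1 for x in rep[u] for y in rep[v])

    if any(len(rep[v]) != d for v in adj):
        return False
    return all((v in adj[u]) == intersect(u, v) for u, v in combinations(adj, 2))

# Example: the path a - b - c represented by unit 2-intervals.
g = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}
r = {"a": [0.0, 10.0], "b": [1.0, 20.0], "c": [2.0, 30.0]}
print(is_unit_d_interval_representation(g, r, 2))   # True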

Cite as

Virginia Ardévol Martínez, Romeo Rizzi, Florian Sikora, and Stéphane Vialette. Recognizing Unit Multiple Intervals Is Hard. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 8:1-8:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{ardevolmartinez_et_al:LIPIcs.ISAAC.2023.8,
  author =	{Ard\'{e}vol Mart{\'\i}nez, Virginia and Rizzi, Romeo and Sikora, Florian and Vialette, St\'{e}phane},
  title =	{{Recognizing Unit Multiple Intervals Is Hard}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{8:1--8:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.8},
  URN =		{urn:nbn:de:0030-drops-193102},
  doi =		{10.4230/LIPIcs.ISAAC.2023.8},
  annote =	{Keywords: Interval graphs, unit multiple interval graphs, recognition, NP-hardness}
}
Document
Discovering Predictive Dependencies on Multi-Temporal Relations

Authors: Beatrice Amico, Carlo Combi, Romeo Rizzi, and Pietro Sala

Published in: LIPIcs, Volume 278, 30th International Symposium on Temporal Representation and Reasoning (TIME 2023)


Abstract
In this paper, we propose a methodology for deriving a new kind of approximate temporal functional dependency, called Approximate Predictive Functional Dependencies (APFDs), based on a three-window framework and on a multi-temporal relational model. Different features are proposed for the Observation Window (OW), where we observe predictive data, for the Waiting Window (WW), and for the Prediction Window (PW), where the predicted event occurs. We then discuss the concept of approximation for such APFDs and introduce two new error measures. We prove that the problem of deriving APFDs is intractable. Moreover, we discuss some preliminary results in deriving APFDs from real clinical data using the MIMIC-III dataset, related to patients from Intensive Care Units.

Cite as

Beatrice Amico, Carlo Combi, Romeo Rizzi, and Pietro Sala. Discovering Predictive Dependencies on Multi-Temporal Relations. In 30th International Symposium on Temporal Representation and Reasoning (TIME 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 278, pp. 4:1-4:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{amico_et_al:LIPIcs.TIME.2023.4,
  author =	{Amico, Beatrice and Combi, Carlo and Rizzi, Romeo and Sala, Pietro},
  title =	{{Discovering Predictive Dependencies on Multi-Temporal Relations}},
  booktitle =	{30th International Symposium on Temporal Representation and Reasoning (TIME 2023)},
  pages =	{4:1--4:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-298-3},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{278},
  editor =	{Artikis, Alexander and Bruse, Florian and Hunsberger, Luke},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TIME.2023.4},
  URN =		{urn:nbn:de:0030-drops-190945},
  doi =		{10.4230/LIPIcs.TIME.2023.4},
  annote =	{Keywords: temporal databases, temporal data mining, functional dependencies}
}
Document
Cut Paths and Their Remainder Structure, with Applications

Authors: Massimo Cairo, Shahbaz Khan, Romeo Rizzi, Sebastian Schmidt, Alexandru I. Tomescu, and Elia C. Zirondelli

Published in: LIPIcs, Volume 254, 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)


Abstract
In a strongly connected graph G = (V,E), a cut arc (also called strong bridge) is an arc e ∈ E whose removal makes the graph no longer strongly connected. Equivalently, there exist u,v ∈ V such that all u-v walks contain e. Cut arcs are a fundamental graph-theoretic notion, with countless applications, especially in reachability problems. In this paper, we initiate the study of cut paths as a generalisation of cut arcs, which we naturally define as those paths P for which there exist u,v ∈ V such that all u-v walks contain P as a subwalk. We first prove various properties of cut paths and define their remainder structures, which we use to present a simple O(m)-time verification algorithm for a cut path (|V| = n, |E| = m). Secondly, we apply cut paths and their remainder structures to improve several reachability problems from bioinformatics, as follows. A walk is called safe if it is a subwalk of every node-covering closed walk of a strongly connected graph. Multi-safety is defined analogously, by considering node-covering sets of closed walks instead. We show that cut paths provide simple O(m)-time algorithms for verifying whether a walk is safe or multi-safe. For multi-safety, we present the first linear-time algorithm, while for safety, we present a simple algorithm where the state-of-the-art employed complex data structures. Finally, we show that the simultaneous computation of remainder structures of all subwalks of a cut path can be performed in linear time, since they are related in a structured way. These properties yield an O(mn)-time algorithm outputting all maximal multi-safe walks, improving over the state-of-the-art algorithm running in time O(m²+n³). The results of this paper only scratch the surface in the study of cut paths, and we believe that a rich structure of a graph can be revealed by considering the perspective of a path instead of just an arc.
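For intuition, a cut arc can be recognized naively, straight from the definition quoted above: remove the arc and test whether the graph remains strongly connected, for an O(m(n+m))-time baseline. The sketch below is only this definitional baseline (function names and the edge-list encoding are our own), far from the linear-time machinery developed in the paper.

from collections import defaultdict, deque

def strongly_connected(nodes, arcs):
    """Check strong connectivity with one forward and one backward BFS
    from an arbitrary start node."""
    succ, pred = defaultdict(list), defaultdict(list)
    for u, v in arcs:
        succ[u].append(v)
        pred[v].append(u)

    def reachable(start, nxt):
        seen, queue = {start}, deque([start])
        while queue:
            u = queue.popleft()
            for v in nxt[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        return seen

    start = next(iter(nodes))
    return reachable(start, succ) == set(nodes) == reachable(start, pred)

def cut_arcs(nodes, arcs):
    """Return every arc whose removal destroys strong connectivity."""
    return [e for e in arcs
            if not strongly_connected(nodes, [a for a in arcs if a != e])]

# Example: a directed 4-cycle plus the chord (1, 3); every cycle arc is a
# cut arc, while the chord is not.
print(cut_arcs({1, 2, 3, 4}, [(1, 2), (2, 3), (3, 4), (4, 1), (1, 3)]))
# [(1, 2), (2, 3), (3, 4), (4, 1)]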

Cite as

Massimo Cairo, Shahbaz Khan, Romeo Rizzi, Sebastian Schmidt, Alexandru I. Tomescu, and Elia C. Zirondelli. Cut Paths and Their Remainder Structure, with Applications. In 40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 254, pp. 17:1-17:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{cairo_et_al:LIPIcs.STACS.2023.17,
  author =	{Cairo, Massimo and Khan, Shahbaz and Rizzi, Romeo and Schmidt, Sebastian and Tomescu, Alexandru I. and Zirondelli, Elia C.},
  title =	{{Cut Paths and Their Remainder Structure, with Applications}},
  booktitle =	{40th International Symposium on Theoretical Aspects of Computer Science (STACS 2023)},
  pages =	{17:1--17:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-266-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{254},
  editor =	{Berenbrink, Petra and Bouyer, Patricia and Dawar, Anuj and Kant\'{e}, Mamadou Moustapha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2023.17},
  URN =		{urn:nbn:de:0030-drops-176690},
  doi =		{10.4230/LIPIcs.STACS.2023.17},
  annote =	{Keywords: reachability, cut arc, strong bridge, covering walk, safety, persistence, essentiality, genome assembly}
}
Document
Width Helps and Hinders Splitting Flows

Authors: Manuel Cáceres, Massimo Cairo, Andreas Grigorjew, Shahbaz Khan, Brendan Mumey, Romeo Rizzi, Alexandru I. Tomescu, and Lucia Williams

Published in: LIPIcs, Volume 244, 30th Annual European Symposium on Algorithms (ESA 2022)


Abstract
Minimum flow decomposition (MFD) is the NP-hard problem of finding a smallest decomposition of a network flow X on directed graph G into weighted source-to-sink paths whose superposition equals X. We focus on a common formulation of the problem where the path weights must be non-negative integers and also on a new variant where these weights can be negative. We show that, for acyclic graphs, considering the width of the graph (the minimum number of s-t paths needed to cover all of its edges) yields advances in our understanding of its approximability. For the non-negative version, we show that a popular heuristic is a O(log |X|)-approximation (|X| being the total flow of X) on graphs satisfying two properties related to the width (satisfied by e.g., series-parallel graphs), and strengthen its worst-case approximation ratio from Ω(√m) to Ω(m / log m) for sparse graphs, where m is the number of edges in the graph. For the negative version, we give a (⌈log ║X║⌉+1)-approximation (║X║ being the maximum absolute value of X on any edge) using a power-of-two approach, combined with parity fixing arguments and a decomposition of unitary flows (║X║ ≤ 1) into at most width paths. We also disprove a conjecture about the linear independence of minimum (non-negative) flow decompositions posed by Kloster et al. [ALENEX 2018], but show that its useful implication (polynomial-time assignments of weights to a given set of paths to decompose a flow) holds for the negative version.
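Both the non-negative and the negative variant share the same superposition requirement, and checking a candidate decomposition against it is immediate. The sketch below verifies that requirement (the edge-dictionary encoding and the function name are illustrative assumptions, not taken from the paper).

from collections import Counter

def is_flow_decomposition(flow, weighted_paths):
    """Check whether the weighted paths superpose exactly to the flow.
    flow: {(u, v): value}; weighted_paths: list of (path, weight) pairs,
    where weights may be negative in the negative version of the problem."""
    superposition = Counter()
    for path, w in weighted_paths:
        for u, v in zip(path, path[1:]):
            superposition[(u, v)] += w
    nonzero = lambda d: {e: x for e, x in d.items() if x != 0}
    return nonzero(superposition) == nonzero(flow)

# Example: a flow on a diamond decomposed into two weighted paths.
x = {("s", "a"): 2, ("s", "b"): 1, ("a", "t"): 2, ("b", "t"): 1}
print(is_flow_decomposition(x, [(["s", "a", "t"], 2), (["s", "b", "t"], 1)]))   # True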

Cite as

Manuel Cáceres, Massimo Cairo, Andreas Grigorjew, Shahbaz Khan, Brendan Mumey, Romeo Rizzi, Alexandru I. Tomescu, and Lucia Williams. Width Helps and Hinders Splitting Flows. In 30th Annual European Symposium on Algorithms (ESA 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 244, pp. 31:1-31:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{caceres_et_al:LIPIcs.ESA.2022.31,
  author =	{C\'{a}ceres, Manuel and Cairo, Massimo and Grigorjew, Andreas and Khan, Shahbaz and Mumey, Brendan and Rizzi, Romeo and Tomescu, Alexandru I. and Williams, Lucia},
  title =	{{Width Helps and Hinders Splitting Flows}},
  booktitle =	{30th Annual European Symposium on Algorithms (ESA 2022)},
  pages =	{31:1--31:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-247-1},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{244},
  editor =	{Chechik, Shiri and Navarro, Gonzalo and Rotenberg, Eva and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2022.31},
  URN =		{urn:nbn:de:0030-drops-169695},
  doi =		{10.4230/LIPIcs.ESA.2022.31},
  annote =	{Keywords: Flow decomposition, approximation algorithms, graph width}
}
Document
Track A: Algorithms, Complexity and Games
Genome Assembly, from Practice to Theory: Safe, Complete and Linear-Time

Authors: Massimo Cairo, Romeo Rizzi, Alexandru I. Tomescu, and Elia C. Zirondelli

Published in: LIPIcs, Volume 198, 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)


Abstract
Genome assembly asks to reconstruct an unknown string from many shorter substrings of it. Even though it is one of the key problems in Bioinformatics, it generally lacks major theoretical advances. Its hardness stems both from practical issues (size and errors of real data) and from the fact that problem formulations inherently admit multiple solutions. Given these, at their core, most state-of-the-art assemblers are based on finding non-branching paths (unitigs) in an assembly graph. While such paths constitute only partial assemblies, they are likely to be correct. More precisely, if one defines a genome assembly solution as a closed arc-covering walk of the graph, then unitigs appear in all solutions, being thus safe partial solutions. Until recently, it was open to characterize all the safe walks of an assembly graph. Tomescu and Medvedev (RECOMB 2016) characterized all such safe walks (omnitigs), thus giving the first safe and complete genome assembly algorithm. Even though omnitig finding was later improved to quadratic time, it remained open whether the crucial linear-time feature of finding unitigs can be attained with omnitigs. We answer this question affirmatively, by describing a surprising O(m)-time algorithm to identify all maximal omnitigs of a graph with n nodes and m arcs, notwithstanding the existence of families of graphs with Θ(mn) total maximal omnitig size. This is based on the discovery of a family of walks (macrotigs) with the property that all the non-trivial omnitigs are univocal extensions of subwalks of a macrotig. This has two consequences: (1) A linear-time output-sensitive algorithm enumerating all maximal omnitigs. (2) A compact O(m) representation of all maximal omnitigs, which allows, e.g., for O(m)-time computation of various statistics on them. Our results close a long-standing theoretical question inspired by practical genome assemblers, originating with the use of unitigs in 1995. We envision our results to be at the core of a reverse transfer from theory to practical and complete genome assembly programs, as has been the case for other key Bioinformatics problems.
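The unitigs mentioned above are straightforward to extract; the contribution of the paper lies in pushing from unitigs to all safe walks (omnitigs) while keeping linear time. As a baseline only, the following simplified node-centric sketch lists maximal non-branching paths of a directed graph (our own simplification: isolated non-branching cycles are ignored, and the encoding and names are assumptions).

from collections import defaultdict

def unitigs(arcs):
    """Maximal non-branching paths of a directed graph: paths whose internal
    nodes all have in-degree 1 and out-degree 1.  arcs: list of (u, v)."""
    succ, indeg, outdeg, nodes = defaultdict(list), defaultdict(int), defaultdict(int), set()
    for u, v in arcs:
        succ[u].append(v)
        outdeg[u] += 1
        indeg[v] += 1
        nodes |= {u, v}

    def branching(v):
        return not (indeg[v] == 1 and outdeg[v] == 1)

    result = []
    for u in nodes:
        if branching(u):
            for v in succ[u]:
                path = [u, v]
                # Extend through the chain of non-branching nodes.
                while not branching(path[-1]):
                    path.append(succ[path[-1]][0])
                result.append(path)
    return result

# Example: one branching node C splits the graph into three unitigs.
g = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "E"), ("C", "F"), ("F", "G")]
print(sorted(unitigs(g)))
# [['A', 'B', 'C'], ['C', 'D', 'E'], ['C', 'F', 'G']]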

Cite as

Massimo Cairo, Romeo Rizzi, Alexandru I. Tomescu, and Elia C. Zirondelli. Genome Assembly, from Practice to Theory: Safe, Complete and Linear-Time. In 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 198, pp. 43:1-43:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{cairo_et_al:LIPIcs.ICALP.2021.43,
  author =	{Cairo, Massimo and Rizzi, Romeo and Tomescu, Alexandru I. and Zirondelli, Elia C.},
  title =	{{Genome Assembly, from Practice to Theory: Safe, Complete and Linear-Time}},
  booktitle =	{48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)},
  pages =	{43:1--43:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-195-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{198},
  editor =	{Bansal, Nikhil and Merelli, Emanuela and Worrell, James},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2021.43},
  URN =		{urn:nbn:de:0030-drops-141122},
  doi =		{10.4230/LIPIcs.ICALP.2021.43},
  annote =	{Keywords: Graph algorithm, strong connectivity, reachability under failures}
}
Document
Hybrid SAT-Based Consistency Checking Algorithms for Simple Temporal Networks with Decisions

Authors: Matteo Zavatteri, Carlo Combi, Romeo Rizzi, and Luca Viganò

Published in: LIPIcs, Volume 147, 26th International Symposium on Temporal Representation and Reasoning (TIME 2019)


Abstract
A Simple Temporal Network (STN) consists of time points, which model temporal events, and constraints, which model the minimal and maximal temporal distance between them. A Simple Temporal Network with Decisions (STND) extends an STN by adding decision time points to model temporal plans with decisions. A decision time point is a special kind of time point that, once executed, allows for deciding a truth value for an associated Boolean proposition. Furthermore, STNDs label time points and constraints by conjunctions of literals saying for which scenarios (i.e., complete truth value assignments to the propositions) they are relevant. Thus, an STND models a family of STNs, each obtained as the projection of the initial STND onto a scenario. An STND is consistent if there exists a consistent scenario (i.e., a scenario such that the corresponding STN projection is consistent). Recently, a hybrid SAT-based consistency checking algorithm (HSCC) was proposed to check the consistency of an STND. Unfortunately, that approach lacks experimental evaluation and does not allow for the synthesis of all consistent scenarios. In this paper, we propose an incremental HSCC algorithm for STNDs that (i) is faster than the previous one and (ii) allows for the synthesis of all consistent scenarios and related early execution schedules (offline temporal planning). Then, we carry out an experimental evaluation with KAPPA, a tool that we developed for STNDs. Finally, we prove that STNDs and disjunctive temporal networks (DTNs) are equivalent.
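The consistency notion above can be checked, very naively, straight from its definition: enumerate the scenarios, project the STND onto each one, and test the projection for consistency, i.e. for the absence of negative cycles in its distance graph. The exponential-time sketch below illustrates only that semantics (it is not the incremental HSCC algorithm); the distance-graph encoding Y - X ≤ w of constraints, the omission of time-point labels, and the function names are our own simplifying assumptions.

from itertools import product

def stn_consistent(time_points, constraints):
    """An STN with constraints Y - X <= w, given as (X, Y, w) triples, is
    consistent iff its distance graph has no negative cycle; checked with
    Bellman-Ford from a virtual source connected to every time point."""
    dist = {tp: 0 for tp in time_points}
    for _ in range(len(time_points)):
        changed = False
        for x, y, w in constraints:
            if dist[x] + w < dist[y]:
                dist[y] = dist[x] + w
                changed = True
        if not changed:
            return True
    # Still able to relax after |V| full rounds: a negative cycle exists.
    return not any(dist[x] + w < dist[y] for x, y, w in constraints)

def consistent_scenarios(time_points, props, labelled_constraints):
    """Brute-force STND consistency: return every scenario (truth assignment
    to props) whose STN projection is consistent.  labelled_constraints:
    list of (X, Y, w, label), label being {prop: required truth value}."""
    good = []
    for values in product([False, True], repeat=len(props)):
        scenario = dict(zip(props, values))
        projection = [(x, y, w) for x, y, w, lab in labelled_constraints
                      if all(scenario[p] == b for p, b in lab.items())]
        if stn_consistent(time_points, projection):
            good.append(scenario)
    return good

# Example: under p, B must follow A by at least 5 but also by at most 3,
# which is contradictory; only the scenario p = False survives.
cons = [("A", "B", 3, {"p": True}), ("B", "A", -5, {"p": True})]
print(consistent_scenarios(["A", "B"], ["p"], cons))   # [{'p': False}]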

Cite as

Matteo Zavatteri, Carlo Combi, Romeo Rizzi, and Luca Viganò. Hybrid SAT-Based Consistency Checking Algorithms for Simple Temporal Networks with Decisions. In 26th International Symposium on Temporal Representation and Reasoning (TIME 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 147, pp. 16:1-16:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{zavatteri_et_al:LIPIcs.TIME.2019.16,
  author =	{Zavatteri, Matteo and Combi, Carlo and Rizzi, Romeo and Vigan\`{o}, Luca},
  title =	{{Hybrid SAT-Based Consistency Checking Algorithms for Simple Temporal Networks with Decisions}},
  booktitle =	{26th International Symposium on Temporal Representation and Reasoning (TIME 2019)},
  pages =	{16:1--16:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-127-6},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{147},
  editor =	{Gamper, Johann and Pinchinat, Sophie and Sciavicco, Guido},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TIME.2019.16},
  URN =		{urn:nbn:de:0030-drops-113748},
  doi =		{10.4230/LIPIcs.TIME.2019.16},
  annote =	{Keywords: Simple temporal network with decisions, HSCC algorithms, incremental SAT-solving, disjunctive temporal network, KAPPA}
}
Document
Faster Dynamic Controllability Checking for Simple Temporal Networks with Uncertainty

Authors: Massimo Cairo, Luke Hunsberger, and Romeo Rizzi

Published in: LIPIcs, Volume 120, 25th International Symposium on Temporal Representation and Reasoning (TIME 2018)


Abstract
Simple Temporal Networks (STNs) are a well-studied model for representing and reasoning about time. An STN comprises a set of real-valued variables called time-points, together with a set of binary constraints, each of the form Y ≤ X + w. The problem of finding a feasible schedule (i.e., an assignment of real numbers to time-points such that all of the constraints are satisfied) is equivalent to the Single-Source Shortest Path problem (SSSP) in the STN graph. Simple Temporal Networks with Uncertainty (STNUs) augment STNs to include contingent links that can be used, for example, to represent actions with uncertain durations. The duration of a contingent link is not controlled by the planner, but is instead controlled by a (possibly adversarial) environment. Each contingent link has the form ⟨A, l, u, C⟩, where 0 < l ≤ u < ∞. Once the planner executes the activation time-point A, the environment must execute the contingent time-point C at some time A + Δ, where Δ ∈ [l, u]. Crucially, the planner does not know the value of Δ in advance, but only discovers it when C executes. An STNU is dynamically controllable (DC) if there is a strategy that the planner can use to execute all of the non-contingent time-points, such that all of the constraints are guaranteed to be satisfied no matter which durations the environment chooses for the contingent links. The strategy can be dynamic in that it can react in real time to the contingent durations it observes. Recently, an upper bound of O(N³) was given for the DC-checking problem for STNUs, where N is the number of time-points. This paper introduces a new algorithm, called the RUL⁻ algorithm, for solving the DC-checking problem for STNUs that improves on the O(N³) bound. The worst-case complexity of the RUL⁻ algorithm is O(MN + K²N + KN log N), where N is the number of time-points, M is the number of constraints, and K is the number of contingent time-points. If M is O(N²), then the complexity reduces to O(N³); however, in sparse graphs the complexity can be much less. For example, if M is O(N log N), and K is O(√N), then the complexity of the RUL⁻ algorithm reduces to O(N² log N). The RUL⁻ algorithm begins by using the Bellman-Ford algorithm to compute a potential function. It then performs at most 2K rounds of computations, interleaving novel applications of Dijkstra's algorithm to (1) generate new edges and (2) update the potential function in response to those new edges. The constraint-propagation/edge-generation rules used by the RUL⁻ algorithm are distinguished from related work in two ways. First, they only generate unlabeled edges. Second, their applicability conditions are more restrictive. As a result, the RUL⁻ algorithm requires only O(K) rounds of Dijkstra's algorithm, instead of the O(N) rounds required by other approaches. The paper proves that the RUL⁻ algorithm is sound and complete for the DC-checking problem for STNUs.
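The interplay between the Bellman-Ford potential function and the subsequent rounds of Dijkstra mentioned above is an instance of the standard reweighting idea: once a potential h with h(u) + w - h(v) ≥ 0 on every edge is available, shortest paths can be recomputed with Dijkstra even in the presence of negative weights. The sketch below shows only this generic building block (the graph encoding and function names are our own assumptions; it is not an implementation of the RUL⁻ propagation rules).

import heapq
from math import inf

def bellman_ford_potential(nodes, edges):
    """Potential h with h[u] + w - h[v] >= 0 for every edge (u, v, w), computed
    as shortest distances from a virtual source; None if a negative cycle exists."""
    h = {v: 0 for v in nodes}
    for _ in range(len(nodes)):
        changed = False
        for u, v, w in edges:
            if h[u] + w < h[v]:
                h[v] = h[u] + w
                changed = True
        if not changed:
            return h
    return None   # still relaxing after |V| rounds: negative cycle

def dijkstra_with_potential(nodes, edges, h, source):
    """Single-source shortest paths with possibly negative weights, made
    non-negative edge by edge through the potential h (Johnson's trick)."""
    succ = {v: [] for v in nodes}
    for u, v, w in edges:
        succ[u].append((v, w + h[u] - h[v]))   # reweighted, always >= 0
    dist = {v: inf for v in nodes}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue
        for v, w in succ[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    # Undo the reweighting to recover the true distances.
    return {v: dist[v] - h[source] + h[v] if dist[v] < inf else inf for v in nodes}

# Example with a negative (but non-cyclic) edge.
V = ["A", "B", "C"]
E = [("A", "B", 2), ("B", "C", -3), ("A", "C", 1)]
print(dijkstra_with_potential(V, E, bellman_ford_potential(V, E), "A"))
# {'A': 0, 'B': 2, 'C': -1}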

Cite as

Massimo Cairo, Luke Hunsberger, and Romeo Rizzi. Faster Dynamic Controllability Checking for Simple Temporal Networks with Uncertainty. In 25th International Symposium on Temporal Representation and Reasoning (TIME 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 120, pp. 8:1-8:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{cairo_et_al:LIPIcs.TIME.2018.8,
  author =	{Cairo, Massimo and Hunsberger, Luke and Rizzi, Romeo},
  title =	{{Faster Dynamic Controllability Checking for Simple Temporal Networks with Uncertainty}},
  booktitle =	{25th International Symposium on Temporal Representation and Reasoning (TIME 2018)},
  pages =	{8:1--8:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-089-7},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{120},
  editor =	{Alechina, Natasha and N{\o}rv\r{a}g, Kjetil and Penczek, Wojciech},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TIME.2018.8},
  URN =		{urn:nbn:de:0030-drops-97734},
  doi =		{10.4230/LIPIcs.TIME.2018.8},
  annote =	{Keywords: Simple Temporal Networks with Uncertainty, Dynamic Controllability, Temporal Planning under Uncertainty}
}
Document
On Restricted Disjunctive Temporal Problems: Faster Algorithms and Tractability Frontier

Authors: Carlo Comin and Romeo Rizzi

Published in: LIPIcs, Volume 120, 25th International Symposium on Temporal Representation and Reasoning (TIME 2018)


Abstract
In 2005, T.K.S. Kumar studied the Restricted Disjunctive Temporal Problem (RDTP), a restricted but very expressive class of Disjunctive Temporal Problems (DTPs). An RDTP comes with a finite set of temporal variables and a finite set of temporal constraints, each of which is of one of the following three types: (t_1) a two-variable linear-difference simple constraint; (t_2) a single-variable disjunction of many interval constraints; (t_3) a two-variable disjunction of two interval constraints only. Kumar showed that RDTPs are solvable in deterministic strongly polynomial time by reducing them to the Connected Row-Convex (CRC) constraint satisfaction problem, also devising a faster randomized algorithm. In contrast, the most general form of DTPs allows for multi-variable disjunctions of many interval constraints and is NP-complete. This work offers a deeper comprehension of the tractability of RDTPs, leading to an elementary deterministic strongly polynomial time algorithm for them, significantly improving the asymptotic running times of all the previous deterministic and randomized solutions. The result is obtained by reducing RDTPs to the Single-Source Shortest Paths (SSSP) and 2-SAT problems (jointly), instead of reducing to CRCs. In passing, we obtain a faster (quadratic time) algorithm for RDTPs having only {t_1, t_2}-constraints and no t_3-constraint. As a second main contribution, we study the tractability frontier of solving RDTPs blended with Hyper Temporal Networks (HyTNs), a disjunctive strict generalization of Simple Temporal Networks (STNs) based on hypergraphs: we prove that solving temporal problems having only t_2-constraints and either only multi-tail or only multi-head hyperarc-constraints lies in NP ∩ co-NP and admits deterministic pseudo-polynomial time algorithms; on the other hand, problems having only t_3-constraints and either only multi-tail or only multi-head hyperarc-constraints turn out to be strongly NP-complete.
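One of the two ingredients of the new reduction, 2-SAT, is itself solvable in linear time through the implication graph and its strongly connected components. The self-contained sketch below is the textbook construction (shown only to make the "disjunction of two constraints" flavour of t_3 concrete; it is not the paper's reduction, and the function names are our own).

from collections import defaultdict

def solve_2sat(n, clauses):
    """Solve 2-SAT over variables 1..n.  clauses: pairs of literals (a, b)
    meaning (a OR b), a literal being +i or -i.  Returns a satisfying
    assignment {i: bool}, or None if unsatisfiable."""
    node = lambda lit: 2 * (abs(lit) - 1) + (0 if lit > 0 else 1)
    neg = lambda v: v ^ 1
    N = 2 * n
    adj, radj = defaultdict(list), defaultdict(list)
    for a, b in clauses:
        # (a or b) is equivalent to (not a -> b) and (not b -> a).
        for u, v in ((neg(node(a)), node(b)), (neg(node(b)), node(a))):
            adj[u].append(v)
            radj[v].append(u)

    # Kosaraju, first pass: record vertices by increasing finish time.
    seen, order = [False] * N, []
    for s in range(N):
        if seen[s]:
            continue
        seen[s] = True
        stack = [(s, iter(adj[s]))]
        while stack:
            u, it = stack[-1]
            for v in it:
                if not seen[v]:
                    seen[v] = True
                    stack.append((v, iter(adj[v])))
                    break
            else:
                order.append(u)
                stack.pop()

    # Second pass on the reversed graph: components are numbered in
    # topological order of the condensation of the implication graph.
    comp, c = [-1] * N, 0
    for s in reversed(order):
        if comp[s] != -1:
            continue
        comp[s], stack = c, [s]
        while stack:
            u = stack.pop()
            for v in radj[u]:
                if comp[v] == -1:
                    comp[v] = c
                    stack.append(v)
        c += 1

    assignment = {}
    for i in range(1, n + 1):
        if comp[node(i)] == comp[node(-i)]:
            return None   # x and not-x are equivalent: unsatisfiable
        # Make true the literal whose component comes later topologically.
        assignment[i] = comp[node(i)] > comp[node(-i)]
    return assignment

# Example: (x1 or x2), (not x1 or x2), (not x2 or x3) force x2 and x3 to True.
print(solve_2sat(3, [(1, 2), (-1, 2), (-2, 3)]))
# {1: True, 2: True, 3: True}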

Cite as

Carlo Comin and Romeo Rizzi. On Restricted Disjunctive Temporal Problems: Faster Algorithms and Tractability Frontier. In 25th International Symposium on Temporal Representation and Reasoning (TIME 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 120, pp. 10:1-10:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{comin_et_al:LIPIcs.TIME.2018.10,
  author =	{Comin, Carlo and Rizzi, Romeo},
  title =	{{On Restricted Disjunctive Temporal Problems: Faster Algorithms and Tractability Frontier}},
  booktitle =	{25th International Symposium on Temporal Representation and Reasoning (TIME 2018)},
  pages =	{10:1--10:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-089-7},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{120},
  editor =	{Alechina, Natasha and N{\o}rv\r{a}g, Kjetil and Penczek, Wojciech},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TIME.2018.10},
  URN =		{urn:nbn:de:0030-drops-97751},
  doi =		{10.4230/LIPIcs.TIME.2018.10},
  annote =	{Keywords: Restricted Disjunctive Temporal Problems, Simple Temporal Networks, Hyper Temporal Networks, Consistency Checking, Single-Source Shortest-Paths, 2-SAT}
}
Document
Listing Subgraphs by Cartesian Decomposition

Authors: Alessio Conte, Roberto Grossi, Andrea Marino, Romeo Rizzi, and Luca Versari

Published in: LIPIcs, Volume 117, 43rd International Symposium on Mathematical Foundations of Computer Science (MFCS 2018)


Abstract
We investigate a decomposition technique for listing problems in graphs and set systems. It is based on the Cartesian product of some iterators, which list the solutions of simpler problems. Our ideas apply to several problems, and we illustrate one of them in depth, namely, listing all minimum spanning trees of a weighted graph G. Here, iterators over the spanning trees of unweighted graphs can be obtained by a suitable modification of the listing algorithm by [Shioura et al., SICOMP 1997], and the decomposition of G is obtained by suitably partitioning its edges according to their weights. By combining these iterators in a Cartesian product scheme that employs Gray coding, we give the first algorithm which lists all minimum spanning trees of G with constant delay, where the delay is the time elapsed between any two consecutive outputs. Our solution requires polynomial preprocessing time and uses polynomial space.
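The combination step can be pictured with a reflected mixed-radix Gray code: consecutive tuples of a Cartesian product differ in a single coordinate, so moving from one combined solution to the next touches only one of the underlying iterators. The sketch below shows that generic ordering over plain Python lists (an illustration of the scheme only; it is not the paper's minimum-spanning-tree iterators, and the function name is ours).

def gray_product(*lists):
    """Enumerate the Cartesian product of non-empty lists so that any two
    consecutive tuples differ in exactly one coordinate (reflected
    mixed-radix Gray order)."""
    if not lists:
        yield ()
        return
    head, rest = lists[0], lists[1:]
    forward = True
    for tail in gray_product(*rest):
        indices = range(len(head)) if forward else range(len(head) - 1, -1, -1)
        for i in indices:
            yield (head[i],) + tail
        forward = not forward   # sweep the first coordinate back and forth

# Example: 2 x 3 tuples, each differing from the previous one in one place.
print(list(gray_product(["a", "b"], [1, 2, 3])))
# [('a', 1), ('b', 1), ('b', 2), ('a', 2), ('a', 3), ('b', 3)]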

Cite as

Alessio Conte, Roberto Grossi, Andrea Marino, Romeo Rizzi, and Luca Versari. Listing Subgraphs by Cartesian Decomposition. In 43rd International Symposium on Mathematical Foundations of Computer Science (MFCS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 117, pp. 84:1-84:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{conte_et_al:LIPIcs.MFCS.2018.84,
  author =	{Conte, Alessio and Grossi, Roberto and Marino, Andrea and Rizzi, Romeo and Versari, Luca},
  title =	{{Listing Subgraphs by Cartesian Decomposition}},
  booktitle =	{43rd International Symposium on Mathematical Foundations of Computer Science (MFCS 2018)},
  pages =	{84:1--84:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-086-6},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{117},
  editor =	{Potapov, Igor and Spirakis, Paul and Worrell, James},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2018.84},
  URN =		{urn:nbn:de:0030-drops-96666},
  doi =		{10.4230/LIPIcs.MFCS.2018.84},
  annote =	{Keywords: Graph algorithms, listing, minimum spanning trees, constant delay}
}
Document
Dynamic Controllability Made Simple

Authors: Massimo Cairo and Romeo Rizzi

Published in: LIPIcs, Volume 90, 24th International Symposium on Temporal Representation and Reasoning (TIME 2017)


Abstract
Simple Temporal Networks with Uncertainty (STNUs) are a well-studied model for representing temporal constraints, where some intervals (contingent links) have an unknown but bounded duration, discovered only during execution. An STNU is dynamically controllable (DC) if there exists a strategy to execute its time-points satisfying all the constraints, regardless of the actual duration of contingent links revealed during execution. In this work, we present a new system of constraint propagation rules for STNUs, which is sound-and-complete for DC checking. Our system comprises just three rules which, unlike the ones proposed in all previous works, only generate unconditioned constraints. In particular, after applying our sound rules, the network remains an STNU in all respects. Moreover, our completeness proof is short and non-algorithmic, based on the explicit construction of a valid execution strategy. This is a substantial simplification of the theory which underlies all the polynomial-time algorithms for DC-checking. Our analysis also shows: (1) the existence of late execution strategies for STNUs, (2) the equivalence of several variants of the notion of DC, and (3) the existence of a fast algorithm for real-time execution of STNUs, which runs in O(KN) total time in a network with K contingent links and N time points, considerably improving the previous O(N³)-time bound.

Cite as

Massimo Cairo and Romeo Rizzi. Dynamic Controllability Made Simple. In 24th International Symposium on Temporal Representation and Reasoning (TIME 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 90, pp. 8:1-8:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{cairo_et_al:LIPIcs.TIME.2017.8,
  author =	{Cairo, Massimo and Rizzi, Romeo},
  title =	{{Dynamic Controllability Made Simple}},
  booktitle =	{24th International Symposium on Temporal Representation and Reasoning (TIME 2017)},
  pages =	{8:1--8:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-052-1},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{90},
  editor =	{Schewe, Sven and Schneider, Thomas and Wijsen, Jef},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TIME.2017.8},
  URN =		{urn:nbn:de:0030-drops-79136},
  doi =		{10.4230/LIPIcs.TIME.2017.8},
  annote =	{Keywords: Simple Temporal Network with Uncertainty, Dynamic controllability}
}
Document
Incorporating Decision Nodes into Conditional Simple Temporal Networks

Authors: Massimo Cairo, Carlo Combi, Carlo Comin, Luke Hunsberger, Roberto Posenato, Romeo Rizzi, and Matteo Zavatteri

Published in: LIPIcs, Volume 90, 24th International Symposium on Temporal Representation and Reasoning (TIME 2017)


Abstract
A Conditional Simple Temporal Network (CSTN) augments a Simple Temporal Network (STN) to include special time-points, called observation time-points. In a CSTN, the agent executing the network controls the execution of every time-point. However, each observation time-point has a unique propositional letter associated with it and, when the agent executes that time-point, the environment assigns a truth value to the corresponding letter. Thus, the agent observes, but does not control, the assignment of truth values. A CSTN is dynamically consistent (DC) if there exists a strategy for executing its time-points such that all relevant constraints will be satisfied no matter which truth values the environment assigns to the propositional letters. Alternatively, in a Labeled Simple Temporal Network (Labeled STN) - also called a Temporal Plan with Choice - the agent executing the network controls the assignment of values to the so-called choice variables. Furthermore, the agent can make those assignments at any time. For this reason, a Labeled STN is equivalent to a Disjunctive Temporal Network. This paper incorporates both of the above extensions by augmenting a CSTN to include not only observation time-points but also decision time-points. A decision time-point is like an observation time-point in that it has an associated propositional letter whose value is determined when the decision time-point is executed. It differs in that the agent - not the environment - selects that value. The resulting network is called a CSTN with Decisions (CSTND). This paper shows that a CSTND generalizes both CSTNs and Labeled STNs, and proves that the problem of determining whether any given CSTND is dynamically consistent is PSPACE-complete. It also presents algorithms that address two sub-classes of CSTNDs: (1) those that contain only decision time-points; and (2) those in which all decisions are made before execution begins.

Cite as

Massimo Cairo, Carlo Combi, Carlo Comin, Luke Hunsberger, Roberto Posenato, Romeo Rizzi, and Matteo Zavatteri. Incorporating Decision Nodes into Conditional Simple Temporal Networks. In 24th International Symposium on Temporal Representation and Reasoning (TIME 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 90, pp. 9:1-9:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{cairo_et_al:LIPIcs.TIME.2017.9,
  author =	{Cairo, Massimo and Combi, Carlo and Comin, Carlo and Hunsberger, Luke and Posenato, Roberto and Rizzi, Romeo and Zavatteri, Matteo},
  title =	{{Incorporating Decision Nodes into Conditional Simple Temporal Networks}},
  booktitle =	{24th International Symposium on Temporal Representation and Reasoning (TIME 2017)},
  pages =	{9:1--9:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-052-1},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{90},
  editor =	{Schewe, Sven and Schneider, Thomas and Wijsen, Jef},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TIME.2017.9},
  URN =		{urn:nbn:de:0030-drops-79155},
  doi =		{10.4230/LIPIcs.TIME.2017.9},
  annote =	{Keywords: Conditional Simple Temporal Networks with Decisions, Dynamic Consistency, SAT Solver, Hyper Temporal Networks, PSPACE}
}
Document
A Streamlined Model of Conditional Simple Temporal Networks - Semantics and Equivalence Results

Authors: Massimo Cairo, Luke Hunsberger, Roberto Posenato, and Romeo Rizzi

Published in: LIPIcs, Volume 90, 24th International Symposium on Temporal Representation and Reasoning (TIME 2017)


Abstract
A Conditional Simple Temporal Network (CSTN) augments a Simple Temporal Network to include a new kind of time-points, called observation time-points. The execution of an observation time-point generates information in real time, specifically, the truth value of a propositional letter. In addition, time-points and temporal constraints may be labeled by conjunctions of (positive or negative) propositional letters. A CSTN is called dynamically consistent (DC) if there exists a dynamic strategy for executing its time-points such that no matter how the observations turn out during execution, the time-points whose labels are consistent with those observations have all been executed, and the constraints whose labels are consistent with those observations have all been satisfied. The strategy is dynamic in that its execution decisions may react to observations. The original formulation of CSTNs included propositional labels only on time-points, but the DC-checking algorithm was impractical because it was based on a conversion of the semantic constraints into an exponentially-sized Disjunctive Temporal Network. Later work added propositional labels to temporal constraints, and yielded a sound-and-complete propagation-based DC-checking algorithm, empirically demonstrated to be practical across a variety of CSTNs. This paper introduces a streamlined version of a CSTN in which propositional labels may appear on constraints, but not on time-points. This change simplifies the definition of the DC property, as well as the propagation rules for the DC-checking algorithm. It also simplifies the proofs of the soundness and completeness of those rules. This paper provides two translations from traditional CSTNs to streamlined CSTNs. Each translation preserves the DC property and, for any DC network, ensures that any dynamic execution strategy for that network can be extended to a strategy for its streamlined counterpart. Finally, this paper presents an empirical comparison of two versions of the DC-checking algorithm: the original version and a simplified version for streamlined CSTNs. The comparison is based on CSTN benchmarks from earlier work. For small-sized CSTNs, the original version shows the best performance, but the performance difference between the two versions decreases as the number of time-points in the CSTN increases. We conclude that the simplified algorithm is a practical alternative for checking the dynamic consistency of CSTNs.

Cite as

Massimo Cairo, Luke Hunsberger, Roberto Posenato, and Romeo Rizzi. A Streamlined Model of Conditional Simple Temporal Networks - Semantics and Equivalence Results. In 24th International Symposium on Temporal Representation and Reasoning (TIME 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 90, pp. 10:1-10:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{cairo_et_al:LIPIcs.TIME.2017.10,
  author =	{Cairo, Massimo and Hunsberger, Luke and Posenato, Roberto and Rizzi, Romeo},
  title =	{{A Streamlined Model of Conditional Simple Temporal Networks - Semantics and Equivalence Results}},
  booktitle =	{24th International Symposium on Temporal Representation and Reasoning (TIME 2017)},
  pages =	{10:1--10:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-052-1},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{90},
  editor =	{Schewe, Sven and Schneider, Thomas and Wijsen, Jef},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.TIME.2017.10},
  URN =		{urn:nbn:de:0030-drops-79145},
  doi =		{10.4230/LIPIcs.TIME.2017.10},
  annote =	{Keywords: Conditional Simple Temporal Networks, Dynamic Consistency, Temporal Constraints}
}
Document
Optimal Omnitig Listing for Safe and Complete Contig Assembly

Authors: Massimo Cairo, Paul Medvedev, Nidia Obscura Acosta, Romeo Rizzi, and Alexandru I. Tomescu

Published in: LIPIcs, Volume 78, 28th Annual Symposium on Combinatorial Pattern Matching (CPM 2017)


Abstract
Genome assembly is the problem of reconstructing a genome sequence from a set of reads from a sequencing experiment. Typical formulations of the assembly problem admit in practice many genomic reconstructions, and actual genome assemblers usually output contigs, namely substrings that are promised to occur in the genome. To bridge theory and practice, Tomescu and Medvedev [RECOMB 2016] reformulated contig assembly as finding all substrings common to all genomic reconstructions. They also gave a characterization of those walks (omnitigs) that are common to all closed edge-covering walks of a (directed) graph, a typical notion of genomic reconstruction. An algorithm for listing all maximal omnitigs was also proposed, by launching an exhaustive visit from every edge. In this paper, we prove new insights about the structure of omnitigs and solve several open questions about them. We combine these to achieve an O(nm)-time algorithm for outputting all the maximal omnitigs of a graph (with n nodes and m edges). This is also optimal, as we show families of graphs whose total omnitig length is Ω(nm). We implement this algorithm and show that it is 9-12 times faster in practice than the one of Tomescu and Medvedev [RECOMB 2016].

Cite as

Massimo Cairo, Paul Medvedev, Nidia Obscura Acosta, Romeo Rizzi, and Alexandru I. Tomescu. Optimal Omnitig Listing for Safe and Complete Contig Assembly. In 28th Annual Symposium on Combinatorial Pattern Matching (CPM 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 78, pp. 29:1-29:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{cairo_et_al:LIPIcs.CPM.2017.29,
  author =	{Cairo, Massimo and Medvedev, Paul and Obscura Acosta, Nidia and Rizzi, Romeo and Tomescu, Alexandru I.},
  title =	{{Optimal Omnitig Listing for Safe and Complete Contig Assembly}},
  booktitle =	{28th Annual Symposium on Combinatorial Pattern Matching (CPM 2017)},
  pages =	{29:1--29:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-039-2},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{78},
  editor =	{K\"{a}rkk\"{a}inen, Juha and Radoszewski, Jakub and Rytter, Wojciech},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CPM.2017.29},
  URN =		{urn:nbn:de:0030-drops-73423},
  doi =		{10.4230/LIPIcs.CPM.2017.29},
  annote =	{Keywords: genome assembly, graph algorithm, edge-covering walk, strong bridge}
}