12 Search Results for "Xu, Chang"


Document
Using Markov’s Inequality with Power-Of-k Function for Probabilistic WCET Estimation

Authors: Sergi Vilardell, Isabel Serra, Enrico Mezzetti, Jaume Abella, Francisco J. Cazorla, and Joan del Castillo

Published in: LIPIcs, Volume 231, 34th Euromicro Conference on Real-Time Systems (ECRTS 2022)


Abstract
Deriving WCET estimates for software programs with probabilistic means (a.k.a. pWCET estimation) has received significant attention in recent years as a way to deal with the increased complexity of the processors used in real-time systems. Many works build on Extreme Value Theory (EVT) that is fed with a sample of the collected data (execution times). In its application, EVT carries two sources of uncertainty: the first is intrinsic to the EVT model and relates to determining the subset of the sample that belongs to the (upper) tail and hence is actually used by EVT for prediction; the second is induced by the sampling process and hence is inherent to all sample-based methods. In this work, we show that Markov’s inequality can be used to obtain provably trustworthy probabilistic bounds on the tail of a distribution without incurring any model-intrinsic uncertainty. Yet, it produces pessimistic estimates, which we shave substantially by proposing the use of a power-of-k function instead of the default identity function used by Markov’s inequality. Lastly, we propose a method to deal with sampling uncertainty for Markov’s inequality that consistently improves EVT estimates on synthetic and real data obtained from a railway application.
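
The enabling fact is elementary: for a non-negative execution time X, a threshold a > 0 and any k ≥ 1, P(X ≥ a) = P(X^k ≥ a^k) ≤ E[X^k]/a^k, and one is free to pick the power k that yields the tightest bound at the threshold of interest. The Python sketch below is an illustration only: it plugs in the empirical k-th moment of a measurement sample and selects the best k, whereas the paper additionally accounts for the sampling uncertainty of that moment.

import numpy as np

def power_of_k_markov_bound(times, threshold, ks=range(1, 21)):
    """Upper-bound P(execution time >= threshold) via Markov's inequality
    applied to X**k, using empirical moments (illustration only)."""
    times = np.asarray(times, dtype=float)
    best_k, best_bound = None, 1.0
    for k in ks:
        # P(X >= a) = P(X**k >= a**k) <= E[X**k] / a**k for non-negative X.
        bound = np.mean(times ** k) / threshold ** k
        if bound < best_bound:
            best_k, best_bound = k, bound
    return best_k, best_bound

# Toy usage: synthetic execution times and a candidate pWCET threshold.
rng = np.random.default_rng(0)
sample = 100 + rng.gamma(shape=2.0, scale=5.0, size=10_000)
print(power_of_k_markov_bound(sample, threshold=200.0))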

Cite as

Sergi Vilardell, Isabel Serra, Enrico Mezzetti, Jaume Abella, Francisco J. Cazorla, and Joan del Castillo. Using Markov’s Inequality with Power-Of-k Function for Probabilistic WCET Estimation. In 34th Euromicro Conference on Real-Time Systems (ECRTS 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 231, pp. 20:1-20:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{vilardell_et_al:LIPIcs.ECRTS.2022.20,
  author =	{Vilardell, Sergi and Serra, Isabel and Mezzetti, Enrico and Abella, Jaume and Cazorla, Francisco J. and del Castillo, Joan},
  title =	{{Using Markov’s Inequality with Power-Of-k Function for Probabilistic WCET Estimation}},
  booktitle =	{34th Euromicro Conference on Real-Time Systems (ECRTS 2022)},
  pages =	{20:1--20:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-239-6},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{231},
  editor =	{Maggio, Martina},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECRTS.2022.20},
  URN =		{urn:nbn:de:0030-drops-163377},
  doi =		{10.4230/LIPIcs.ECRTS.2022.20},
  annote =	{Keywords: Markov’s inequality, probabilistic time estimates, probabilistic WCET, Extreme Value Theory}
}
Document
Tackling the Awkward Squad for Reactive Programming: The Actor-Reactor Model

Authors: Sam Van den Vonder, Thierry Renaux, Bjarno Oeyen, Joeri De Koster, and Wolfgang De Meuter

Published in: LIPIcs, Volume 166, 34th European Conference on Object-Oriented Programming (ECOOP 2020)


Abstract
Reactive programming is a programming paradigm whereby programs are internally represented by a dependency graph, which is used to automatically (re)compute parts of a program whenever its input changes. In practice, reactive programming can only be used for some parts of an application: a reactive program is usually embedded in an application that is still written in ordinary imperative languages such as JavaScript or Scala. In this paper we investigate this embedding and distill "the awkward squad for reactive programming" as three concerns that are essential for real-world software development, but that do not fit within reactive programming. They are related to long-lasting computations, side effects, and the coordination between imperative and reactive code. To solve these issues we design a new programming model called the Actor-Reactor Model, in which programs are split up into a number of actors and reactors. Actors and reactors enforce a strict separation of imperative and reactive code, and they can be composed via a number of composition operators that make use of data streams. We demonstrate the model via our own implementation in a language called Stella.
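
The dependency-graph view of reactive programming in the first sentence can be made concrete in a few lines. The sketch below is a minimal push-based dependency graph in plain Python, written only to illustrate that view: it is not the Actor-Reactor Model, not Stella, and it ignores glitch avoidance and the "awkward squad" concerns the paper addresses. Derived nodes are simply recomputed whenever a node they depend on changes.

class Source:
    """A mutable input cell; setting it pushes updates through the graph."""
    def __init__(self, value):
        self.value, self.dependents = value, []

    def set(self, value):
        self.value = value
        for dep in self.dependents:
            dep.recompute()

class Derived:
    """A node recomputed from its inputs whenever any of them changes."""
    def __init__(self, fn, *inputs):
        self.fn, self.inputs, self.dependents = fn, inputs, []
        for node in inputs:
            node.dependents.append(self)
        self.recompute()

    def recompute(self):
        self.value = self.fn(*(node.value for node in self.inputs))
        for dep in self.dependents:
            dep.recompute()

# fahrenheit, and a label derived from it, are recomputed when celsius changes.
celsius = Source(20)
fahrenheit = Derived(lambda c: c * 9 / 5 + 32, celsius)
label = Derived(lambda f: f"{f:.1f} degrees F", fahrenheit)
celsius.set(25)
print(label.value)  # 77.0 degrees F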

Cite as

Sam Van den Vonder, Thierry Renaux, Bjarno Oeyen, Joeri De Koster, and Wolfgang De Meuter. Tackling the Awkward Squad for Reactive Programming: The Actor-Reactor Model. In 34th European Conference on Object-Oriented Programming (ECOOP 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 166, pp. 19:1-19:29, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{vandenvonder_et_al:LIPIcs.ECOOP.2020.19,
  author =	{Van den Vonder, Sam and Renaux, Thierry and Oeyen, Bjarno and De Koster, Joeri and De Meuter, Wolfgang},
  title =	{{Tackling the Awkward Squad for Reactive Programming: The Actor-Reactor Model}},
  booktitle =	{34th European Conference on Object-Oriented Programming (ECOOP 2020)},
  pages =	{19:1--19:29},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-154-2},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{166},
  editor =	{Hirschfeld, Robert and Pape, Tobias},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECOOP.2020.19},
  URN =		{urn:nbn:de:0030-drops-131768},
  doi =		{10.4230/LIPIcs.ECOOP.2020.19},
  annote =	{Keywords: functional reactive programming, reactive programming, reactive streams, actors, reactors}
}
Document
Characteristic Logics for Behavioural Metrics via Fuzzy Lax Extensions

Authors: Paul Wild and Lutz Schröder

Published in: LIPIcs, Volume 171, 31st International Conference on Concurrency Theory (CONCUR 2020)


Abstract
Behavioural distances provide a fine-grained measure of equivalence in systems involving quantitative data, such as probabilistic, fuzzy, or metric systems. Like in the classical setting of crisp bisimulation-type equivalences, the wide variation found in system types creates a need for generic methods that apply to many system types at once. Approaches of this kind are emerging within the paradigm of universal coalgebra, based either on lifting pseudometrics along set functors or on lifting general real-valued (fuzzy) relations along functors by means of fuzzy lax extensions. An immediate benefit of the latter is that they allow bounding behavioural distance by means of fuzzy bisimulations that need not themselves be (pseudo-)metrics, in analogy to classical bisimulations (which need not be equivalence relations). The known instances of generic pseudometric liftings, specifically the generic Kantorovich and Wasserstein liftings, both can be extended to yield fuzzy lax extensions, using the fact that both are effectively given by a choice of quantitative modalities. Our central result then shows that in fact all fuzzy lax extensions are Kantorovich extensions for a suitable set of quantitative modalities, the so-called Moss modalities. For non-expansive fuzzy lax extensions, this allows for the extraction of quantitative modal logics that characterize behavioural distance, i.e. satisfy a quantitative version of the Hennessy-Milner theorem; equivalently, we obtain expressiveness of a quantitative version of Moss' coalgebraic logic.

Cite as

Paul Wild and Lutz Schröder. Characteristic Logics for Behavioural Metrics via Fuzzy Lax Extensions. In 31st International Conference on Concurrency Theory (CONCUR 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 171, pp. 27:1-27:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{wild_et_al:LIPIcs.CONCUR.2020.27,
  author =	{Wild, Paul and Schr\"{o}der, Lutz},
  title =	{{Characteristic Logics for Behavioural Metrics via Fuzzy Lax Extensions}},
  booktitle =	{31st International Conference on Concurrency Theory (CONCUR 2020)},
  pages =	{27:1--27:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-160-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{171},
  editor =	{Konnov, Igor and Kov\'{a}cs, Laura},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CONCUR.2020.27},
  URN =		{urn:nbn:de:0030-drops-128394},
  doi =		{10.4230/LIPIcs.CONCUR.2020.27},
  annote =	{Keywords: Modal logic, behavioural distance, coalgebra, bisimulation, lax extension}
}
Document
Jointly Embedding Multiple Single-Cell Omics Measurements

Authors: Jie Liu, Yuanhao Huang, Ritambhara Singh, Jean-Philippe Vert, and William Stafford Noble

Published in: LIPIcs, Volume 143, 19th International Workshop on Algorithms in Bioinformatics (WABI 2019)


Abstract
Many single-cell sequencing technologies are now available, but it is still difficult to apply multiple sequencing technologies to the same single cell. In this paper, we propose an unsupervised manifold alignment algorithm, MMD-MA, for integrating multiple measurements carried out on disjoint aliquots of a given population of cells. Effectively, MMD-MA performs an in silico co-assay by embedding cells measured in different ways into a learned latent space. In the MMD-MA algorithm, single-cell data points from multiple domains are aligned by optimizing an objective function with three components: (1) a maximum mean discrepancy (MMD) term to encourage the differently measured points to have similar distributions in the latent space, (2) a distortion term to preserve the structure of the data between the input space and the latent space, and (3) a penalty term to avoid collapse to a trivial solution. Notably, MMD-MA does not require any correspondence information across data modalities, either between the cells or between the features. Furthermore, MMD-MA’s weak distributional requirements for the domains to be aligned allow the algorithm to integrate heterogeneous types of single cell measures, such as gene expression, DNA accessibility, chromatin organization, methylation, and imaging data. We demonstrate the utility of MMD-MA in simulation experiments and using a real data set involving single-cell gene expression and methylation data.
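
The three-term structure of the objective can be sketched numerically. The code below is a simplified paraphrase of that structure, assuming a Gaussian-kernel MMD between the two embedded point sets, a pairwise-distance distortion term per domain, and a crude anti-collapse penalty; the paper's actual formulation works with per-domain kernel matrices and a different penalty, so this is an illustration of the ingredients, not MMD-MA itself.

import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd2(E1, E2, sigma=1.0):
    """Squared maximum mean discrepancy between two embedded point sets."""
    return (gaussian_kernel(E1, E1, sigma).mean()
            + gaussian_kernel(E2, E2, sigma).mean()
            - 2 * gaussian_kernel(E1, E2, sigma).mean())

def distortion(X, E):
    """Penalize changes in pairwise distances between input and latent space."""
    dx = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    de = np.linalg.norm(E[:, None] - E[None, :], axis=-1)
    return ((dx - de) ** 2).mean()

def objective(X1, X2, E1, E2, lam1=1.0, lam2=1.0):
    # (1) align the two distributions in the latent space, (2) preserve each
    # domain's structure, (3) crude stand-in penalty against the zero embedding.
    penalty = 1.0 / (1e-8 + np.linalg.norm(E1)) + 1.0 / (1e-8 + np.linalg.norm(E2))
    return (mmd2(E1, E2)
            + lam1 * (distortion(X1, E1) + distortion(X2, E2))
            + lam2 * penalty)

rng = np.random.default_rng(1)
X1, X2 = rng.normal(size=(30, 50)), rng.normal(size=(40, 60))
E1, E2 = rng.normal(size=(30, 2)), rng.normal(size=(40, 2))
print(objective(X1, X2, E1, E2))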

Cite as

Jie Liu, Yuanhao Huang, Ritambhara Singh, Jean-Philippe Vert, and William Stafford Noble. Jointly Embedding Multiple Single-Cell Omics Measurements. In 19th International Workshop on Algorithms in Bioinformatics (WABI 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 143, pp. 10:1-10:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{liu_et_al:LIPIcs.WABI.2019.10,
  author =	{Liu, Jie and Huang, Yuanhao and Singh, Ritambhara and Vert, Jean-Philippe and Noble, William Stafford},
  title =	{{Jointly Embedding Multiple Single-Cell Omics Measurements}},
  booktitle =	{19th International Workshop on Algorithms in Bioinformatics (WABI 2019)},
  pages =	{10:1--10:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-123-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{143},
  editor =	{Huber, Katharina T. and Gusfield, Dan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.WABI.2019.10},
  URN =		{urn:nbn:de:0030-drops-110401},
  doi =		{10.4230/LIPIcs.WABI.2019.10},
  annote =	{Keywords: Manifold alignment, single-cell sequencing}
}
Document
Inferring Diploid 3D Chromatin Structures from Hi-C Data

Authors: Alexandra Gesine Cauer, Gürkan Yardımcı, Jean-Philippe Vert, Nelle Varoquaux, and William Stafford Noble

Published in: LIPIcs, Volume 143, 19th International Workshop on Algorithms in Bioinformatics (WABI 2019)


Abstract
The 3D organization of the genome plays a key role in many cellular processes, such as gene regulation, differentiation, and replication. Assays like Hi-C measure DNA-DNA contacts in a high-throughput fashion, and inferring accurate 3D models of chromosomes can yield insights hidden in the raw data. For example, structural inference can account for noise in the data, disambiguate the distinct structures of homologous chromosomes, orient genomic regions relative to nuclear landmarks, and serve as a framework for integrating other data types. Although many methods exist to infer the 3D structure of haploid genomes, inferring a diploid structure from Hi-C data is still an open problem. Indeed, the diploid case is very challenging, because Hi-C data typically does not distinguish between homologous chromosomes. We propose a method to infer 3D diploid genomes from Hi-C data. We demonstrate the accuracy of the method on simulated data, and we also use the method to infer 3D structures for mouse chromosome X, confirming that the inactive homolog exhibits a bipartite structure, whereas the active homolog does not.

Cite as

Alexandra Gesine Cauer, Gürkan Yardımcı, Jean-Philippe Vert, Nelle Varoquaux, and William Stafford Noble. Inferring Diploid 3D Chromatin Structures from Hi-C Data. In 19th International Workshop on Algorithms in Bioinformatics (WABI 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 143, pp. 11:1-11:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{cauer_et_al:LIPIcs.WABI.2019.11,
  author =	{Cauer, Alexandra Gesine and Yard{\i}mc{\i}, G\"{u}rkan and Vert, Jean-Philippe and Varoquaux, Nelle and Noble, William Stafford},
  title =	{{Inferring Diploid 3D Chromatin Structures from Hi-C Data}},
  booktitle =	{19th International Workshop on Algorithms in Bioinformatics (WABI 2019)},
  pages =	{11:1--11:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-123-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{143},
  editor =	{Huber, Katharina T. and Gusfield, Dan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.WABI.2019.11},
  URN =		{urn:nbn:de:0030-drops-110418},
  doi =		{10.4230/LIPIcs.WABI.2019.11},
  annote =	{Keywords: Genome 3D architecture, chromatin structure, Hi-C, 3D modeling}
}
Document
RLE Edit Distance in Near Optimal Time

Authors: Raphaël Clifford, Paweł Gawrychowski, Tomasz Kociumaka, Daniel P. Martin, and Przemysław Uznański

Published in: LIPIcs, Volume 138, 44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019)


Abstract
We show that the edit distance between two run-length encoded strings of compressed lengths m and n respectively, can be computed in O(mn log(mn)) time. This improves the previous record by a factor of O(n/log(mn)). The running time of our algorithm is within subpolynomial factors of being optimal, subject to the standard SETH-hardness assumption. This effectively closes a line of algorithmic research first started in 1993.
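
For contrast with the paper's O(mn log(mn)) bound on the compressed lengths, the sketch below is just the textbook baseline: decompress both run-length encoded strings and run the classic quadratic edit-distance DP on the uncompressed texts, whose lengths can be exponentially larger than m and n.

def rle_decode(runs):
    """Expand a run-length encoding [('a', 3), ('b', 2)] -> 'aaabb'."""
    return "".join(ch * length for ch, length in runs)

def edit_distance(s, t):
    """Classic O(|s| * |t|) Levenshtein DP (insert, delete, substitute)."""
    prev = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        cur = [i]
        for j, ct in enumerate(t, 1):
            cur.append(min(prev[j] + 1,                  # delete cs
                           cur[j - 1] + 1,               # insert ct
                           prev[j - 1] + (cs != ct)))    # substitute / match
        prev = cur
    return prev[-1]

# Baseline on compressed inputs: decompress first, then run the DP.
a = [("a", 4), ("b", 2), ("a", 3)]
b = [("a", 3), ("b", 4)]
print(edit_distance(rle_decode(a), rle_decode(b)))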

Cite as

Raphaël Clifford, Paweł Gawrychowski, Tomasz Kociumaka, Daniel P. Martin, and Przemysław Uznański. RLE Edit Distance in Near Optimal Time. In 44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 138, pp. 66:1-66:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{clifford_et_al:LIPIcs.MFCS.2019.66,
  author =	{Clifford, Rapha\"{e}l and Gawrychowski, Pawe{\l} and Kociumaka, Tomasz and Martin, Daniel P. and Uzna\'{n}ski, Przemys{\l}aw},
  title =	{{RLE Edit Distance in Near Optimal Time}},
  booktitle =	{44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019)},
  pages =	{66:1--66:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-117-7},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{138},
  editor =	{Rossmanith, Peter and Heggernes, Pinar and Katoen, Joost-Pieter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2019.66},
  URN =		{urn:nbn:de:0030-drops-110109},
  doi =		{10.4230/LIPIcs.MFCS.2019.66},
  annote =	{Keywords: String algorithms, Compression, Pattern matching, Run-length encoding}
}
Document
Listing Induced Steiner Subgraphs as a Compact Way to Discover Steiner Trees in Graphs

Authors: Alessio Conte, Roberto Grossi, Mamadou Moustapha Kanté, Andrea Marino, Takeaki Uno, and Kunihiro Wasa

Published in: LIPIcs, Volume 138, 44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019)


Abstract
This paper investigates induced Steiner subgraphs as a variant of the classical Steiner trees, so as to compactly represent the (exponentially many) Steiner trees sharing the same underlying induced subgraph. We prove that the enumeration of all (inclusion-minimal) induced Steiner subgraphs is harder than the well-known Hypergraph Transversal enumeration problem if the number of terminals is not fixed. When the number of terminals is fixed, we propose a polynomial delay algorithm for listing all induced Steiner subgraphs of minimum size. We also propose a polynomial delay algorithm for listing the set of minimal induced Steiner subgraphs when the number of terminals is 3.
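
For very small instances the definitions can be checked directly. The brute-force sketch below assumes the following reading of the definitions: a vertex set S containing all terminals induces a Steiner subgraph when G[S] is connected, and it is inclusion-minimal when no non-terminal vertex of S can be removed without breaking that property. It enumerates all minimal such sets on a toy graph in exponential time; it is emphatically not one of the polynomial-delay algorithms proposed in the paper.

from itertools import combinations

def is_connected(adj, nodes):
    """BFS connectivity check of the subgraph induced by `nodes`."""
    nodes = set(nodes)
    if not nodes:
        return True
    stack, seen = [next(iter(nodes))], set()
    while stack:
        v = stack.pop()
        if v in seen:
            continue
        seen.add(v)
        stack.extend((adj[v] & nodes) - seen)
    return seen == nodes

def induces_steiner(adj, S, terminals):
    return terminals <= S and is_connected(adj, S)

def minimal_induced_steiner_subgraphs(adj, terminals):
    """Enumerate inclusion-minimal S (exponential brute force, tiny graphs only)."""
    others, results = sorted(set(adj) - terminals), []
    for k in range(len(others) + 1):
        for extra in combinations(others, k):
            S = terminals | set(extra)
            if (induces_steiner(adj, S, terminals)
                    and all(not induces_steiner(adj, S - {v}, terminals)
                            for v in extra)):
                results.append(S)
    return results

# Toy graph: a 4-cycle 1-2-3-4 plus a vertex 5 adjacent to 1 and 3.
adj = {1: {2, 4, 5}, 2: {1, 3}, 3: {2, 4, 5}, 4: {1, 3}, 5: {1, 3}}
print(minimal_induced_steiner_subgraphs(adj, terminals={1, 3}))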

Cite as

Alessio Conte, Roberto Grossi, Mamadou Moustapha Kanté, Andrea Marino, Takeaki Uno, and Kunihiro Wasa. Listing Induced Steiner Subgraphs as a Compact Way to Discover Steiner Trees in Graphs. In 44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 138, pp. 73:1-73:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{conte_et_al:LIPIcs.MFCS.2019.73,
  author =	{Conte, Alessio and Grossi, Roberto and Kant\'{e}, Mamadou Moustapha and Marino, Andrea and Uno, Takeaki and Wasa, Kunihiro},
  title =	{{Listing Induced Steiner Subgraphs as a Compact Way to Discover Steiner Trees in Graphs}},
  booktitle =	{44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019)},
  pages =	{73:1--73:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-117-7},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{138},
  editor =	{Rossmanith, Peter and Heggernes, Pinar and Katoen, Joost-Pieter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2019.73},
  URN =		{urn:nbn:de:0030-drops-110174},
  doi =		{10.4230/LIPIcs.MFCS.2019.73},
  annote =	{Keywords: Graph algorithms, enumeration, listing and counting, Steiner trees, induced subgraphs}
}
Document
Asymmetric Distances for Approximate Differential Privacy

Authors: Dmitry Chistikov, Andrzej S. Murawski, and David Purser

Published in: LIPIcs, Volume 140, 30th International Conference on Concurrency Theory (CONCUR 2019)


Abstract
Differential privacy is a widely studied notion of privacy for various models of computation, based on measuring differences between probability distributions. We consider (epsilon,delta)-differential privacy in the setting of labelled Markov chains. For a given epsilon, the parameter delta can be captured by a variant of the total variation distance, which we call lv_{alpha} (where alpha = e^{epsilon}). First we study lv_{alpha} directly, showing that it cannot be computed exactly. However, the associated approximation problem turns out to be in PSPACE and #P-hard. Next we introduce a new bisimilarity distance for bounding lv_{alpha} from above, which provides a tighter bound than previously known distances while remaining computable with the same complexity (polynomial time with an NP oracle). We also propose an alternative bound that can be computed in polynomial time. Finally, we illustrate the distances on case studies.
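
For a single pair of finite distributions, the α-skewed analogue of total variation has a closed form: the supremum over events E of μ(E) − α·ν(E) is attained by the event collecting exactly the outcomes where μ(x) > α·ν(x). The sketch below computes that quantity for α = e^ε; it is only a one-step illustration, since the paper's lv_α ranges over labelled Markov chains, where (as the abstract notes) exact computation is impossible.

import math

def skewed_distance(mu, nu, epsilon):
    """sup over events E of mu(E) - alpha * nu(E), with alpha = e^epsilon,
    for finite distributions given as {outcome: probability} dicts."""
    alpha = math.exp(epsilon)
    outcomes = set(mu) | set(nu)
    return sum(max(mu.get(x, 0.0) - alpha * nu.get(x, 0.0), 0.0) for x in outcomes)

# Randomized response on one bit with flip probability 1/4.
mu = {"yes": 0.75, "no": 0.25}   # observed distribution when the truth is "yes"
nu = {"yes": 0.25, "no": 0.75}   # observed distribution when the truth is "no"
print(skewed_distance(mu, nu, epsilon=math.log(3)))  # 0.0: delta = 0 suffices here
print(skewed_distance(mu, nu, epsilon=0.5))          # positive: some delta is needed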

Cite as

Dmitry Chistikov, Andrzej S. Murawski, and David Purser. Asymmetric Distances for Approximate Differential Privacy. In 30th International Conference on Concurrency Theory (CONCUR 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 140, pp. 10:1-10:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{chistikov_et_al:LIPIcs.CONCUR.2019.10,
  author =	{Chistikov, Dmitry and Murawski, Andrzej S. and Purser, David},
  title =	{{Asymmetric Distances for Approximate Differential Privacy}},
  booktitle =	{30th International Conference on Concurrency Theory (CONCUR 2019)},
  pages =	{10:1--10:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-121-4},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{140},
  editor =	{Fokkink, Wan and van Glabbeek, Rob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CONCUR.2019.10},
  URN =		{urn:nbn:de:0030-drops-109121},
  doi =		{10.4230/LIPIcs.CONCUR.2019.10},
  annote =	{Keywords: Bisimilarity distances, Differential privacy, Labelled Markov chains}
}
Document
Automating Object Transformations for Dynamic Software Updating via Online Execution Synthesis

Authors: Tianxiao Gu, Xiaoxing Ma, Chang Xu, Yanyan Jiang, Chun Cao, and Jian Lu

Published in: LIPIcs, Volume 109, 32nd European Conference on Object-Oriented Programming (ECOOP 2018)


Abstract
Dynamic software updating (DSU) is a technique to upgrade a running software system on the fly without stopping the system. During updating, the runtime state of the modified components of the system needs to be properly transformed into a new state, so that the modified components can still correctly interact with the rest of the system. However, the transformation is non-trivial to realize due to the gap between the low-level implementations of two versions of a program. This paper presents AOTES, a novel approach to automating object transformations for dynamic updating of Java programs. AOTES bridges the gap by abstracting the old state of an object to a history of method invocations, and re-invoking the new version of all methods in the history to get the desired new state. AOTES requires no instrumentation to record any data and thus has no overhead during normal execution. We propose and implement a novel technique that can synthesize an equivalent history of method invocations based on the current object state only. We evaluated AOTES on software updates taken from Apache Commons Collections, Tomcat, FTP Server and SSHD Server. Experimental results show that AOTES successfully handled 51 of 61 object transformations of 21 updated classes, while two state-of-the-art approaches only handled 11 and 6 of 61, respectively.
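
The core idea, abstracting an old object's state into a history of method invocations and replaying that history against the new class, can be shown with a toy. In the sketch below the history is simply given; the whole point of AOTES is that it synthesizes an equivalent history from the current object state alone, with no recording during normal execution, which this toy does not attempt.

class CounterV1:
    """Old version: only keeps a running total."""
    def __init__(self):
        self.total = 0
    def add(self, x):
        self.total += x

class CounterV2:
    """New version: same interface, but also tracks how many calls were made."""
    def __init__(self):
        self.total = 0
        self.calls = 0
    def add(self, x):
        self.total += x
        self.calls += 1

def transform(history):
    """Replay a method-invocation history against the new class to obtain a
    new-version object whose state is consistent with the old one."""
    new_obj = CounterV2()
    for method, args in history:
        getattr(new_obj, method)(*args)
    return new_obj

# An equivalent history for a CounterV1 whose total is 7 (here given explicitly;
# AOTES would synthesize one, e.g. a single add(7), from the state itself).
history = [("add", (3,)), ("add", (4,))]
new_counter = transform(history)
print(new_counter.total, new_counter.calls)  # 7 2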

Cite as

Tianxiao Gu, Xiaoxing Ma, Chang Xu, Yanyan Jiang, Chun Cao, and Jian Lu. Automating Object Transformations for Dynamic Software Updating via Online Execution Synthesis. In 32nd European Conference on Object-Oriented Programming (ECOOP 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 109, pp. 19:1-19:28, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{gu_et_al:LIPIcs.ECOOP.2018.19,
  author =	{Gu, Tianxiao and Ma, Xiaoxing and Xu, Chang and Jiang, Yanyan and Cao, Chun and Lu, Jian},
  title =	{{Automating Object Transformations for Dynamic Software Updating via Online Execution Synthesis}},
  booktitle =	{32nd European Conference on Object-Oriented Programming (ECOOP 2018)},
  pages =	{19:1--19:28},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-079-8},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{109},
  editor =	{Millstein, Todd},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECOOP.2018.19},
  URN =		{urn:nbn:de:0030-drops-92243},
  doi =		{10.4230/LIPIcs.ECOOP.2018.19},
  annote =	{Keywords: Dynamic Software Update, Program Synthesis, Execution Synthesis}
}
Document
Recognizing Weakly Simple Polygons

Authors: Hugo A. Akitaya, Greg Aloupis, Jeff Erickson, and Csaba Tóth

Published in: LIPIcs, Volume 51, 32nd International Symposium on Computational Geometry (SoCG 2016)


Abstract
We present an O(n log n)-time algorithm that determines whether a given planar n-gon is weakly simple. This improves upon an O(n^2 log n)-time algorithm by [Chang, Erickson, and Xu, SODA, 2015]. Weakly simple polygons are required as input for several geometric algorithms. As such, how to recognize simple or weakly simple polygons is a fundamental question.
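
For context on the problem, the sketch below is only the naive quadratic test for strict self-crossings: it reports whether any two non-adjacent edges of the polygon cross properly. Weak simplicity is more delicate, since edges may touch or overlap as long as the polygon can be perturbed into a simple one, so this check is a necessary-condition baseline and not the O(n log n) algorithm of the paper.

def orient(p, q, r):
    """Sign of the cross product (q - p) x (r - p): >0 left turn, <0 right, 0 collinear."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def properly_cross(a, b, c, d):
    """True iff segments ab and cd cross at a single interior point of both."""
    o1, o2 = orient(a, b, c), orient(a, b, d)
    o3, o4 = orient(c, d, a), orient(c, d, b)
    return o1 * o2 < 0 and o3 * o4 < 0

def has_proper_crossing(polygon):
    """Naive O(n^2) scan over all pairs of non-adjacent edges."""
    n = len(polygon)
    edges = [(polygon[i], polygon[(i + 1) % n]) for i in range(n)]
    for i in range(n):
        for j in range(i + 2, n):
            if i == 0 and j == n - 1:   # these two edges share vertex 0
                continue
            if properly_cross(*edges[i], *edges[j]):
                return True
    return False

print(has_proper_crossing([(0, 0), (2, 0), (2, 2), (0, 2)]))  # False: a square
print(has_proper_crossing([(0, 0), (2, 2), (2, 0), (0, 2)]))  # True: a bowtie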

Cite as

Hugo A. Akitaya, Greg Aloupis, Jeff Erickson, and Csaba Tóth. Recognizing Weakly Simple Polygons. In 32nd International Symposium on Computational Geometry (SoCG 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 51, pp. 8:1-8:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{akitaya_et_al:LIPIcs.SoCG.2016.8,
  author =	{Akitaya, Hugo A. and Aloupis, Greg and Erickson, Jeff and T\'{o}th, Csaba},
  title =	{{Recognizing Weakly Simple Polygons}},
  booktitle =	{32nd International Symposium on Computational Geometry (SoCG 2016)},
  pages =	{8:1--8:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-009-5},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{51},
  editor =	{Fekete, S\'{a}ndor and Lubiw, Anna},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2016.8},
  URN =		{urn:nbn:de:0030-drops-59003},
  doi =		{10.4230/LIPIcs.SoCG.2016.8},
  annote =	{Keywords: weakly simple polygon, crossing}
}
Document
Towards Trace Metrics via Functor Lifting

Authors: Paolo Baldan, Filippo Bonchi, Henning Kerstan, and Barbara König

Published in: LIPIcs, Volume 35, 6th Conference on Algebra and Coalgebra in Computer Science (CALCO 2015)


Abstract
We investigate the possibility of deriving metric trace semantics in a coalgebraic framework. First, we generalize a technique for systematically lifting functors from the category Set of sets to the category PMet of pseudometric spaces, by identifying conditions under which also natural transformations, monads and distributive laws can be lifted. By exploiting some recent work on an abstract determinization, these results enable the derivation of trace metrics starting from coalgebras in Set. More precisely, for a coalgebra in Set we determinize it, thus obtaining a coalgebra in the Eilenberg-Moore category of a monad. When the monad can be lifted to PMet, we can equip the final coalgebra with a behavioral distance. The trace distance between two states of the original coalgebra is the distance between their images in the determinized coalgebra through the unit of the monad. We show how our framework applies to nondeterministic automata and probabilistic automata.

Cite as

Paolo Baldan, Filippo Bonchi, Henning Kerstan, and Barbara König. Towards Trace Metrics via Functor Lifting. In 6th Conference on Algebra and Coalgebra in Computer Science (CALCO 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 35, pp. 35-49, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{baldan_et_al:LIPIcs.CALCO.2015.35,
  author =	{Baldan, Paolo and Bonchi, Filippo and Kerstan, Henning and K\"{o}nig, Barbara},
  title =	{{Towards Trace Metrics via Functor Lifting}},
  booktitle =	{6th Conference on Algebra and Coalgebra in Computer Science (CALCO 2015)},
  pages =	{35--49},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-84-2},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{35},
  editor =	{Moss, Lawrence S. and Sobocinski, Pawel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CALCO.2015.35},
  URN =		{urn:nbn:de:0030-drops-55254},
  doi =		{10.4230/LIPIcs.CALCO.2015.35},
  annote =	{Keywords: trace metric, monad lifting, pseudometric, coalgebra}
}
Document
PerfBlower: Quickly Detecting Memory-Related Performance Problems via Amplification

Authors: Lu Fang, Liang Dou, and Guoqing Xu

Published in: LIPIcs, Volume 37, 29th European Conference on Object-Oriented Programming (ECOOP 2015)


Abstract
Performance problems in managed languages are extremely difficult to find. Despite many efforts to find those problems, most existing work focuses on how to debug a user-provided test execution in which performance problems already manifest. It remains largely unknown how to effectively find performance bugs before software release. As a result, performance bugs often escape to production runs, hurting software reliability and user experience. This paper describes PerfBlower, a general performance testing framework that allows developers to quickly test Java programs to find memory-related performance problems. PerfBlower provides (1) a novel specification language ISL to describe a general class of performance problems that have observable symptoms; (2) an automated test oracle via virtual amplification; and (3) precise reference-path-based diagnostic information via object mirroring. Using this framework, we have amplified three different types of problems. Our experimental results demonstrate that (1) ISL is expressive enough to describe various memory-related performance problems; (2) PerfBlower successfully distinguishes executions with and without problems; 8 unknown problems are quickly discovered under small workloads; and (3) PerfBlower outperforms existing detectors and does not miss any bugs studied before in the literature.

Cite as

Lu Fang, Liang Dou, and Guoqing Xu. PerfBlower: Quickly Detecting Memory-Related Performance Problems via Amplification. In 29th European Conference on Object-Oriented Programming (ECOOP 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 37, pp. 296-320, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{fang_et_al:LIPIcs.ECOOP.2015.296,
  author =	{Fang, Lu and Dou, Liang and Xu, Guoqing},
  title =	{{PerfBlower: Quickly Detecting Memory-Related Performance Problems via Amplification}},
  booktitle =	{29th European Conference on Object-Oriented Programming (ECOOP 2015)},
  pages =	{296--320},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-86-6},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{37},
  editor =	{Boyland, John Tang},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECOOP.2015.296},
  URN =		{urn:nbn:de:0030-drops-52278},
  doi =		{10.4230/LIPIcs.ECOOP.2015.296},
  annote =	{Keywords: Performance bugs, memory problems, managed languages, garbage collection}
}
