38 Search Results for "Thomas, Philip S."


Document
Mechanizing Soundness of Off-Policy Evaluation

Authors: Jared Yeager, J. Eliot B. Moss, Michael Norrish, and Philip S. Thomas

Published in: LIPIcs, Volume 237, 13th International Conference on Interactive Theorem Proving (ITP 2022)


Abstract
There are reinforcement learning scenarios - e.g., in medicine - where we are compelled to be as confident as possible that a policy change will result in an improvement before implementing it. In such scenarios, we can employ off-policy evaluation (OPE). The basic idea of OPE is to record histories of behavior under the current policy and then estimate the quality of a proposed new policy by considering what the behavior would have been under that new policy. Because we evaluate the new policy without actually using it, the evaluation is "off-policy". Applying a concentration inequality to the estimate yields a confidence interval for the expected quality of the new policy. If the confidence interval lies above that of the current policy, we can change policies with high confidence that we will do no harm. We focus here on the mathematics of this method, by mechanizing the soundness of off-policy evaluation. A natural side effect of the mechanization is both to clarify all of the result's mathematical assumptions and preconditions, and to further develop HOL4's library of verified statistical mathematics, including concentration inequalities. More significantly, the OPE method relies on importance sampling, whose soundness we prove using a measure-theoretic approach. In fact, we generalize the standard result, showing it for contexts comprising both discrete and continuous probability distributions.
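
To make the method concrete, here is a minimal Python sketch (not the paper's HOL4 mechanization) of importance-sampling OPE with a Hoeffding lower bound; the array shapes, the return data, and the bound b on the per-episode estimates are assumptions made only for illustration.

import numpy as np

def is_estimate_with_lower_bound(pi_e, pi_b, returns, b, delta=0.05):
    """Importance-sampling (IS) estimate of the new policy's expected return,
    plus a Hoeffding-style 1 - delta lower confidence bound.

    pi_e, pi_b : arrays of shape (episodes, steps) holding the probability of
                 each recorded action under the proposed and current policy.
    returns    : array of shape (episodes,) with each episode's return.
    b          : assumed upper bound on the per-episode IS estimates; Hoeffding
                 requires the estimates to lie in a bounded interval [0, b].
    """
    weights = np.prod(pi_e / pi_b, axis=1)   # per-episode importance weight
    estimates = weights * returns            # unbiased estimates of the new policy's return
    n = len(estimates)
    mean = estimates.mean()
    # Hoeffding's inequality: with probability at least 1 - delta, the true
    # expected return is no less than the empirical mean minus this margin.
    margin = b * np.sqrt(np.log(1.0 / delta) / (2.0 * n))
    return mean, mean - margin

If the returned lower bound exceeds the current policy's expected return, the policy change can be made with confidence at least 1 - delta that it does no harm, which is the decision rule the abstract describes.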

Cite as

Jared Yeager, J. Eliot B. Moss, Michael Norrish, and Philip S. Thomas. Mechanizing Soundness of Off-Policy Evaluation. In 13th International Conference on Interactive Theorem Proving (ITP 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 237, pp. 32:1-32:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{yeager_et_al:LIPIcs.ITP.2022.32,
  author =	{Yeager, Jared and Moss, J. Eliot B. and Norrish, Michael and Thomas, Philip S.},
  title =	{{Mechanizing Soundness of Off-Policy Evaluation}},
  booktitle =	{13th International Conference on Interactive Theorem Proving (ITP 2022)},
  pages =	{32:1--32:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-252-5},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{237},
  editor =	{Andronick, June and de Moura, Leonardo},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2022.32},
  URN =		{urn:nbn:de:0030-drops-167413},
  doi =		{10.4230/LIPIcs.ITP.2022.32},
  annote =	{Keywords: Formal Methods, HOL4, Reinforcement Learning, Off-Policy Evaluation, Concentration Inequality, Hoeffding}
}
Document
Tackling the Awkward Squad for Reactive Programming: The Actor-Reactor Model

Authors: Sam Van den Vonder, Thierry Renaux, Bjarno Oeyen, Joeri De Koster, and Wolfgang De Meuter

Published in: LIPIcs, Volume 166, 34th European Conference on Object-Oriented Programming (ECOOP 2020)


Abstract
Reactive programming is a programming paradigm whereby programs are internally represented by a dependency graph, which is used to automatically (re)compute parts of a program whenever its input changes. In practice, reactive programming can only be used for some parts of an application: a reactive program is usually embedded in an application that is otherwise written in ordinary imperative languages such as JavaScript or Scala. In this paper we investigate this embedding and distill "the awkward squad for reactive programming": three concerns that are essential for real-world software development but do not fit within reactive programming. They relate to long-lasting computations, side effects, and the coordination between imperative and reactive code. To solve these issues we design a new programming model called the Actor-Reactor Model, in which programs are split into a number of actors and reactors. Actors and reactors enforce a strict separation of imperative and reactive code, and they can be composed via a number of composition operators that make use of data streams. We demonstrate the model via our own implementation in a language called Stella.
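
As a loose illustration of the separation the model enforces (a Python analogy, not the authors' Stella language or its composition operators): the reactor below is a pure derivation that is re-evaluated whenever a value arrives on its input stream, all imperative work stays in the actor, and the two interact only through the stream.

from queue import Queue

stream: "Queue[float]" = Queue()

def reactor(celsius: float) -> float:
    # Reactive side: a pure computation, recomputed on every input change.
    return celsius * 9 / 5 + 32

def actor() -> None:
    # Imperative side: side effects and control flow live here, never in the reactor.
    while not stream.empty():
        print("fahrenheit:", reactor(stream.get()))

for reading in [0.0, 20.0, 37.0]:   # values arriving on the data stream
    stream.put(reading)
actor()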

Cite as

Sam Van den Vonder, Thierry Renaux, Bjarno Oeyen, Joeri De Koster, and Wolfgang De Meuter. Tackling the Awkward Squad for Reactive Programming: The Actor-Reactor Model. In 34th European Conference on Object-Oriented Programming (ECOOP 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 166, pp. 19:1-19:29, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{vandenvonder_et_al:LIPIcs.ECOOP.2020.19,
  author =	{Van den Vonder, Sam and Renaux, Thierry and Oeyen, Bjarno and De Koster, Joeri and De Meuter, Wolfgang},
  title =	{{Tackling the Awkward Squad for Reactive Programming: The Actor-Reactor Model}},
  booktitle =	{34th European Conference on Object-Oriented Programming (ECOOP 2020)},
  pages =	{19:1--19:29},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-154-2},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{166},
  editor =	{Hirschfeld, Robert and Pape, Tobias},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECOOP.2020.19},
  URN =		{urn:nbn:de:0030-drops-131768},
  doi =		{10.4230/LIPIcs.ECOOP.2020.19},
  annote =	{Keywords: functional reactive programming, reactive programming, reactive streams, actors, reactors}
}
Document
Drawing Graphs with Circular Arcs and Right-Angle Crossings

Authors: Steven Chaplick, Henry Förster, Myroslav Kryven, and Alexander Wolff

Published in: LIPIcs, Volume 162, 17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020)


Abstract
In a RAC drawing of a graph, vertices are represented by points in the plane, adjacent vertices are connected by line segments, and crossings must form right angles. Graphs that admit such drawings are RAC graphs. RAC graphs are beyond-planar graphs and have been studied extensively. In particular, it is known that a RAC graph with n vertices has at most 4n-10 edges. We introduce a superclass of RAC graphs, which we call arc-RAC graphs. A graph is arc-RAC if it admits a drawing where edges are represented by circular arcs and crossings form right angles. We provide a Turán-type result showing that an arc-RAC graph with n vertices has at most 14n-12 edges and that there are n-vertex arc-RAC graphs with 4.5n - O(√n) edges.

Cite as

Steven Chaplick, Henry Förster, Myroslav Kryven, and Alexander Wolff. Drawing Graphs with Circular Arcs and Right-Angle Crossings. In 17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 162, pp. 21:1-21:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{chaplick_et_al:LIPIcs.SWAT.2020.21,
  author =	{Chaplick, Steven and F\"{o}rster, Henry and Kryven, Myroslav and Wolff, Alexander},
  title =	{{Drawing Graphs with Circular Arcs and Right-Angle Crossings}},
  booktitle =	{17th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2020)},
  pages =	{21:1--21:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-150-4},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{162},
  editor =	{Albers, Susanne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SWAT.2020.21},
  URN =		{urn:nbn:de:0030-drops-122687},
  doi =		{10.4230/LIPIcs.SWAT.2020.21},
  annote =	{Keywords: circular arcs, right-angle crossings, edge density, charging argument}
}
Document
Generating Verified LLVM from Isabelle/HOL

Authors: Peter Lammich

Published in: LIPIcs, Volume 141, 10th International Conference on Interactive Theorem Proving (ITP 2019)


Abstract
We present a framework to generate verified LLVM programs from Isabelle/HOL. It is based on a code generator that generates LLVM text from a simplified fragment of LLVM, shallowly embedded into Isabelle/HOL. On top, we have developed a separation logic, a verification condition generator, and an LLVM backend to the Isabelle Refinement Framework. As case studies, we have produced verified LLVM implementations of binary search and the Knuth-Morris-Pratt string search algorithm. These are one order of magnitude faster than the Standard ML implementations produced with the original Refinement Framework, and on par with unverified C implementations. Adapting the original correctness proofs to the new LLVM backend was straightforward. The trusted code base of our approach is the shallow embedding of the LLVM fragment and the code generator, which is a pretty printer combined with some straightforward compilation steps.

Cite as

Peter Lammich. Generating Verified LLVM from Isabelle/HOL. In 10th International Conference on Interactive Theorem Proving (ITP 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 141, pp. 22:1-22:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{lammich:LIPIcs.ITP.2019.22,
  author =	{Lammich, Peter},
  title =	{{Generating Verified LLVM from Isabelle/HOL}},
  booktitle =	{10th International Conference on Interactive Theorem Proving (ITP 2019)},
  pages =	{22:1--22:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-122-1},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{141},
  editor =	{Harrison, John and O'Leary, John and Tolmach, Andrew},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2019.22},
  URN =		{urn:nbn:de:0030-drops-110777},
  doi =		{10.4230/LIPIcs.ITP.2019.22},
  annote =	{Keywords: Isabelle/HOL, LLVM, Separation Logic, Verification Condition Generator, Code Generation}
}
Document
Characteristic Formulae for Liveness Properties of Non-Terminating CakeML Programs

Authors: Johannes Åman Pohjola, Henrik Rostedt, and Magnus O. Myreen

Published in: LIPIcs, Volume 141, 10th International Conference on Interactive Theorem Proving (ITP 2019)


Abstract
There are useful programs that do not terminate, and yet standard Hoare logics are not able to prove liveness properties about non-terminating programs. This paper shows how a Hoare-like programming logic framework (characteristic formulae) can be extended to enable reasoning about the I/O behaviour of programs that do not terminate. The approach is inspired by transfinite induction rather than coinduction, and does not require non-terminating loops to be productive. This work has been developed in the HOL4 theorem prover and has been integrated into the ecosystem of proof tools surrounding the CakeML programming language.

Cite as

Johannes Åman Pohjola, Henrik Rostedt, and Magnus O. Myreen. Characteristic Formulae for Liveness Properties of Non-Terminating CakeML Programs. In 10th International Conference on Interactive Theorem Proving (ITP 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 141, pp. 32:1-32:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{amanpohjola_et_al:LIPIcs.ITP.2019.32,
  author =	{\r{A}man Pohjola, Johannes and Rostedt, Henrik and Myreen, Magnus O.},
  title =	{{Characteristic Formulae for Liveness Properties of Non-Terminating CakeML Programs}},
  booktitle =	{10th International Conference on Interactive Theorem Proving (ITP 2019)},
  pages =	{32:1--32:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-122-1},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{141},
  editor =	{Harrison, John and O'Leary, John and Tolmach, Andrew},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2019.32},
  URN =		{urn:nbn:de:0030-drops-110872},
  doi =		{10.4230/LIPIcs.ITP.2019.32},
  annote =	{Keywords: Program verification, non-termination, liveness, Hoare logic}
}
Document
Complete Non-Orders and Fixed Points

Authors: Akihisa Yamada and Jérémy Dubut

Published in: LIPIcs, Volume 141, 10th International Conference on Interactive Theorem Proving (ITP 2019)


Abstract
In this paper, we develop an Isabelle/HOL library of order-theoretic concepts, such as various completeness conditions and fixed-point theorems. We keep our formalization as general as possible: we reprove several well-known results about complete orders, often without assuming any properties of the ordering relation, thus over complete non-orders. In particular, we generalize the Knaster–Tarski theorem so that it ensures the existence of a quasi-fixed point of monotone maps over complete non-orders, and show that the set of quasi-fixed points is complete under a mild condition - attractivity - which is implied by either antisymmetry or transitivity. This result generalizes and strengthens a result by Stauti and Maaden. Finally, we recover Kleene's fixed-point theorem for omega-complete non-orders, again using attractivity to prove that Kleene's fixed points are least quasi-fixed points.

Cite as

Akihisa Yamada and Jérémy Dubut. Complete Non-Orders and Fixed Points. In 10th International Conference on Interactive Theorem Proving (ITP 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 141, pp. 30:1-30:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{yamada_et_al:LIPIcs.ITP.2019.30,
  author =	{Yamada, Akihisa and Dubut, J\'{e}r\'{e}my},
  title =	{{Complete Non-Orders and Fixed Points}},
  booktitle =	{10th International Conference on Interactive Theorem Proving (ITP 2019)},
  pages =	{30:1--30:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-122-1},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{141},
  editor =	{Harrison, John and O'Leary, John and Tolmach, Andrew},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2019.30},
  URN =		{urn:nbn:de:0030-drops-110852},
  doi =		{10.4230/LIPIcs.ITP.2019.30},
  annote =	{Keywords: Order Theory, Lattice Theory, Fixed-Points, Isabelle/HOL}
}
Document
Topological Data Analysis Reveals Principles of Chromosome Structure in Cellular Differentiation

Authors: Natalie Sauerwald, Yihang Shen, and Carl Kingsford

Published in: LIPIcs, Volume 143, 19th International Workshop on Algorithms in Bioinformatics (WABI 2019)


Abstract
Topological data analysis (TDA) is a mathematically well-founded set of methods to derive robust information about the structure and topology of data. It has been applied successfully in several biological contexts. Derived primarily from algebraic topology, TDA rigorously identifies persistent features in complex data, making it well-suited to better understand the key features of three-dimensional chromosome structure. Chromosome structure has a significant influence on many diverse genomic processes and has recently been shown to relate to cellular differentiation. While there exist many methods to study specific substructures of chromosomes, we are still missing a global view of all geometric features of chromosomes. By applying TDA to the study of chromosome structure through differentiation across three cell lines, we provide insight into principles of chromosome folding and looping. We identify persistent connected components and one-dimensional topological features of chromosomes and characterize them across cell types and stages of differentiation. Availability: Scripts to reproduce the results from this study can be found at https://github.com/Kingsford-Group/hictda
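
As a rough sketch of this kind of analysis (not the authors' pipeline, whose scripts are at the repository above), the following Python code computes 0- and 1-dimensional persistence diagrams from a Hi-C contact matrix with the ripser library; the contact-to-distance conversion is an assumption made only for illustration.

import numpy as np
from ripser import ripser

def persistence_from_hic(contacts, maxdim=1):
    """Persistence diagrams from a square, symmetric Hi-C contact matrix."""
    # Illustrative conversion: bins that touch more often are treated as closer.
    distances = 1.0 / (contacts + 1.0)
    np.fill_diagonal(distances, 0.0)
    # ripser accepts a precomputed distance matrix; maxdim=1 yields the connected
    # components (H0) and one-dimensional loops (H1) tracked in the paper.
    result = ripser(distances, distance_matrix=True, maxdim=maxdim)
    return result["dgms"]   # one diagram (array of birth/death pairs) per dimension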

Cite as

Natalie Sauerwald, Yihang Shen, and Carl Kingsford. Topological Data Analysis Reveals Principles of Chromosome Structure in Cellular Differentiation. In 19th International Workshop on Algorithms in Bioinformatics (WABI 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 143, pp. 23:1-23:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{sauerwald_et_al:LIPIcs.WABI.2019.23,
  author =	{Sauerwald, Natalie and Shen, Yihang and Kingsford, Carl},
  title =	{{Topological Data Analysis Reveals Principles of Chromosome Structure in Cellular Differentiation}},
  booktitle =	{19th International Workshop on Algorithms in Bioinformatics (WABI 2019)},
  pages =	{23:1--23:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-123-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{143},
  editor =	{Huber, Katharina T. and Gusfield, Dan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.WABI.2019.23},
  URN =		{urn:nbn:de:0030-drops-110537},
  doi =		{10.4230/LIPIcs.WABI.2019.23},
  annote =	{Keywords: topological data analysis, chromosome structure, Hi-C, topologically associating domains}
}
Document
Synteny Paths for Assembly Graphs Comparison

Authors: Evgeny Polevikov and Mikhail Kolmogorov

Published in: LIPIcs, Volume 143, 19th International Workshop on Algorithms in Bioinformatics (WABI 2019)


Abstract
Despite recent developments in long-read sequencing technologies, it is still difficult to produce complete assemblies of eukaryotic genomes in an automated fashion. Genome assembly software typically outputs assembled fragments (contigs) along with assembly graphs, which encode all possible layouts of these contigs. Graph representation of the assembled genome can be useful for gene discovery, haplotyping, structural variation analysis, and other applications. To facilitate the development of new graph-based approaches, it is important to develop algorithms for the comparison and evaluation of assembly graphs produced by different software. In this work, we introduce synteny paths: maximal paths of homologous sequence between the compared assembly graphs. We describe Asgan, an algorithm for efficient synteny path decomposition, and use it to evaluate assembly graphs of various bacterial assemblies produced by different approaches. We then apply Asgan to discover structural variations between the assemblies of 15 Drosophila genomes, and show that synteny paths are robust to contig fragmentation. The Asgan tool is freely available at: https://github.com/epolevikov/Asgan.

Cite as

Evgeny Polevikov and Mikhail Kolmogorov. Synteny Paths for Assembly Graphs Comparison. In 19th International Workshop on Algorithms in Bioinformatics (WABI 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 143, pp. 24:1-24:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{polevikov_et_al:LIPIcs.WABI.2019.24,
  author =	{Polevikov, Evgeny and Kolmogorov, Mikhail},
  title =	{{Synteny Paths for Assembly Graphs Comparison}},
  booktitle =	{19th International Workshop on Algorithms in Bioinformatics (WABI 2019)},
  pages =	{24:1--24:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-123-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{143},
  editor =	{Huber, Katharina T. and Gusfield, Dan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.WABI.2019.24},
  URN =		{urn:nbn:de:0030-drops-110545},
  doi =		{10.4230/LIPIcs.WABI.2019.24},
  annote =	{Keywords: Assembly graphs, Genome assembly, Synteny blocks, Comparative Genomics}
}
Document
Domination Above r-Independence: Does Sparseness Help?

Authors: Carl Einarson and Felix Reidl

Published in: LIPIcs, Volume 138, 44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019)


Abstract
Inspired by the potential of improving tractability via gap- or above-guarantee parametrisations, we investigate the complexity of Dominating Set when given a suitable lower-bound witness. Concretely, we consider being provided with a maximal r-independent set X (a set in which all vertices have pairwise distance at least r+1) alongside the input graph G which, for r >= 2, lower-bounds the minimum size of any dominating set of G. In the spirit of gap-parameters, we consider a parametrisation by the size of the "residual" set R := V(G) \ N[X]. Our work aims to answer two questions: How does the constant r affect the tractability of the problem and does the restriction to sparse graph classes help here? For the base case r = 2, we find that the problem is paraNP-complete even in apex- and bounded-degree graphs. For r = 3, the problem is W[2]-hard for general graphs but in FPT for nowhere dense classes, and it admits a linear kernel for bounded expansion classes. For r >= 4, the parametrisation becomes essentially equivalent to the natural parameter, the size of the dominating set.
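
A tiny worked example of the residual set that serves as the parameter, using networkx and a hypothetical small graph chosen purely for illustration:

import networkx as nx

# Path 0-1-...-8. X = {0, 4, 8} is a maximal 2-independent set: its vertices are
# pairwise at distance at least 3, and no further vertex can be added.
G = nx.path_graph(9)
X = {0, 4, 8}
closed_neighborhood = X | {u for x in X for u in G.neighbors(x)}   # N[X]
R = set(G.nodes) - closed_neighborhood
print(sorted(R))   # [2, 6]: the residual vertices, i.e. the parameter |R| = 2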

Cite as

Carl Einarson and Felix Reidl. Domination Above r-Independence: Does Sparseness Help?. In 44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 138, pp. 40:1-40:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{einarson_et_al:LIPIcs.MFCS.2019.40,
  author =	{Einarson, Carl and Reidl, Felix},
  title =	{{Domination Above r-Independence: Does Sparseness Help?}},
  booktitle =	{44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019)},
  pages =	{40:1--40:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-117-7},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{138},
  editor =	{Rossmanith, Peter and Heggernes, Pinar and Katoen, Joost-Pieter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2019.40},
  URN =		{urn:nbn:de:0030-drops-109840},
  doi =		{10.4230/LIPIcs.MFCS.2019.40},
  annote =	{Keywords: Dominating Set, Above Guarantee, Kernel, Bounded Expansion, Nowhere Dense}
}
Document
Invited Talk
Trustworthy Graph Algorithms (Invited Talk)

Authors: Mohammad Abdulaziz, Kurt Mehlhorn, and Tobias Nipkow

Published in: LIPIcs, Volume 138, 44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019)


Abstract
The goal of the LEDA project was to build an easy-to-use and extendable library of correct and efficient data structures, graph algorithms and geometric algorithms. We report on the use of formal program verification to achieve an even higher level of trustworthiness. Specifically, we report on an ongoing and largely finished verification of the blossom-shrinking algorithm for maximum cardinality matching.

Cite as

Mohammad Abdulaziz, Kurt Mehlhorn, and Tobias Nipkow. Trustworthy Graph Algorithms (Invited Talk). In 44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 138, pp. 1:1-1:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{abdulaziz_et_al:LIPIcs.MFCS.2019.1,
  author =	{Abdulaziz, Mohammad and Mehlhorn, Kurt and Nipkow, Tobias},
  title =	{{Trustworthy Graph Algorithms}},
  booktitle =	{44th International Symposium on Mathematical Foundations of Computer Science (MFCS 2019)},
  pages =	{1:1--1:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-117-7},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{138},
  editor =	{Rossmanith, Peter and Heggernes, Pinar and Katoen, Joost-Pieter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2019.1},
  URN =		{urn:nbn:de:0030-drops-109456},
  doi =		{10.4230/LIPIcs.MFCS.2019.1},
  annote =	{Keywords: graph algorithms, formal correct proofs, Isabelle, LEDA, certifying algorithms}
}
Document
Timed Basic Parallel Processes

Authors: Lorenzo Clemente, Piotr Hofman, and Patrick Totzke

Published in: LIPIcs, Volume 140, 30th International Conference on Concurrency Theory (CONCUR 2019)


Abstract
Timed basic parallel processes (TBPP) extend communication-free Petri nets (aka. BPP or commutative context-free grammars) by a global notion of time. TBPP can be seen as an extension of timed automata (TA) with context-free branching rules, and as such may be used to model networks of independent timed automata with process creation. We show that the coverability and reachability problems (with unary encoded target multiplicities) are PSPACE-complete and EXPTIME-complete, respectively. For the special case of 1-clock TBPP, both are NP-complete and hence not more complex than for untimed BPP. This contrasts with known super-Ackermannian-completeness and undecidability results for general timed Petri nets. As a result of independent interest, and basis for our NP upper bounds, we show that the reachability relation of 1-clock TA can be expressed by a formula of polynomial size in the existential fragment of linear arithmetic, which improves on recent results from the literature.

Cite as

Lorenzo Clemente, Piotr Hofman, and Patrick Totzke. Timed Basic Parallel Processes. In 30th International Conference on Concurrency Theory (CONCUR 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 140, pp. 15:1-15:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{clemente_et_al:LIPIcs.CONCUR.2019.15,
  author =	{Clemente, Lorenzo and Hofman, Piotr and Totzke, Patrick},
  title =	{{Timed Basic Parallel Processes}},
  booktitle =	{30th International Conference on Concurrency Theory (CONCUR 2019)},
  pages =	{15:1--15:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-121-4},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{140},
  editor =	{Fokkink, Wan and van Glabbeek, Rob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CONCUR.2019.15},
  URN =		{urn:nbn:de:0030-drops-109171},
  doi =		{10.4230/LIPIcs.CONCUR.2019.15},
  annote =	{Keywords: Timed Automata, Petri Nets}
}
Document
The Dynamic Practice and Static Theory of Gradual Typing

Authors: Michael Greenberg

Published in: LIPIcs, Volume 136, 3rd Summit on Advances in Programming Languages (SNAPL 2019)


Abstract
We can tease apart the research on gradual types into two "lineages": a pragmatic, implementation-oriented dynamic-first lineage and a formal, type-theoretic, static-first lineage. The dynamic-first lineage's focus is on taming particular idioms - "pre-existing conditions" in untyped programming languages. The static-first lineage's focus is on interoperation and individual type system features, rather than the collection of features found in any particular language. Both appear in programming languages research under the name "gradual typing", and they are in active conversation with each other. What are these two lineages? What challenges and opportunities await the static-first lineage? What progress has been made so far?

Cite as

Michael Greenberg. The Dynamic Practice and Static Theory of Gradual Typing. In 3rd Summit on Advances in Programming Languages (SNAPL 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 136, pp. 6:1-6:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{greenberg:LIPIcs.SNAPL.2019.6,
  author =	{Greenberg, Michael},
  title =	{{The Dynamic Practice and Static Theory of Gradual Typing}},
  booktitle =	{3rd Summit on Advances in Programming Languages (SNAPL 2019)},
  pages =	{6:1--6:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-113-9},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{136},
  editor =	{Lerner, Benjamin S. and Bod{\'\i}k, Rastislav and Krishnamurthi, Shriram},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SNAPL.2019.6},
  URN =		{urn:nbn:de:0030-drops-105495},
  doi =		{10.4230/LIPIcs.SNAPL.2019.6},
  annote =	{Keywords: dynamic typing, gradual typing, static typing, implementation, theory, challenge problems}
}
Document
Blame Tracking and Type Error Debugging

Authors: Sheng Chen and John Peter Campora III

Published in: LIPIcs, Volume 136, 3rd Summit on Advances in Programming Languages (SNAPL 2019)


Abstract
In this work, we present an unexpected connection between gradual typing and type error debugging. Namely, we illustrate that gradual typing provides a natural way to defer type errors in statically ill-typed programs, providing more feedback than traditional approaches to deferring type errors. When evaluating expressions that lead to runtime type errors, the usefulness of the feedback depends on blame tracking, the de facto approach to locating the cause of such runtime type errors. Unfortunately, blame tracking suffers from the bias problem for type error localization in languages with type inference. We illustrate and formalize the bias problem for blame tracking, present ideas for adapting existing type error debugging techniques to combat this bias, and outline further challenges.
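
For readers new to deferred type errors and blame, a tiny illustration using Python's Any-based gradual typing (not the paper's formalism): the bad call below would be rejected by a static checker if x had a precise type, but with Any it only fails at run time, and the failure is reported inside strlen rather than at the call site that actually caused it, which is the localization problem blame tracking addresses.

from typing import Any

def strlen(x: Any) -> int:
    # With x: str, a checker such as mypy rejects strlen(3) before running.
    # Annotating x as Any defers that check to run time.
    return len(x)

print(strlen("hello"))   # fine: prints 5
print(strlen(3))         # deferred type error: TypeError is raised inside strlen,
                         # although the mistake was made here, at the call site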

Cite as

Sheng Chen and John Peter Campora III. Blame Tracking and Type Error Debugging. In 3rd Summit on Advances in Programming Languages (SNAPL 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 136, pp. 2:1-2:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{chen_et_al:LIPIcs.SNAPL.2019.2,
  author =	{Chen, Sheng and Campora III, John Peter},
  title =	{{Blame Tracking and Type Error Debugging}},
  booktitle =	{3rd Summit on Advances in Programming Languages (SNAPL 2019)},
  pages =	{2:1--2:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-113-9},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{136},
  editor =	{Lerner, Benjamin S. and Bod{\'\i}k, Rastislav and Krishnamurthi, Shriram},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SNAPL.2019.2},
  URN =		{urn:nbn:de:0030-drops-105451},
  doi =		{10.4230/LIPIcs.SNAPL.2019.2},
  annote =	{Keywords: Blame tracking, type error debugging, gradual typing, type inference}
}
Document
From Theory to Systems: A Grounded Approach to Programming Language Education

Authors: Will Crichton

Published in: LIPIcs, Volume 136, 3rd Summit on Advances in Programming Languages (SNAPL 2019)


Abstract
I present a new approach to teaching a graduate-level programming languages course focused on using systems programming ideas and languages like WebAssembly and Rust to motivate PL theory. Drawing on students' prior experience with low-level languages, the course shows how type systems and PL theory are used to avoid tricky real-world errors that students encounter in practice. I reflect on the curricular design and lessons learned from two years of teaching at Stanford, showing that integrating systems ideas can provide students a more grounded and enjoyable education in programming languages. The curriculum, course notes, and assignments are freely available: http://cs242.stanford.edu/f18/

Cite as

Will Crichton. From Theory to Systems: A Grounded Approach to Programming Language Education. In 3rd Summit on Advances in Programming Languages (SNAPL 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 136, pp. 4:1-4:9, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{crichton:LIPIcs.SNAPL.2019.4,
  author =	{Crichton, Will},
  title =	{{From Theory to Systems: A Grounded Approach to Programming Language Education}},
  booktitle =	{3rd Summit on Advances in Programming Languages (SNAPL 2019)},
  pages =	{4:1--4:9},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-113-9},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{136},
  editor =	{Lerner, Benjamin S. and Bod{\'\i}k, Rastislav and Krishnamurthi, Shriram},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SNAPL.2019.4},
  URN =		{urn:nbn:de:0030-drops-105472},
  doi =		{10.4230/LIPIcs.SNAPL.2019.4},
  annote =	{Keywords: programming languages, programming language education}
}
Document
The Next 700 Semantics: A Research Challenge

Authors: Shriram Krishnamurthi, Benjamin S. Lerner, and Liam Elberty

Published in: LIPIcs, Volume 136, 3rd Summit on Advances in Programming Languages (SNAPL 2019)


Abstract
Modern systems consist of large numbers of languages, frameworks, libraries, APIs, and more. Each has characteristic behavior and data. Capturing these in semantics is valuable not only for understanding them but also essential for formal treatment (such as proofs). Unfortunately, most of these systems are defined primarily through implementations, which means the semantics needs to be learned. We describe the problem of learning a semantics, provide a structuring process that is of potential value, and also outline our failed attempts at achieving this so far.

Cite as

Shriram Krishnamurthi, Benjamin S. Lerner, and Liam Elberty. The Next 700 Semantics: A Research Challenge. In 3rd Summit on Advances in Programming Languages (SNAPL 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 136, pp. 9:1-9:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{krishnamurthi_et_al:LIPIcs.SNAPL.2019.9,
  author =	{Krishnamurthi, Shriram and Lerner, Benjamin S. and Elberty, Liam},
  title =	{{The Next 700 Semantics: A Research Challenge}},
  booktitle =	{3rd Summit on Advances in Programming Languages (SNAPL 2019)},
  pages =	{9:1--9:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-113-9},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{136},
  editor =	{Lerner, Benjamin S. and Bod{\'\i}k, Rastislav and Krishnamurthi, Shriram},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SNAPL.2019.9},
  URN =		{urn:nbn:de:0030-drops-105522},
  doi =		{10.4230/LIPIcs.SNAPL.2019.9},
  annote =	{Keywords: Programming languages, desugaring, semantics, testing}
}