11 Search Results for "Wang, Kai"


Document
Improving Local Search for Minimum Weighted Connected Dominating Set Problem by Inner-Layer Local Search

Authors: Bohan Li, Kai Wang, Yiyuan Wang, and Shaowei Cai

Published in: LIPIcs, Volume 210, 27th International Conference on Principles and Practice of Constraint Programming (CP 2021)


Abstract
The minimum weighted connected dominating set (MWCDS) problem is an important variant of connected dominating set problems with wide applications, especially in heterogeneous networks and gene regulatory networks. In this paper, we develop a nested local search algorithm called NestedLS for solving MWCDS on classic benchmarks and massive graphs. Within this local search framework, we propose two novel ideas that exploit previous search information. First, we design a restart-based smoothing mechanism as a diversification method to escape from local optima. Second, we propose a novel inner-layer local search method to enlarge the candidate removal set, which can be modelled as an optimized version of the spanning tree problem. Moreover, the inner-layer local search method is a general method for maintaining the connectivity constraint when dealing with massive graphs. Experimental results show that NestedLS outperforms state-of-the-art meta-heuristic algorithms on most instances.
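
As a hedged illustration of the search space involved (a toy greedy-plus-prune baseline, not the paper's NestedLS; all names are invented for the example), the following Python sketch maintains the two feasibility checks that any MWCDS local search must respect: domination and connectivity.

from collections import deque

def dominates(adj, D):
    # Every vertex must be in D or adjacent to a vertex in D.
    covered = set(D)
    for v in D:
        covered.update(adj[v])
    return covered == set(adj)

def connected(adj, D):
    # BFS restricted to the subgraph induced by D.
    D = set(D)
    if not D:
        return False
    start = next(iter(D))
    seen, queue = {start}, deque([start])
    while queue:
        v = queue.popleft()
        for u in adj[v]:
            if u in D and u not in seen:
                seen.add(u)
                queue.append(u)
    return seen == D

def greedy_then_prune(adj, w):
    # Build a feasible CDS by adding vertices that are cheap per
    # neighbour covered, then try to drop expensive vertices while
    # keeping domination and connectivity intact.
    D = set()
    for v in sorted(adj, key=lambda v: w[v] / (1 + len(adj[v]))):
        D.add(v)
        if dominates(adj, D) and connected(adj, D):
            break
    for v in sorted(D, key=lambda v: -w[v]):
        if len(D) > 1 and dominates(adj, D - {v}) and connected(adj, D - {v}):
            D.remove(v)
    return D

# Example: a 4-cycle with unit weights; two adjacent vertices suffice.
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(greedy_then_prune(adj, {v: 1.0 for v in adj}))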

Cite as

Bohan Li, Kai Wang, Yiyuan Wang, and Shaowei Cai. Improving Local Search for Minimum Weighted Connected Dominating Set Problem by Inner-Layer Local Search. In 27th International Conference on Principles and Practice of Constraint Programming (CP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 210, pp. 39:1-39:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{li_et_al:LIPIcs.CP.2021.39,
  author =	{Li, Bohan and Wang, Kai and Wang, Yiyuan and Cai, Shaowei},
  title =	{{Improving Local Search for Minimum Weighted Connected Dominating Set Problem by Inner-Layer Local Search}},
  booktitle =	{27th International Conference on Principles and Practice of Constraint Programming (CP 2021)},
  pages =	{39:1--39:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-211-2},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{210},
  editor =	{Michel, Laurent D.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CP.2021.39},
  URN =		{urn:nbn:de:0030-drops-153304},
  doi =		{10.4230/LIPIcs.CP.2021.39},
  annote =	{Keywords: Operations Research, NP-hard Problem, Local Search, Weighted Connected Dominating Set Problem}
}
Document
Short Paper
Hammering Mizar by Learning Clause Guidance (Short Paper)

Authors: Jan Jakubův and Josef Urban

Published in: LIPIcs, Volume 141, 10th International Conference on Interactive Theorem Proving (ITP 2019)


Abstract
We describe a very large improvement of existing hammer-style proof automation over large ITP libraries by combining learning and theorem proving. In particular, we have integrated state-of-the-art machine learners into the E automated theorem prover, and developed methods that allow learning and efficient internal guidance of E over the whole Mizar library. The resulting trained system improves the real-time performance of E on the Mizar library by 70% in a single-strategy setting.
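
A hedged sketch of the general idea, assuming a toy string representation of clauses with TPTP-style uppercase variables; the feature extractor and linear scorer merely stand in for ENIGMA's trained models and are not E's actual interface. It shows where learned clause guidance plugs into a given-clause saturation loop.

import heapq

def clause_features(clause):
    # Toy features over a string clause: length, non-variable symbols,
    # and variables (TPTP convention: variables start uppercase).
    # ENIGMA-style systems use far richer term-structure features.
    symbols = clause.replace("(", " ").replace(")", " ").replace(",", " ").split()
    n_vars = sum(1 for s in symbols if s[0].isupper())
    return (len(clause), len(symbols) - n_vars, n_vars)

def learned_score(clause, weights=(0.1, 1.0, -0.5)):
    # Stand-in for a trained model: a fixed linear scorer; lower
    # scores are selected first.
    return sum(w * f for w, f in zip(weights, clause_features(clause)))

def given_clause_loop(axioms, infer, is_empty, max_steps=10_000):
    # Skeleton given-clause saturation loop: the learned score orders
    # the passive queue, i.e. it guides which clause is processed next.
    passive = [(learned_score(c), i, c) for i, c in enumerate(axioms)]
    heapq.heapify(passive)
    active, counter = [], len(axioms)
    for _ in range(max_steps):
        if not passive:
            return None                    # saturated without a proof
        _, _, given = heapq.heappop(passive)
        if is_empty(given):
            return given                   # empty clause: proof found
        for new in infer(given, active):   # caller-supplied inferences
            heapq.heappush(passive, (learned_score(new), counter, new))
            counter += 1
        active.append(given)
    return None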

Cite as

Jan Jakubův and Josef Urban. Hammering Mizar by Learning Clause Guidance (Short Paper). In 10th International Conference on Interactive Theorem Proving (ITP 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 141, pp. 34:1-34:8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{jakubuv_et_al:LIPIcs.ITP.2019.34,
  author =	{Jakub\r{u}v, Jan and Urban, Josef},
  title =	{{Hammering Mizar by Learning Clause Guidance}},
  booktitle =	{10th International Conference on Interactive Theorem Proving (ITP 2019)},
  pages =	{34:1--34:8},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-122-1},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{141},
  editor =	{Harrison, John and O'Leary, John and Tolmach, Andrew},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2019.34},
  URN =		{urn:nbn:de:0030-drops-110898},
  doi =		{10.4230/LIPIcs.ITP.2019.34},
  annote =	{Keywords: Proof automation, ITP hammers, Automated theorem proving, Machine learning}
}
Document
Verifying That a Compiler Preserves Concurrent Value-Dependent Information-Flow Security

Authors: Robert Sison and Toby Murray

Published in: LIPIcs, Volume 141, 10th International Conference on Interactive Theorem Proving (ITP 2019)


Abstract
It is common to prove by reasoning over source code that programs do not leak sensitive data. But doing so leaves a gap between reasoning and reality that can only be filled by accounting for the behaviour of the compiler. This task is complicated when programs enforce value-dependent information-flow security properties (in which classification of locations can vary depending on values in other locations) and complicated further when programs exploit shared-variable concurrency. Prior work has formally defined a notion of concurrency-aware refinement for preserving value-dependent security properties. However, that notion is considerably more complex than standard refinement definitions typically applied in the verification of semantics preservation by compilers. To date it remains unclear whether it can be applied to a realistic compiler, because there exist no general decomposition principles for separating it into smaller, more familiar, proof obligations. In this work, we provide such a decomposition principle, which we show can almost halve the complexity of proving secure refinement. Further, we demonstrate its applicability to secure compilation, by proving in Isabelle/HOL the preservation of value-dependent security by a proof-of-concept compiler from an imperative While language to a generic RISC-style assembly language, for programs with shared-memory concurrency mediated by locking primitives. Finally, we execute our compiler in Isabelle on a While language model of the Cross Domain Desktop Compositor, demonstrating to our knowledge the first use of a compiler verification result to carry an information-flow security property down to the assembly-level model of a non-trivial concurrent program.
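
As a much simpler illustration of the kind of property at stake (ordinary two-run noninterference, rather than the paper's value-dependent, concurrent notion), here is a brute-force checker for toy programs; the variable names and example programs are hypothetical.

from itertools import product

def noninterferent(program, variables, low_vars, domain=(0, 1)):
    # Two-run check: any pair of input memories agreeing on the low
    # variables must yield outputs agreeing on the low variables.
    # (Value-dependent policies, as in the paper, further allow a
    # location's classification to depend on other locations' values.)
    memories = [dict(zip(variables, vals))
                for vals in product(domain, repeat=len(variables))]
    for m1, m2 in product(memories, repeat=2):
        if all(m1[v] == m2[v] for v in low_vars):
            o1, o2 = program(dict(m1)), program(dict(m2))
            if any(o1[v] != o2[v] for v in low_vars):
                return False
    return True

# Hypothetical programs over a public and a secret variable.
safe  = lambda m: {**m, "pub": m["pub"] + 1}
leaky = lambda m: {**m, "pub": m["secret"]}
print(noninterferent(safe,  ["pub", "secret"], ["pub"]))   # True
print(noninterferent(leaky, ["pub", "secret"], ["pub"]))   # False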

Cite as

Robert Sison and Toby Murray. Verifying That a Compiler Preserves Concurrent Value-Dependent Information-Flow Security. In 10th International Conference on Interactive Theorem Proving (ITP 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 141, pp. 27:1-27:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{sison_et_al:LIPIcs.ITP.2019.27,
  author =	{Sison, Robert and Murray, Toby},
  title =	{{Verifying That a Compiler Preserves Concurrent Value-Dependent Information-Flow Security}},
  booktitle =	{10th International Conference on Interactive Theorem Proving (ITP 2019)},
  pages =	{27:1--27:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-122-1},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{141},
  editor =	{Harrison, John and O'Leary, John and Tolmach, Andrew},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ITP.2019.27},
  URN =		{urn:nbn:de:0030-drops-110829},
  doi =		{10.4230/LIPIcs.ITP.2019.27},
  annote =	{Keywords: Secure compilation, Information flow security, Concurrency, Verification}
}
Document
Vision Paper
The Future of Geographic Information Displays from GIScience, Cartographic, and Cognitive Science Perspectives (Vision Paper)

Authors: Tyler Thrash, Sara Lanini-Maggi, Sara I. Fabrikant, Sven Bertel, Annina Brügger, Sascha Credé, Cao Tri Do, Georg Gartner, Haosheng Huang, Stefan Münzer, and Kai-Florian Richter

Published in: LIPIcs, Volume 142, 14th International Conference on Spatial Information Theory (COSIT 2019)


Abstract
With the development of modern geovisual analytics tools, several researchers have emphasized the importance of understanding users' cognitive, perceptual, and affective tendencies for supporting spatial decisions with geographic information displays (GIDs). However, most recent technological developments have focused on support for navigation in terms of efficiency and effectiveness while neglecting the importance of spatial learning. In the present paper, we will envision the future of GIDs that also support spatial learning in the context of large-scale navigation. Specifically, we will illustrate the manner in which GIDs have been (in the past) and might be (in the future) designed to be context-responsive, personalized, and supportive of active spatial learning from three different perspectives (i.e., GIScience, cartography, and cognitive science). We will also explain why this approach is essential for preventing the technological infantilizing of society (i.e., the reduction of our capacity to make decisions without technological assistance). Although these issues are common to nearly all emerging digital technologies, we argue that these issues become especially relevant in consideration of a person's current and future locations.

Cite as

Tyler Thrash, Sara Lanini-Maggi, Sara I. Fabrikant, Sven Bertel, Annina Brügger, Sascha Credé, Cao Tri Do, Georg Gartner, Haosheng Huang, Stefan Münzer, and Kai-Florian Richter. The Future of Geographic Information Displays from GIScience, Cartographic, and Cognitive Science Perspectives (Vision Paper). In 14th International Conference on Spatial Information Theory (COSIT 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 142, pp. 19:1-19:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{thrash_et_al:LIPIcs.COSIT.2019.19,
  author =	{Thrash, Tyler and Lanini-Maggi, Sara and Fabrikant, Sara I. and Bertel, Sven and Br\"{u}gger, Annina and Cred\'{e}, Sascha and Do, Cao Tri and Gartner, Georg and Huang, Haosheng and M\"{u}nzer, Stefan and Richter, Kai-Florian},
  title =	{{The Future of Geographic Information Displays from GIScience, Cartographic, and Cognitive Science Perspectives}},
  booktitle =	{14th International Conference on Spatial Information Theory (COSIT 2019)},
  pages =	{19:1--19:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-115-3},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{142},
  editor =	{Timpf, Sabine and Schlieder, Christoph and Kattenbeck, Markus and Ludwig, Bernd and Stewart, Kathleen},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.COSIT.2019.19},
  URN =		{urn:nbn:de:0030-drops-111113},
  doi =		{10.4230/LIPIcs.COSIT.2019.19},
  annote =	{Keywords: visual displays, geographic information, cartography, cognitive science}
}
Document
Finding All Maximal Perfect Haplotype Blocks in Linear Time

Authors: Jarno Alanko, Hideo Bannai, Bastien Cazaux, Pierre Peterlongo, and Jens Stoye

Published in: LIPIcs, Volume 143, 19th International Workshop on Algorithms in Bioinformatics (WABI 2019)


Abstract
Recent large-scale community sequencing efforts allow the identification, at an unprecedented level of detail, of genomic regions that show signatures of natural selection. Traditional methods for identifying such regions from individuals' haplotype data, however, require excessive computing times and therefore are not applicable to current datasets. In 2019, Cunha et al. (Proceedings of BSB 2019) suggested the maximal perfect haplotype block as a very simple combinatorial pattern, forming the basis of a new method to perform rapid genome-wide selection scans. The algorithm they presented for identifying these blocks, however, had a worst-case running time quadratic in the genome length, and it was posed as an open problem whether an optimal, linear-time algorithm exists. In this paper we give two algorithms that achieve this time bound: a conceptually very simple one based on suffix trees, and a second one, based on the positional Burrows-Wheeler transform, that is also very efficient in practice.
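
The second algorithm builds on the positional Burrows-Wheeler transform. The sketch below computes the standard pBWT arrays in the style of Durbin (2014); this is the underlying machinery, not the paper's block-finding algorithm itself, and the input encoding (a list of equal-length 0/1 rows) is assumed for the example.

def pbwt_arrays(haps):
    # Positional Burrows-Wheeler transform (Durbin 2014, Algorithms
    # 1 and 2): sweep the sites left to right, maintaining ppa (the
    # sequences sorted by reversed prefix) and div (the start of the
    # longest match with the previous sequence in that sorted order).
    m, n = len(haps), len(haps[0])
    ppa, div = list(range(m)), [0] * m
    for k in range(n):
        a, b, d, e = [], [], [], []
        p = q = k + 1
        for idx, start in zip(ppa, div):
            p, q = max(p, start), max(q, start)
            if haps[idx][k] == 0:
                a.append(idx); d.append(p); p = 0
            else:
                b.append(idx); e.append(q); q = 0
        ppa, div = a + b, d + e
    return ppa, div

haps = [[0, 1, 0, 1],
        [0, 1, 1, 1],
        [1, 0, 0, 1]]
print(pbwt_arrays(haps))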

Cite as

Jarno Alanko, Hideo Bannai, Bastien Cazaux, Pierre Peterlongo, and Jens Stoye. Finding All Maximal Perfect Haplotype Blocks in Linear Time. In 19th International Workshop on Algorithms in Bioinformatics (WABI 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 143, pp. 8:1-8:9, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{alanko_et_al:LIPIcs.WABI.2019.8,
  author =	{Alanko, Jarno and Bannai, Hideo and Cazaux, Bastien and Peterlongo, Pierre and Stoye, Jens},
  title =	{{Finding All Maximal Perfect Haplotype Blocks in Linear Time}},
  booktitle =	{19th International Workshop on Algorithms in Bioinformatics (WABI 2019)},
  pages =	{8:1--8:9},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-123-8},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{143},
  editor =	{Huber, Katharina T. and Gusfield, Dan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.WABI.2019.8},
  URN =		{urn:nbn:de:0030-drops-110388},
  doi =		{10.4230/LIPIcs.WABI.2019.8},
  annote =	{Keywords: Population genomics, selection coefficient, haplotype block, positional Burrows-Wheeler Transform}
}
Document
Invited Paper
Safety Verification for Deep Neural Networks with Provable Guarantees (Invited Paper)

Authors: Marta Z. Kwiatkowska

Published in: LIPIcs, Volume 140, 30th International Conference on Concurrency Theory (CONCUR 2019)


Abstract
Computing systems are becoming ever more complex, increasingly often incorporating deep learning components. Since deep learning is unstable with respect to adversarial perturbations, there is a need for rigorous software development methodologies that encompass machine learning. This paper describes progress with developing automated verification techniques for deep neural networks to ensure safety and robustness of their decisions with respect to input perturbations. This includes novel algorithms based on feature-guided search, games, global optimisation and Bayesian methods.
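
As a hedged illustration of the robustness question these verification techniques address (falsification by random search, not the paper's feature-guided, game-based, or Bayesian methods), the sketch below hunts for an adversarial perturbation inside an L-infinity ball; the toy classifier and parameters are invented for the example.

import numpy as np

def random_adversarial_search(f, x, label, eps, trials=2000, seed=0):
    # Sample perturbations with ||delta||_inf <= eps and report the
    # first one that flips the classifier's decision. Finding one
    # refutes local robustness at x; failing to find one proves
    # nothing, which is why formal verification methods are needed.
    rng = np.random.default_rng(seed)
    for _ in range(trials):
        delta = rng.uniform(-eps, eps, size=x.shape)
        if f(x + delta) != label:
            return x + delta
    return None

# Toy linear classifier; a point near its decision boundary is not
# robust once eps exceeds the distance to the boundary.
f = lambda z: int(z.sum() > 0)
x = np.array([0.05, 0.05])
print(random_adversarial_search(f, x, f(x), eps=0.2))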

Cite as

Marta Z. Kwiatkowska. Safety Verification for Deep Neural Networks with Provable Guarantees (Invited Paper). In 30th International Conference on Concurrency Theory (CONCUR 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 140, pp. 1:1-1:5, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{kwiatkowska:LIPIcs.CONCUR.2019.1,
  author =	{Kwiatkowska, Marta Z.},
  title =	{{Safety Verification for Deep Neural Networks with Provable Guarantees}},
  booktitle =	{30th International Conference on Concurrency Theory (CONCUR 2019)},
  pages =	{1:1--1:5},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-121-4},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{140},
  editor =	{Fokkink, Wan and van Glabbeek, Rob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CONCUR.2019.1},
  URN =		{urn:nbn:de:0030-drops-109036},
  doi =		{10.4230/LIPIcs.CONCUR.2019.1},
  annote =	{Keywords: Neural networks, robustness, formal verification, Bayesian neural networks}
}
Document
Overparameterization: A Connection Between Software 1.0 and Software 2.0

Authors: Michael Carbin

Published in: LIPIcs, Volume 136, 3rd Summit on Advances in Programming Languages (SNAPL 2019)


Abstract
A new ecosystem of machine-learning-driven applications, termed Software 2.0, has arisen that integrates neural networks into a variety of computational tasks. Such applications include image recognition, natural language processing, and other traditional machine learning tasks. However, these techniques have also grown to include other structured domains, such as program analysis and program optimization, for which novel, domain-specific insights mate with model design. In this paper, we connect the world of Software 2.0 with that of traditional software - Software 1.0 - through overparameterization: a program may provide more computational capacity and precision than is necessary for the task at hand. In Software 2.0, overparameterization - when a machine learning model has more parameters than datapoints in the dataset - arises as a contemporary understanding of the ability of modern, gradient-based learning methods to learn models over complex datasets with high accuracy. Specifically, the more parameters a model has, the better it learns. In Software 1.0, the results of the approximate computing community show that traditional software is also overparameterized, in that software often simply computes results that are more precise than the user requires. Approximate computing exploits this overparameterization to improve performance by eliminating unnecessary, excess computation. For example, one technique among many is to reduce the precision of arithmetic in the application. In this paper, we argue that the gap between the precision that is available and the precision that is required, for either Software 1.0 or Software 2.0, is a fundamental aspect of software design that illustrates the balance between general-purpose software and domain-adapted solutions. A general-purpose solution is easier to develop and maintain than a domain-adapted solution, but that ease comes at the expense of performance. We show that the approximate computing community and the machine learning community have developed overlapping techniques to improve performance by reducing overparameterization. We also show that, because of these shared techniques, questions, concerns, and answers on how to construct software can translate from one software variant to the other.
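
A minimal sketch of the Software 1.0 side of the argument, assuming NumPy: summing the same data at three floating-point storage precisions shows how much "excess" precision a double-precision computation may carry relative to a user's tolerance, which is exactly the slack approximate computing exploits.

import numpy as np

rng = np.random.default_rng(0)
xs = rng.random(1_000_000)

# The same reduction at three storage precisions. If the user only
# needs, say, four significant digits, the float64 capacity is excess
# precision that can be traded for performance or energy.
exact  = xs.sum(dtype=np.float64)
single = xs.astype(np.float32).sum(dtype=np.float64)
half   = xs.astype(np.float16).sum(dtype=np.float64)

for name, val in (("float64", exact), ("float32", single), ("float16", half)):
    print(f"{name}: {val:.6f}   relative error {abs(val - exact) / exact:.2e}")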

Cite as

Michael Carbin. Overparameterization: A Connection Between Software 1.0 and Software 2.0. In 3rd Summit on Advances in Programming Languages (SNAPL 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 136, pp. 1:1-1:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{carbin:LIPIcs.SNAPL.2019.1,
  author =	{Carbin, Michael},
  title =	{{Overparameterization: A Connection Between Software 1.0 and Software 2.0}},
  booktitle =	{3rd Summit on Advances in Programming Languages (SNAPL 2019)},
  pages =	{1:1--1:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-113-9},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{136},
  editor =	{Lerner, Benjamin S. and Bod{\'\i}k, Rastislav and Krishnamurthi, Shriram},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SNAPL.2019.1},
  URN =		{urn:nbn:de:0030-drops-105440},
  doi =		{10.4230/LIPIcs.SNAPL.2019.1},
  annote =	{Keywords: Approximate Computing, Machine Learning, Software 2.0}
}
Document
An FPTAS of Minimizing Total Weighted Completion Time on Single Machine with Position Constraint

Authors: Gruia Calinescu, Florian Jaehn, Minming Li, and Kai Wang

Published in: LIPIcs, Volume 92, 28th International Symposium on Algorithms and Computation (ISAAC 2017)


Abstract
In this paper we study the classical scheduling problem of minimizing the total weighted completion time on a single machine with the constraint that one specific job must be scheduled at a specified position. We give dynamic programs with pseudo-polynomial running time, and a fully polynomial-time approximation scheme (FPTAS).
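
For contrast with the paper's pseudo-polynomial dynamic programs and FPTAS, the following hedged sketch shows the two obvious baselines: Smith's ratio rule, which is optimal for the unconstrained problem 1 || sum w_j C_j, and an exponential brute force that respects the position constraint on tiny instances. All names are illustrative.

from itertools import permutations

def weighted_completion(order, p, w):
    # Total weighted completion time sum_j w_j * C_j of a sequence.
    t = total = 0
    for j in order:
        t += p[j]
        total += w[j] * t
    return total

def wspt(p, w):
    # Smith's rule: sorting by p_j / w_j minimizes sum w_j C_j
    # when no position constraint is imposed.
    return sorted(range(len(p)), key=lambda j: p[j] / w[j])

def brute_force_with_position(p, w, job, pos):
    # Exact optimum when 'job' must occupy (0-indexed) position
    # 'pos'. Exponential, for tiny instances only; the paper gives
    # pseudo-polynomial DPs and an FPTAS instead.
    rest = [j for j in range(len(p)) if j != job]
    best = None
    for perm in permutations(rest):
        order = list(perm[:pos]) + [job] + list(perm[pos:])
        cost = weighted_completion(order, p, w)
        if best is None or cost < best[0]:
            best = (cost, order)
    return best

p, w = [3, 1, 2], [2, 1, 3]
print(wspt(p, w))                                  # unconstrained optimum
print(brute_force_with_position(p, w, job=0, pos=2))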

Cite as

Gruia Calinescu, Florian Jaehn, Minming Li, and Kai Wang. An FPTAS of Minimizing Total Weighted Completion Time on Single Machine with Position Constraint. In 28th International Symposium on Algorithms and Computation (ISAAC 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 92, pp. 19:1-19:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{calinescu_et_al:LIPIcs.ISAAC.2017.19,
  author =	{Calinescu, Gruia and Jaehn, Florian and Li, Minming and Wang, Kai},
  title =	{{An FPTAS of Minimizing Total Weighted Completion Time on Single Machine with Position Constraint}},
  booktitle =	{28th International Symposium on Algorithms and Computation (ISAAC 2017)},
  pages =	{19:1--19:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-054-5},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{92},
  editor =	{Okamoto, Yoshio and Tokuyama, Takeshi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2017.19},
  URN =		{urn:nbn:de:0030-drops-82335},
  doi =		{10.4230/LIPIcs.ISAAC.2017.19},
  annote =	{Keywords: FPTAS, Scheduling, Approximation Algorithm}
}
Document
A 2-Competitive Algorithm For Online Convex Optimization With Switching Costs

Authors: Nikhil Bansal, Anupam Gupta, Ravishankar Krishnaswamy, Kirk Pruhs, Kevin Schewior, and Cliff Stein

Published in: LIPIcs, Volume 40, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015)


Abstract
We consider a natural online optimization problem set on the real line. The state of the online algorithm at each integer time is a location on the real line. At each integer time, a convex function arrives online. In response, the online algorithm picks a new location. The cost paid by the online algorithm for this response is the distance moved plus the value of the function at the final destination. The objective is then to minimize the aggregate cost over all time. The motivating application is rightsizing power-proportional data centers. We give a 2-competitive algorithm for this problem. We also give a 3-competitive memoryless algorithm, and show that this is the best competitive ratio achievable by a deterministic memoryless algorithm. Finally we show that this online problem is strictly harder than the standard ski rental problem.
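
A hedged sketch of the cost model only (a naive grid-search greedy, not the paper's 2-competitive algorithm or its memoryless 3-competitive one): at each step it moves to an approximate minimizer of the arriving function's value plus the movement cost. The grid bounds and example functions are invented.

def greedy_online(fs, x0=0.0, lo=-5.0, hi=5.0, points=2001):
    # Discretize the line; at each time step move to the grid point
    # minimizing (function value at destination) + (distance moved),
    # and charge exactly that amount, as in the problem's cost model.
    grid = [lo + i * (hi - lo) / (points - 1) for i in range(points)]
    x, total, trajectory = x0, 0.0, []
    for f in fs:
        best = min(grid, key=lambda y: f(y) + abs(y - x))
        total += f(best) + abs(best - x)
        x = best
        trajectory.append(x)
    return total, trajectory

# Convex functions alternately pulling the algorithm toward +1 and -1.
fs = [lambda y, c=c: (y - c) ** 2 for c in (1, -1, 1, -1)]
print(greedy_online(fs))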

Cite as

Nikhil Bansal, Anupam Gupta, Ravishankar Krishnaswamy, Kirk Pruhs, Kevin Schewior, and Cliff Stein. A 2-Competitive Algorithm For Online Convex Optimization With Switching Costs. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 40, pp. 96-109, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{bansal_et_al:LIPIcs.APPROX-RANDOM.2015.96,
  author =	{Bansal, Nikhil and Gupta, Anupam and Krishnaswamy, Ravishankar and Pruhs, Kirk and Schewior, Kevin and Stein, Cliff},
  title =	{{A 2-Competitive Algorithm For Online Convex Optimization With Switching Costs}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015)},
  pages =	{96--109},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-89-7},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{40},
  editor =	{Garg, Naveen and Jansen, Klaus and Rao, Anup and Rolim, Jos\'{e} D. P.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX-RANDOM.2015.96},
  URN =		{urn:nbn:de:0030-drops-52970},
  doi =		{10.4230/LIPIcs.APPROX-RANDOM.2015.96},
  annote =	{Keywords: Stochastic, Scheduling}
}
Document
Toward a Dependability Case Language and Workflow for a Radiation Therapy System

Authors: Michael D. Ernst, Dan Grossman, Jon Jacky, Calvin Loncaric, Stuart Pernsteiner, Zachary Tatlock, Emina Torlak, and Xi Wang

Published in: LIPIcs, Volume 32, 1st Summit on Advances in Programming Languages (SNAPL 2015)


Abstract
We present a near-future research agenda for bringing a suite of modern programming-languages verification tools - specifically interactive theorem proving, solver-aided languages, and formally defined domain-specific languages - to the development of a specific safety-critical system, a radiotherapy medical device. We sketch how we believe recent programming-languages research advances can merge with existing best practices for safety-critical systems to increase system assurance and developer productivity. We motivate hypotheses central to our agenda: That we should start with a single specific system and that we need to integrate a variety of complementary verification and synthesis tools into system development.

Cite as

Michael D. Ernst, Dan Grossman, Jon Jacky, Calvin Loncaric, Stuart Pernsteiner, Zachary Tatlock, Emina Torlak, and Xi Wang. Toward a Dependability Case Language and Workflow for a Radiation Therapy System. In 1st Summit on Advances in Programming Languages (SNAPL 2015). Leibniz International Proceedings in Informatics (LIPIcs), Volume 32, pp. 103-112, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{ernst_et_al:LIPIcs.SNAPL.2015.103,
  author =	{Ernst, Michael D. and Grossman, Dan and Jacky, Jon and Loncaric, Calvin and Pernsteiner, Stuart and Tatlock, Zachary and Torlak, Emina and Wang, Xi},
  title =	{{Toward a Dependability Case Language and Workflow for a Radiation Therapy System}},
  booktitle =	{1st Summit on Advances in Programming Languages (SNAPL 2015)},
  pages =	{103--112},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-80-4},
  ISSN =	{1868-8969},
  year =	{2015},
  volume =	{32},
  editor =	{Ball, Thomas and Bod{\'\i}k, Rastislav and Krishnamurthi, Shriram and Lerner, Benjamin S. and Morrisett, Greg},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.SNAPL.2015.103},
  URN =		{urn:nbn:de:0030-drops-50208},
  doi =		{10.4230/LIPIcs.SNAPL.2015.103},
  annote =	{Keywords: Synthesis, Proof Assistants, Verification, Dependability Cases, Domain Specific Languages, Radiation Therapy}
}
Document
Notes on Counting with Finite Machines

Authors: Dmitry Chistikov

Published in: LIPIcs, Volume 29, 34th International Conference on Foundation of Software Technology and Theoretical Computer Science (FSTTCS 2014)


Abstract
We determine the descriptional complexity (smallest number of states, up to constant factors) of recognizing the languages {1^n} and {1^{t n} : t = 0, 1, 2, ...} with state-based finite machines of various kinds. This task is understood as counting to n and counting modulo n, respectively, and was previously studied for classes of finite-state automata by Kupferman, Ta-Shma, and Vardi (2001). We show that for Turing machines it requires log(n)/log(log(n)) states in the worst case, and that individual values are related to the Kolmogorov complexity of the binary encoding of n. For deterministic pushdown and counter automata, the complexity is log(n) and sqrt(n), respectively; for alternating counter automata, we show an upper bound of log(n). For visibly pushdown automata, i.e., automata whose stack movements are determined by the input symbols, we consider the languages {a^n b^n} and {a^{t n} b^{t n} : t = 0, 1, 2, ...} and determine their complexity to be sqrt(n) and min(n_1 + n_2), respectively, with the minimum taken over all factorizations n = n_1 n_2.
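
For a sense of the baseline these machines improve on, the sketch below simulates the minimal DFA for {1^n}, which needs roughly n states (the counting states plus a dead state); the machines studied in the paper achieve the same task with log(n)/log(log(n)), log(n), or sqrt(n) states. The function names are illustrative.

def exactly_n_ones(n):
    # Simulate the minimal DFA for {1^n}: states 0..n count the 1s
    # seen so far, and state n+1 is a dead state, giving about n
    # states overall.
    def accepts(word):
        state = 0
        for ch in word:
            state = state + 1 if ch == "1" and state <= n else n + 1
        return state == n
    return accepts

acc = exactly_n_ones(5)
print(acc("1" * 5), acc("1" * 4), acc("1" * 6))   # True False False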

Cite as

Dmitry Chistikov. Notes on Counting with Finite Machines. In 34th International Conference on Foundation of Software Technology and Theoretical Computer Science (FSTTCS 2014). Leibniz International Proceedings in Informatics (LIPIcs), Volume 29, pp. 339-350, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)


BibTeX

@InProceedings{chistikov:LIPIcs.FSTTCS.2014.339,
  author =	{Chistikov, Dmitry},
  title =	{{Notes on Counting with Finite Machines}},
  booktitle =	{34th International Conference on Foundation of Software Technology and Theoretical Computer Science (FSTTCS 2014)},
  pages =	{339--350},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-77-4},
  ISSN =	{1868-8969},
  year =	{2014},
  volume =	{29},
  editor =	{Raman, Venkatesh and Suresh, S. P.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2014.339},
  URN =		{urn:nbn:de:0030-drops-48547},
  doi =		{10.4230/LIPIcs.FSTTCS.2014.339},
  annote =	{Keywords: State complexity, Unary languages, Counting}
}
  • Refine by Author
  • 2 Wang, Kai
  • 1 Alanko, Jarno
  • 1 Bannai, Hideo
  • 1 Bansal, Nikhil
  • 1 Bertel, Sven

  • Refine by Classification
  • 2 Computing methodologies → Machine learning
  • 1 Applied computing → Bioinformatics
  • 1 Applied computing → Computational genomics
  • 1 Applied computing → Operations research
  • 1 Computing methodologies → Neural networks

  • Refine by Keyword
  • 2 Scheduling
  • 2 Verification
  • 1 Approximate Computing
  • 1 Approximation Algorithm
  • 1 Automated theorem proving

  • Refine by Type
  • 11 document

  • Refine by Publication Year
  • 6 2019
  • 2 2015
  • 1 2014
  • 1 2017
  • 1 2021
