27 Search Results for "Lisper, Björn"


Document
TACLeBench: A Benchmark Collection to Support Worst-Case Execution Time Research

Authors: Heiko Falk, Sebastian Altmeyer, Peter Hellinckx, Björn Lisper, Wolfgang Puffitsch, Christine Rochange, Martin Schoeberl, Rasmus Bo Sørensen, Peter Wägemann, and Simon Wegener

Published in: OASIcs, Volume 55, 16th International Workshop on Worst-Case Execution Time Analysis (WCET 2016)


Abstract
Engineering-related research, such as research on worst-case execution time, uses experimentation to evaluate ideas. For these experiments we need example programs. Furthermore, to make the experiments repeatable, those programs should be made publicly available. We collected open-source programs, adapted them to a common coding style, and provide the collection as open source. The benchmark collection is called TACLeBench and is available on GitHub; version 1.9 is current at the publication date of this paper. One of the main features of TACLeBench is that all programs are self-contained, without any dependencies on standard libraries or an operating system.
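As an illustration of the self-contained style described above, here is a minimal benchmark sketch in a TACLeBench-like shape. The naming scheme (separate init/main/return functions, a volatile result to keep the computation observable) is an assumption modeled on the description, not an excerpt from the collection.

/* Hypothetical self-contained benchmark: no libc, no OS calls, all inputs
   embedded in the program, in the spirit of the collection's coding style. */

#define ARRAY_SIZE 100

/* volatile keeps the compiler from folding the whole computation away */
volatile int bench_result;

static int bench_data[ARRAY_SIZE];

/* Deterministic initialisation of the input data (no I/O, no libraries). */
void bench_init(void)
{
  int i;
  for (i = 0; i < ARRAY_SIZE; i++)
    bench_data[i] = (i * 7 + 3) % 97;
}

/* The code whose worst-case execution time is of interest. */
void bench_main(void)
{
  int i, sum = 0;
  for (i = 0; i < ARRAY_SIZE; i++)
    sum += bench_data[i];
  bench_result = sum;
}

/* Signal success through the exit code instead of printing. */
int bench_return(void)
{
  return (bench_result > 0) ? 0 : 1;
}

int main(void)
{
  bench_init();
  bench_main();
  return bench_return();
}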

Cite as

Heiko Falk, Sebastian Altmeyer, Peter Hellinckx, Björn Lisper, Wolfgang Puffitsch, Christine Rochange, Martin Schoeberl, Rasmus Bo Sørensen, Peter Wägemann, and Simon Wegener. TACLeBench: A Benchmark Collection to Support Worst-Case Execution Time Research. In 16th International Workshop on Worst-Case Execution Time Analysis (WCET 2016). Open Access Series in Informatics (OASIcs), Volume 55, pp. 2:1-2:10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{falk_et_al:OASIcs.WCET.2016.2,
  author =	{Falk, Heiko and Altmeyer, Sebastian and Hellinckx, Peter and Lisper, Bj\"{o}rn and Puffitsch, Wolfgang and Rochange, Christine and Schoeberl, Martin and S{\o}rensen, Rasmus Bo and W\"{a}gemann, Peter and Wegener, Simon},
  title =	{{TACLeBench: A Benchmark Collection to Support Worst-Case Execution Time Research}},
  booktitle =	{16th International Workshop on Worst-Case Execution Time Analysis (WCET 2016)},
  pages =	{2:1--2:10},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-025-5},
  ISSN =	{2190-6807},
  year =	{2016},
  volume =	{55},
  editor =	{Schoeberl, Martin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2016.2},
  URN =		{urn:nbn:de:0030-drops-68958},
  doi =		{10.4230/OASIcs.WCET.2016.2},
  annote =	{Keywords: Benchmark, WCET analysis, real-time systems}
}
Document
WCET and Mixed-Criticality: What does Confidence in WCET Estimations Depend Upon?

Authors: Sebastian Altmeyer, Björn Lisper, Claire Maiza, Jan Reineke, and Christine Rochange

Published in: OASIcs, Volume 47, 15th International Workshop on Worst-Case Execution Time Analysis (WCET 2015)


Abstract
Mixed-criticality systems integrate components of different criticality. Different criticality levels require different levels of confidence in the correct behavior of a component. One aspect of correctness is timing. Confidence in worst-case execution time (WCET) estimates depends on the process by which they have been obtained. A somewhat naive view is that static WCET analyses determine safe bounds in which we can have absolute confidence, while measurement-based approaches are inherently unreliable. In this paper, we refine this view by exploring sources of doubt in the correctness of both static and measurement-based WCET analysis.

Cite as

Sebastian Altmeyer, Björn Lisper, Claire Maiza, Jan Reineke, and Christine Rochange. WCET and Mixed-Criticality: What does Confidence in WCET Estimations Depend Upon? In 15th International Workshop on Worst-Case Execution Time Analysis (WCET 2015). Open Access Series in Informatics (OASIcs), Volume 47, pp. 65-74, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{altmeyer_et_al:OASIcs.WCET.2015.65,
  author =	{Altmeyer, Sebastian and Lisper, Bj\"{o}rn and Maiza, Claire and Reineke, Jan and Rochange, Christine},
  title =	{{WCET and Mixed-Criticality: What does Confidence in WCET Estimations Depend Upon?}},
  booktitle =	{15th International Workshop on Worst-Case Execution Time Analysis (WCET 2015)},
  pages =	{65--74},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-95-8},
  ISSN =	{2190-6807},
  year =	{2015},
  volume =	{47},
  editor =	{Cazorla, Francisco J.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2015.65},
  URN =		{urn:nbn:de:0030-drops-52574},
  doi =		{10.4230/OASIcs.WCET.2015.65},
  annote =	{Keywords: mixed criticality, WCET analysis, confidence in WCET estimates}
}
Document
Analysing Switch-Case Code with Abstract Execution

Authors: Niklas Holsti, Jan Gustafsson, Linus Källberg, and Björn Lisper

Published in: OASIcs, Volume 47, 15th International Workshop on Worst-Case Execution Time Analysis (WCET 2015)


Abstract
Constructing the control-flow graph (CFG) of machine code is made difficult by dynamic transfers of control (DTC), where the address of the next instruction is computed at run-time. Switch-case statements make compilers generate a large variety of machine-code forms with DTC. Two analysis approaches are commonly used: pattern-matching methods identify predefined instruction patterns to extract the target addresses, while analytical methods try to compute the set of target addresses using a general value analysis. We tested the abstract execution method of the SWEET tool as a value analysis for switch-case code. SWEET is used here as a plugin to the Bound-T tool; thus our work can also be seen as an experiment in modular tool design, where a general value-analysis tool is used to aid CFG construction in a WCET analysis tool. We find that the abstract-execution analysis works at least as well as the switch-case analyses in Bound-T itself, which are mostly based on pattern matching. However, there are still some weaknesses: the abstract domains available in SWEET are not well suited to representing sets of DTC target addresses, which are small but sparse and irregular. Also, in some cases the abstract-execution analysis fails because the domain used is not relational, that is, it does not model arithmetic relationships between the values of different variables. Future work will be directed towards the design of abstract domains that eliminate these weaknesses.
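For context, a minimal C example of the code class under analysis. Whether a compiler emits a jump table for it is target- and compiler-dependent, so this is only an illustration of where the dynamic transfer of control comes from.

/* A dense switch like this is often compiled into an indirect jump through
   a table of code addresses (a "jump table"): the jump target is computed
   at run time from the switch expression, so the CFG constructor must
   recover the set of possible targets, either by matching the generated
   instruction pattern or by a value analysis of the index computation. */
int classify(int opcode)
{
  switch (opcode) {
  case 0: return 10;
  case 1: return 20;
  case 2: return 30;
  case 3: return 40;
  case 4: return 50;
  default: return -1;   /* out-of-range values branch around the table */
  }
}

int main(void)
{
  return classify(3) == 40 ? 0 : 1;
}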

Cite as

Niklas Holsti, Jan Gustafsson, Linus Källberg, and Björn Lisper. Analysing Switch-Case Code with Abstract Execution. In 15th International Workshop on Worst-Case Execution Time Analysis (WCET 2015). Open Access Series in Informatics (OASIcs), Volume 47, pp. 85-94, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{holsti_et_al:OASIcs.WCET.2015.85,
  author =	{Holsti, Niklas and Gustafsson, Jan and K\"{a}llberg, Linus and Lisper, Bj\"{o}rn},
  title =	{{Analysing Switch-Case Code with Abstract Execution}},
  booktitle =	{15th International Workshop on Worst-Case Execution Time Analysis (WCET 2015)},
  pages =	{85--94},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-95-8},
  ISSN =	{2190-6807},
  year =	{2015},
  volume =	{47},
  editor =	{Cazorla, Francisco J.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2015.85},
  URN =		{urn:nbn:de:0030-drops-52598},
  doi =		{10.4230/OASIcs.WCET.2015.85},
  annote =	{Keywords: dynamic control flow, indexed branch, machine-code analysis, WCET analysis}
}
Document
Principles for Value Annotation Languages

Authors: Björn Lisper

Published in: OASIcs, Volume 39, 14th International Workshop on Worst-Case Execution Time Analysis (2014)


Abstract
Tools for code-level program analysis need formats to express various properties, such as relevant properties of the environment where the analysed code will execute, and the analysis results. Different WCET analysis tools typically use tool-specific annotation languages for this purpose. These languages are often geared towards expressing properties that the particular tool can handle rather than being general, and their semantics is mostly specified only informally. This makes it harder for tools to communicate, as well as for users to provide relevant information to them. Here, we propose a small but general assertion language for value constraints, including IPET flow facts, which is an important class of annotations for WCET analysis tools. We show how to express interesting properties in this language, we propose some syntactic conveniences, and we give the language a formal semantics. The language could be used directly as a tool-independent annotation language, or as a meta-language to give exact semantics to existing value annotation and flow-fact formats.
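To make the kind of property concrete, here is an illustrative sketch. The comment-based syntax, the names filter_entry and loop_body, and the bounds are hypothetical; they do not reproduce the assertion language proposed in the paper.

/* VALUE: 0 <= n && n <= 64                (hypothetical constraint on an input)  */
/* FLOW:  executions(loop_body) <= 64 * executions(filter_entry)                  */
int filter(const int *samples, int n)
{
  int i, acc = 0;
  for (i = 0; i < n; i++) {     /* loop_body: bounded by the flow fact above */
    acc += samples[i] >> 1;
  }
  return acc;
}

int main(void)
{
  int data[4] = { 2, 4, 6, 8 };
  return filter(data, 4) == 10 ? 0 : 1;
}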

Cite as

Björn Lisper. Principles for Value Annotation Languages. In 14th International Workshop on Worst-Case Execution Time Analysis. Open Access Series in Informatics (OASIcs), Volume 39, pp. 1-10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)


BibTeX

@InProceedings{lisper:OASIcs.WCET.2014.1,
  author =	{Lisper, Bj\"{o}rn},
  title =	{{Principles for Value Annotation Languages}},
  booktitle =	{14th International Workshop on Worst-Case Execution Time Analysis},
  pages =	{1--10},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-69-9},
  ISSN =	{2190-6807},
  year =	{2014},
  volume =	{39},
  editor =	{Falk, Heiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2014.1},
  URN =		{urn:nbn:de:0030-drops-45996},
  doi =		{10.4230/OASIcs.WCET.2014.1},
  annote =	{Keywords: Real-Time System, WCET analysis, Flow Fact, Assertion}
}
Document
Complete Volume
OASIcs, Volume 15, WCET'10, Complete Volume

Authors: Björn Lisper

Published in: OASIcs, Volume 15, 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)


Abstract
OASIcs, Volume 15, WCET'10, Complete Volume

Cite as

10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010). Open Access Series in Informatics (OASIcs), Volume 15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@Proceedings{lisper:OASIcs.WCET.2010,
  title =	{{OASIcs, Volume 15, WCET'10, Complete Volume}},
  booktitle =	{10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-21-7},
  ISSN =	{2190-6807},
  year =	{2012},
  volume =	{15},
  editor =	{Lisper, Bj\"{o}rn},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2010},
  URN =		{urn:nbn:de:0030-drops-35771},
  doi =		{10.4230/OASIcs.WCET.2010},
  annote =	{Keywords: Performance of Systems, Software/Program Verification, Computers in Other Systems}
}
Document
Toward Static Timing Analysis of Parallel Software

Authors: Andreas Gustavsson, Jan Gustafsson, and Björn Lisper

Published in: OASIcs, Volume 23, 12th International Workshop on Worst-Case Execution Time Analysis (2012)


Abstract
The current trend within computer, and even real-time, systems is to incorporate parallel hardware, e.g., multicore processors, and parallel software. Thus, the ability to safely analyse such parallel systems, e.g., regarding the timing behaviour, becomes necessary. Static timing analysis is an approach to mathematically derive safe bounds on the execution time of a program, when executed on a given hardware platform. This paper presents an algorithm that statically analyses the timing of parallel software, with threads communicating through shared memory, using abstract interpretation. It also gives an extensive example to clarify how the algorithm works.
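A minimal sketch of the program class the analysis targets: two threads communicating through shared memory, where the consumer's spinning time depends on the interleaving. The use of volatile flags is a simplification standing in for real target-level synchronisation; it is not taken from the paper.

#include <pthread.h>

volatile int data_ready = 0;     /* publication flag                      */
volatile int shared_value = 0;   /* payload written by the producer       */
volatile int consumed = 0;       /* value observed by the consumer        */

static void *producer(void *arg)
{
  (void)arg;
  shared_value = 42;             /* write the payload ...                 */
  data_ready = 1;                /* ... then publish it                   */
  return 0;
}

static void *consumer(void *arg)
{
  (void)arg;
  while (!data_ready)            /* spin time depends on the interleaving */
    ;
  consumed = shared_value;
  return 0;
}

int main(void)
{
  pthread_t p, c;
  pthread_create(&p, 0, producer, 0);
  pthread_create(&c, 0, consumer, 0);
  pthread_join(p, 0);
  pthread_join(c, 0);
  return consumed == 42 ? 0 : 1;
}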

Cite as

Andreas Gustavsson, Jan Gustafsson, and Björn Lisper. Toward Static Timing Analysis of Parallel Software. In 12th International Workshop on Worst-Case Execution Time Analysis. Open Access Series in Informatics (OASIcs), Volume 23, pp. 38-47, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{gustavsson_et_al:OASIcs.WCET.2012.38,
  author =	{Gustavsson, Andreas and Gustafsson, Jan and Lisper, Bj\"{o}rn},
  title =	{{Toward Static Timing Analysis of Parallel Software}},
  booktitle =	{12th International Workshop on Worst-Case Execution Time Analysis},
  pages =	{38--47},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-41-5},
  ISSN =	{2190-6807},
  year =	{2012},
  volume =	{23},
  editor =	{Vardanega, Tullio},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2012.38},
  URN =		{urn:nbn:de:0030-drops-35552},
  doi =		{10.4230/OASIcs.WCET.2012.38},
  annote =	{Keywords: Parallelism, BCET, WCET, Static analysis, Abstract interpretation}
}
Document
Towards Parallel Programming Models for Predictability

Authors: Björn Lisper

Published in: OASIcs, Volume 23, 12th International Workshop on Worst-Case Execution Time Analysis (2012)


Abstract
Future embedded systems for performance-demanding applications will be massively parallel. High-performance tasks will be parallel programs, running on several cores, rather than single threads running on single cores. For hard real-time applications, WCETs for such tasks must be bounded. Low-level parallel programming models, based on concurrent threads, are notoriously hard to use due to their inherent nondeterminism. Therefore the parallel processing community has long considered high-level parallel programming models, which restrict the low-level models to regain determinism. In this position paper we argue that such parallel programming models are also beneficial for WCET analysis of parallel programs. We review some proposed models, and discuss their influence on timing predictability. In particular, we identify data-parallel programming as a suitable paradigm, as it is deterministic and allows current methods for WCET analysis to be extended to parallel code. GPUs are increasingly used for high-performance applications: we discuss a current GPU architecture, and we argue that it offers a parallel platform for compute-intensive applications for which it seems possible to construct precise timing models. Thus, a promising route for the future is to develop WCET analyses for data-parallel software running on GPUs.
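As a concrete illustration of the data-parallel paradigm argued for here, consider an element-wise kernel: every iteration is independent and performs the same bounded work, so a per-element WCET bound composes into a bound for the whole kernel. The C rendering below is only a sketch; on a GPU each iteration would typically be executed by one thread.

#define N 1024

/* Data-parallel "map": fixed trip count, no cross-iteration dependences,
   so the kernel is deterministic and amenable to compositional WCET bounds. */
void saxpy(float a, const float x[N], const float y[N], float out[N])
{
  int i;
  for (i = 0; i < N; i++)
    out[i] = a * x[i] + y[i];
}

int main(void)
{
  static float x[N], y[N], out[N];   /* zero-initialised */
  saxpy(2.0f, x, y, out);
  return (int)out[0];                /* 0: all inputs are zero */
}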

Cite as

Björn Lisper. Towards Parallel Programming Models for Predictability. In 12th International Workshop on Worst-Case Execution Time Analysis. Open Access Series in Informatics (OASIcs), Volume 23, pp. 48-58, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{lisper:OASIcs.WCET.2012.48,
  author =	{Lisper, Bj\"{o}rn},
  title =	{{Towards Parallel Programming Models for Predictability}},
  booktitle =	{12th International Workshop on Worst-Case Execution Time Analysis},
  pages =	{48--58},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-41-5},
  ISSN =	{2190-6807},
  year =	{2012},
  volume =	{23},
  editor =	{Vardanega, Tullio},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2012.48},
  URN =		{urn:nbn:de:0030-drops-35565},
  doi =		{10.4230/OASIcs.WCET.2012.48},
  annote =	{Keywords: Real-Time System, WCET analysis, Parallel Program, Data Parallelism}
}
Document
Front Matter
Frontmatter, Preface, Table of Contents, Workshop Organization

Authors: Björn Lisper

Published in: OASIcs, Volume 15, 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)


Abstract
Frontmatter, Preface, Table of Contents, Workshop Organization.

Cite as

10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010). Open Access Series in Informatics (OASIcs), Volume 15, pp. i-ix, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{lisper:OASIcs.WCET.2010.i,
  author =	{Lisper, Bj\"{o}rn},
  title =	{{Frontmatter, Preface, Table of Contents, Workshop Organization}},
  booktitle =	{10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)},
  pages =	{i--ix},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-21-7},
  ISSN =	{2190-6807},
  year =	{2010},
  volume =	{15},
  editor =	{Lisper, Bj\"{o}rn},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2010.i},
  URN =		{urn:nbn:de:0030-drops-28195},
  doi =		{10.4230/OASIcs.WCET.2010.i},
  annote =	{Keywords: Frontmatter, Preface, Table of Contents, Workshop Organization}
}
Document
Timing Anomalies Reloaded

Authors: Gernot Gebhard

Published in: OASIcs, Volume 15, 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)


Abstract
Computing tight WCET bounds in the presence of timing anomalies - found in almost any modern hardware architecture - is a major challenge of timing analysis. In this paper, we renew the discussion about timing anomalies, demonstrating that even simple hardware architectures are prone to timing anomalies. We furthermore complete the list of timing-anomalous cache replacement policies, proving that the most-recently-used replacement policy (MRU) also exhibits a domino effect.
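For readers unfamiliar with the policy, the sketch below simulates one cache set under a common textbook description of MRU (sometimes called bit-PLRU): each line carries an MRU-bit that is set on access, the victim is a line whose bit is 0, and when the last 0-bit would disappear all other bits are cleared. Processor implementations differ in details, so treat this as an assumption for illustration rather than the exact policy analysed in the paper.

#define WAYS 4

struct mru_set {
  int tag[WAYS];   /* -1 marks an empty way */
  int bit[WAYS];   /* MRU-bit per way       */
};

void mru_init(struct mru_set *s)
{
  int i;
  for (i = 0; i < WAYS; i++) { s->tag[i] = -1; s->bit[i] = 0; }
}

/* Mark 'way' as recently used; clear the others if no 0-bit would remain. */
static void mru_mark(struct mru_set *s, int way)
{
  int i, zeros = 0;
  s->bit[way] = 1;
  for (i = 0; i < WAYS; i++) zeros += (s->bit[i] == 0);
  if (zeros == 0) {
    for (i = 0; i < WAYS; i++) s->bit[i] = 0;
    s->bit[way] = 1;
  }
}

/* Access 'tag'; returns 1 on hit, 0 on miss (with replacement). */
int mru_access(struct mru_set *s, int tag)
{
  int i;
  for (i = 0; i < WAYS; i++)
    if (s->tag[i] == tag) { mru_mark(s, i); return 1; }
  for (i = 0; i < WAYS; i++)       /* victim: first way whose bit is 0 */
    if (s->bit[i] == 0) break;
  if (i == WAYS) i = 0;            /* defensive; cannot happen after mark */
  s->tag[i] = tag;
  mru_mark(s, i);
  return 0;
}

int main(void)
{
  struct mru_set s;
  int trace[] = { 1, 2, 3, 4, 1, 5 }, i, hits = 0;
  mru_init(&s);
  for (i = 0; i < 6; i++)
    hits += mru_access(&s, trace[i]);
  return hits;   /* 1: only the second access to tag 1 hits */
}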

Cite as

Gernot Gebhard. Timing Anomalies Reloaded. In 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010). Open Access Series in Informatics (OASIcs), Volume 15, pp. 1-10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{gebhard:OASIcs.WCET.2010.1,
  author =	{Gebhard, Gernot},
  title =	{{Timing Anomalies Reloaded}},
  booktitle =	{10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)},
  pages =	{1--10},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-21-7},
  ISSN =	{2190-6807},
  year =	{2010},
  volume =	{15},
  editor =	{Lisper, Bj\"{o}rn},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2010.1},
  URN =		{urn:nbn:de:0030-drops-28201},
  doi =		{10.4230/OASIcs.WCET.2010.1},
  annote =	{Keywords: Timing Anomalies, Domino Effects, MRU Replacement Policy, LEON2}
}
Document
Bounding the Effects of Resource Access Protocols on Cache Behavior

Authors: Enrico Mezzetti, Marco Panunzio, and Tullio Vardanega

Published in: OASIcs, Volume 15, 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)


Abstract
The assumption of task independence has long been consubstantial with the formulation of many schedulability analysis techniques. That assumption is evidently advantageous for the mathematical formulation of the analysis equations, but ill suited to capturing the actual behavior of the system. Resource sharing is one of the system design dimensions that break the assumption of task independence. By shaking the very foundations of real-time analysis theory, the advent of multicore systems has caused a resurgence of interest in resource sharing and synchronization protocols, and has also brought the realization that the assumption of task independence may be forever broken. Research in cache-aware schedulability analysis, in contrast, has paid very little attention to the impact that synchronization protocols may have on cache behavior. A blocked task may in fact incur time penalties similar in kind to those caused by preemption, in that some useful code or data already loaded in the cache may be evicted while the task is blocked. In this paper we characterize the sources of cache-related blocking delay (CRBD). We then provide a bound on the CRBD for three synchronization protocols of interest. The comparison between these bounds provides striking evidence that an informed choice of the synchronization protocol helps contain the perturbing effects of blocking on the cache state.

Cite as

Enrico Mezzetti, Marco Panunzio, and Tullio Vardanega. Bounding the Effects of Resource Access Protocols on Cache Behavior. In 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010). Open Access Series in Informatics (OASIcs), Volume 15, pp. 11-22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{mezzetti_et_al:OASIcs.WCET.2010.11,
  author =	{Mezzetti, Enrico and Panunzio, Marco and Vardanega, Tullio},
  title =	{{Bounding the Effects of Resource Access Protocols on Cache Behavior}},
  booktitle =	{10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)},
  pages =	{11--22},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-21-7},
  ISSN =	{2190-6807},
  year =	{2010},
  volume =	{15},
  editor =	{Lisper, Bj\"{o}rn},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2010.11},
  URN =		{urn:nbn:de:0030-drops-28217},
  doi =		{10.4230/OASIcs.WCET.2010.11},
  annote =	{Keywords: Resource access protocols, cache, worst-case response time}
}
Document
Toward Precise PLRU Cache Analysis

Authors: Daniel Grund and Jan Reineke

Published in: OASIcs, Volume 15, 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)


Abstract
Schedulability analysis for hard real-time systems requires bounds on the execution times of their tasks. To obtain useful bounds in the presence of caches, cache analysis is mandatory. The subject matter of this article is the static analysis of the tree-based PLRU cache replacement policy (pseudo least-recently used), for which the precision of existing analyses lags behind that achieved for other policies. We introduce the term subtree distance, which is important for the update behavior of PLRU and closely linked to the peculiarity of PLRU that allows cache contents to be evicted in "logarithmic time". Based on an abstraction of subtree distance, we define a must-analysis that is more precise than prior ones by excluding spurious logarithmic-time eviction.
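To make the tree structure concrete, here is a sketch of a 4-way tree-PLRU set in one common formulation (bit polarities and tie-breaking vary between descriptions): an access flips the tree bits on its path to point away from the accessed way, and the victim is found by following the bits, which is what enables the "logarithmic-time" eviction mentioned above.

struct plru4 {
  int root;   /* 0: next victim in {way0, way1}, 1: in {way2, way3} */
  int left;   /* 0: next victim is way0, 1: way1                    */
  int right;  /* 0: next victim is way2, 1: way3                    */
};

/* Record an access to 'way', pointing the path bits away from it. */
void plru4_touch(struct plru4 *s, int way)
{
  if (way < 2) {
    s->root = 1;                  /* next victim: right half */
    s->left = (way == 0) ? 1 : 0;
  } else {
    s->root = 0;                  /* next victim: left half  */
    s->right = (way == 2) ? 1 : 0;
  }
}

/* Follow the bits to the way that would be evicted on a miss. */
int plru4_victim(const struct plru4 *s)
{
  if (s->root == 0)
    return (s->left == 0) ? 0 : 1;
  return (s->right == 0) ? 2 : 3;
}

int main(void)
{
  struct plru4 s = { 0, 0, 0 };   /* arbitrary initial state       */
  plru4_touch(&s, 0);
  plru4_touch(&s, 2);
  return plru4_victim(&s);        /* 1: way 1 is the next victim   */
}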

Cite as

Daniel Grund and Jan Reineke. Toward Precise PLRU Cache Analysis. In 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010). Open Access Series in Informatics (OASIcs), Volume 15, pp. 23-35, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{grund_et_al:OASIcs.WCET.2010.23,
  author =	{Grund, Daniel and Reineke, Jan},
  title =	{{Toward Precise PLRU Cache Analysis}},
  booktitle =	{10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)},
  pages =	{23--35},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-21-7},
  ISSN =	{2190-6807},
  year =	{2010},
  volume =	{15},
  editor =	{Lisper, Bj\"{o}rn},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2010.23},
  URN =		{urn:nbn:de:0030-drops-28226},
  doi =		{10.4230/OASIcs.WCET.2010.23},
  annote =	{Keywords: Cache Analysis, PLRU Replacement, PLRU Tree}
}
Document
Integrating Abstract Caches with Symbolic Pipeline Analysis

Authors: Stephan Wilhelm and Christoph Cullmann

Published in: OASIcs, Volume 15, 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)


Abstract
Static worst-case execution time analysis of real-time tasks is based on abstract models that capture the timing behavior of the processor on which the tasks run. For complex processors, task-level execution time bounds are obtained by a state space exploration which involves the abstract model and the program. Partial state space exploration is not sound. Symbolic methods using binary decision diagrams (BDDs) allow for a full state space exploration of the pipeline, thereby maintaining soundness. Caches are too large to admit an efficient BDD representation. On the other hand, invariants of the cache state can be computed efficiently using abstract interpretation. How to integrate abstract caches with symbolic-state pipeline analysis is an open question. We propose a semi-symbolic domain to solve this problem. Statistical data from industrial-level software and WCET tools indicate that this new domain will enable an efficient analysis.

Cite as

Stephan Wilhelm and Christoph Cullmann. Integrating Abstract Caches with Symbolic Pipeline Analysis. In 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010). Open Access Series in Informatics (OASIcs), Volume 15, pp. 36-43, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{wilhelm_et_al:OASIcs.WCET.2010.36,
  author =	{Wilhelm, Stephan and Cullmann, Christoph},
  title =	{{Integrating Abstract Caches with Symbolic Pipeline Analysis}},
  booktitle =	{10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)},
  pages =	{36--43},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-21-7},
  ISSN =	{2190-6807},
  year =	{2010},
  volume =	{15},
  editor =	{Lisper, Bj\"{o}rn},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2010.36},
  URN =		{urn:nbn:de:0030-drops-28235},
  doi =		{10.4230/OASIcs.WCET.2010.36},
  annote =	{Keywords: WCET analysis, cache analysis, pipeline analysis}
}
Document
Realism in Statistical Analysis of Worst Case Execution Times

Authors: David Griffin and Alan Burns

Published in: OASIcs, Volume 15, 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)


Abstract
This paper considers the use of Extreme Value Theory (EVT) to model worst-case execution times. In particular, it considers the sacrifice that statistical methods make in the realism of their models in order to provide generality and precision, and whether the sacrifice of realism can impact the safety of the model. The Gumbel distribution is assessed in terms of its assumption of continuous behaviour and its need for independent and identically distributed data. To ensure that predictions made by EVT estimations are safe, additional restrictions on their use are proposed and justified.
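The block-maxima/Gumbel step discussed here can be sketched as follows. The method-of-moments estimator, the quantile read-off, and the numbers are illustrative assumptions; as the paper argues, the result is only trustworthy if the measurements satisfy the independence and identical-distribution requirements.

#include <math.h>
#include <stdio.h>

#define EULER_GAMMA 0.5772156649
#define PI 3.14159265358979

/* Fit Gumbel(mu, beta) to block maxima by the method of moments and
   return the p-quantile of the fitted distribution. */
double gumbel_quantile(const double *maxima, int n, double p)
{
  double mean = 0.0, var = 0.0, beta, mu;
  int i;

  for (i = 0; i < n; i++) mean += maxima[i];
  mean /= n;
  for (i = 0; i < n; i++) var += (maxima[i] - mean) * (maxima[i] - mean);
  var /= (n - 1);

  beta = sqrt(6.0 * var) / PI;        /* scale:    var = pi^2 beta^2 / 6   */
  mu   = mean - EULER_GAMMA * beta;   /* location: mean = mu + gamma beta  */

  return mu - beta * log(-log(p));    /* inverse of the Gumbel CDF         */
}

int main(void)
{
  /* hypothetical block maxima of measured execution times, in cycles */
  double maxima[] = { 1021, 1034, 1029, 1040, 1025, 1038, 1031, 1044 };
  int n = (int)(sizeof maxima / sizeof maxima[0]);
  printf("estimated 1-in-a-million exceedance level: %.1f cycles\n",
         gumbel_quantile(maxima, n, 1.0 - 1e-6));
  return 0;
}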

Cite as

David Griffin and Alan Burns. Realism in Statistical Analysis of Worst Case Execution Times. In 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010). Open Access Series in Informatics (OASIcs), Volume 15, pp. 44-53, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{griffin_et_al:OASIcs.WCET.2010.44,
  author =	{Griffin, David and Burns, Alan},
  title =	{{Realism in Statistical Analysis of Worst Case Execution Times}},
  booktitle =	{10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)},
  pages =	{44--53},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-21-7},
  ISSN =	{2190-6807},
  year =	{2010},
  volume =	{15},
  editor =	{Lisper, Bj\"{o}rn},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2010.44},
  URN =		{urn:nbn:de:0030-drops-28245},
  doi =		{10.4230/OASIcs.WCET.2010.44},
  annote =	{Keywords: WCET, Extreme value statistics, Gumbel distribution}
}
Document
Hybrid measurement-based WCET analysis at the source level using object-level traces

Authors: Adam Betts, Nicholas Merriam, and Guillem Bernat

Published in: OASIcs, Volume 15, 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)


Abstract
Hybrid measurement-based approaches to worst-case execution time (WCET) analysis combine measured execution times of small program segments using static analysis of the larger software structure. In order to make the necessary measurements, instrumentation code is added to generate a timestamped trace from the running program. The intrusive presence of this instrumentation code incurs a timing penalty, widely referred to as the probe effect. However, recent years have seen the emergence of trace capability at the hardware level, effectively opening the door to probe-free analysis. Relying on hardware support forces the WCET analysis to the object-code level, since that is all that is known by the hardware. A major disadvantage of this is that it is expensive for a typical software engineer to interpret the results, since most engineers are familiar with the source code but not the object code. Meaningful WCET analysis involves not just running a tool to obtain an overall WCET value but also understanding which sections of code consume most of the WCET in order that corrective actions, such as optimisation, can be applied if the WCET value is too large. The main contribution of this paper is a mechanism by which hybrid WCET analysis can still be performed at the source level when the timestamped trace has been collected at the object level by state-of-the-art hardware. This allows existing, commercial tools, such as RapiTime, to operate without the need for intrusive instrumentation and thus without the probe effect.
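The combination step that hybrid analysis performs over the program structure can be sketched very simply. The numbers and the structure (one loop between two straight-line segments) are hypothetical; mapping the object-level trace back to such source-level segments is the mechanism the paper contributes.

#include <stdio.h>

/* Combine maximum observed times of source-level segments over a simple
   program structure: sequence(init, loop(body), exit).  All numbers are
   hypothetical placeholders for values extracted from a hardware trace. */
int main(void)
{
  long init_max   = 120;   /* max observed cycles of the "init" segment     */
  long body_max   = 75;    /* max observed cycles of the loop-body segment  */
  long exit_max   = 40;    /* max observed cycles of the segment after loop */
  long loop_bound = 64;    /* loop bound supplied by static analysis        */

  long wcet_estimate = init_max + loop_bound * body_max + exit_max;
  printf("structural WCET estimate: %ld cycles\n", wcet_estimate);
  return 0;
}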

Cite as

Adam Betts, Nicholas Merriam, and Guillem Bernat. Hybrid measurement-based WCET analysis at the source level using object-level traces. In 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010). Open Access Series in Informatics (OASIcs), Volume 15, pp. 54-63, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{betts_et_al:OASIcs.WCET.2010.54,
  author =	{Betts, Adam and Merriam, Nicholas and Bernat, Guillem},
  title =	{{Hybrid measurement-based WCET analysis at the source level using object-level traces}},
  booktitle =	{10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)},
  pages =	{54--63},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-21-7},
  ISSN =	{2190-6807},
  year =	{2010},
  volume =	{15},
  editor =	{Lisper, Bj\"{o}rn},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2010.54},
  URN =		{urn:nbn:de:0030-drops-28255},
  doi =		{10.4230/OASIcs.WCET.2010.54},
  annote =	{Keywords: WCET analysis, hybrid analysis, trace}
}
Document
On the Use of Context Information for Precise Measurement-Based Execution Time Estimation

Authors: Stefan Stattelmann and Florian Martin

Published in: OASIcs, Volume 15, 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)


Abstract
The present paper investigates the influence of the execution history on the precision of measurement-based execution time estimates for embedded software. A new approach to timing analysis is presented which was designed to overcome the problems of existing static and dynamic methods. By partitioning the analyzed programs into easily traceable segments and by precisely controlling run-time measurements with on-chip tracing facilities, the new method is able to preserve information about the execution context of measured execution times. After an adequate number of measurements have been taken, this information can be used to precisely estimate the Worst-Case Execution Time of a program without being overly pessimistic.

Cite as

Stefan Stattelmann and Florian Martin. On the Use of Context Information for Precise Measurement-Based Execution Time Estimation. In 10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010). Open Access Series in Informatics (OASIcs), Volume 15, pp. 64-76, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


BibTeX

@InProceedings{stattelmann_et_al:OASIcs.WCET.2010.64,
  author =	{Stattelmann, Stefan and Martin, Florian},
  title =	{{On the Use of Context Information for Precise Measurement-Based Execution Time Estimation}},
  booktitle =	{10th International Workshop on Worst-Case Execution Time Analysis (WCET 2010)},
  pages =	{64--76},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-21-7},
  ISSN =	{2190-6807},
  year =	{2010},
  volume =	{15},
  editor =	{Lisper, Bj\"{o}rn},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2010.64},
  URN =		{urn:nbn:de:0030-drops-28269},
  doi =		{10.4230/OASIcs.WCET.2010.64},
  annote =	{Keywords: WCET analysis, trace, execution time measurement}
}
