12 Search Results for "Clement, Julien"


Document
Generalizing Shape Analysis with Gradual Types

Authors: Zeina Migeed, James Reed, Jason Ansel, and Jens Palsberg

Published in: LIPIcs, Volume 313, 38th European Conference on Object-Oriented Programming (ECOOP 2024)


Abstract
Tensors are multi-dimensional data structures that can represent the data processed by machine learning tasks. Tensor programs tend to be short and readable, and they can leverage libraries and frameworks such as TensorFlow and PyTorch, as well as modern hardware such as GPUs and TPUs. However, tensor programs also tend to obscure shape information, which can cause shape errors that are difficult to find. Such shape errors can be avoided by a combination of shape annotations and shape analysis, but such annotations are burdensome to come up with manually. In this paper, we use gradual typing to reduce the barrier to entry. Gradual typing offers a way to incrementally introduce type annotations into programs. From there, we focus on tool support for type migration, which is a concept that closely models code-annotation tasks and allows us to do shape reasoning and utilize it for different purposes. We develop a comprehensive gradual typing theory to reason about tensor shapes. We then ask three fundamental questions about a gradually typed tensor program. (1) Does the program have a static migration? (2) Given a program and some arithmetic constraints on shapes, can we migrate the program according to the constraints? (3) Can we eliminate branches that depend on shapes? We develop novel tools to address the three problems. For the third problem, there are currently two PyTorch tools that aim to eliminate branches. They do so by eliminating them for just a single input. Our tool is the first to eliminate branches for an infinite class of inputs, using static shape information. Our tools help prevent bugs, alleviate the burden on the programmer of annotating the program, and improve the process of program transformation.
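
To make the gradual-shape idea concrete, here is a minimal, hypothetical Python sketch (not the paper's type system or the PyTorch tools): a 2-D shape is a pair of dimensions, each either a known size or the unknown dimension Dyn, and Dyn is consistent with any size, so only provable mismatches are rejected and annotations can be added incrementally.

# Illustrative sketch only: gradual shape checking for a matrix product.
from typing import Optional, Tuple

Dyn = None                     # the unknown ("dynamic") dimension
Dim = Optional[int]
Shape = Tuple[Dim, Dim]

def consistent(d1: Dim, d2: Dim) -> bool:
    """Gradual consistency: Dyn matches anything; known sizes must agree."""
    return d1 is Dyn or d2 is Dyn or d1 == d2

def matmul_shape(a: Shape, b: Shape) -> Shape:
    """Result shape of an (m, k) @ (k2, n) product, rejecting only provable mismatches."""
    (m, k), (k2, n) = a, b
    if not consistent(k, k2):
        raise TypeError(f"inner dimensions {k} and {k2} cannot match")
    return (m, n)

print(matmul_shape((3, Dyn), (4, 5)))   # (3, 5): Dyn is consistent with 4
try:
    matmul_shape((3, 4), (5, 6))        # 4 vs 5: a definite shape error
except TypeError as err:
    print("rejected:", err)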

Cite as

Zeina Migeed, James Reed, Jason Ansel, and Jens Palsberg. Generalizing Shape Analysis with Gradual Types. In 38th European Conference on Object-Oriented Programming (ECOOP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 313, pp. 29:1-29:28, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{migeed_et_al:LIPIcs.ECOOP.2024.29,
  author =	{Migeed, Zeina and Reed, James and Ansel, Jason and Palsberg, Jens},
  title =	{{Generalizing Shape Analysis with Gradual Types}},
  booktitle =	{38th European Conference on Object-Oriented Programming (ECOOP 2024)},
  pages =	{29:1--29:28},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-341-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{313},
  editor =	{Aldrich, Jonathan and Salvaneschi, Guido},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ECOOP.2024.29},
  URN =		{urn:nbn:de:0030-drops-208786},
  doi =		{10.4230/LIPIcs.ECOOP.2024.29},
  annote =	{Keywords: Tensor Shapes, Gradual Types, Migration}
}
Document
On Fluctuations of Complexity Measures for the FIND Algorithm

Authors: Jasper Ischebeck and Ralph Neininger

Published in: LIPIcs, Volume 302, 35th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2024)


Abstract
The FIND algorithm (also called Quickselect) is a fundamental algorithm to select ranks or quantiles within a set of data. It was shown by Grübel and Rösler that the number of key comparisons required by FIND as a process of the quantiles α ∈ [0,1] in a natural probabilistic model converges after normalization in distribution within the càdlàg space D[0,1] endowed with the Skorokhod metric. We show that the process of the residuals in the latter convergence after normalization converges in distribution to a mixture of Gaussian processes in D[0,1] and identify the limit’s conditional covariance functions. A similar result holds for the related algorithm QuickVal. Our method extends to other cost measures such as the number of swaps (key exchanges) required by FIND or cost measures which are based on key comparisons but take into account that the cost of a comparison between two keys may depend on their values, an example being the number of bit comparisons needed to compare keys given by their bit expansions.
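
As a concrete reminder of the cost measure under study, here is a minimal FIND (Quickselect) sketch in Python that returns the key of a given rank together with the number of key comparisons performed, assuming distinct keys; it illustrates the quantity whose fluctuations are analyzed, not the limit theorem itself.

import random

def find(data, k):
    """Return the element of rank k (0-based) among distinct keys,
    together with the number of key comparisons performed."""
    data = list(data)
    comparisons = 0
    while True:
        i = random.randrange(len(data))
        pivot, rest = data[i], data[:i] + data[i + 1:]
        smaller, larger = [], []
        for x in rest:
            comparisons += 1                  # one key comparison against the pivot
            (smaller if x < pivot else larger).append(x)
        if k < len(smaller):
            data = smaller
        elif k == len(smaller):
            return pivot, comparisons
        else:
            data, k = larger, k - len(smaller) - 1

keys = random.sample(range(10 ** 6), 1000)
print(find(keys, len(keys) // 2))             # (median, number of key comparisons)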

Cite as

Jasper Ischebeck and Ralph Neininger. On Fluctuations of Complexity Measures for the FIND Algorithm. In 35th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 302, pp. 9:1-9:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{ischebeck_et_al:LIPIcs.AofA.2024.9,
  author =	{Ischebeck, Jasper and Neininger, Ralph},
  title =	{{On Fluctuations of Complexity Measures for the FIND Algorithm}},
  booktitle =	{35th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2024)},
  pages =	{9:1--9:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-329-4},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{302},
  editor =	{Mailler, C\'{e}cile and Wild, Sebastian},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.AofA.2024.9},
  URN =		{urn:nbn:de:0030-drops-204440},
  doi =		{10.4230/LIPIcs.AofA.2024.9},
  annote =	{Keywords: FIND, Quickselect, rank selection, probabilistic analysis of algorithms, weak convergence, functional limit theorem}
}
Document
Patricia’s Bad Distributions

Authors: Louigi Addario-Berry, Pat Morin, and Ralph Neininger

Published in: LIPIcs, Volume 302, 35th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2024)


Abstract
The height of a random PATRICIA tree built from independent, identically distributed infinite binary strings with arbitrary diffuse probability distribution μ on {0,1}^ℕ is studied. We show that the expected height grows asymptotically sublinearly in the number of leaves for any such μ, but can be made to exceed any specific sublinear growth rate by choosing μ appropriately.

Cite as

Louigi Addario-Berry, Pat Morin, and Ralph Neininger. Patricia’s Bad Distributions. In 35th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 302, pp. 25:1-25:8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{addarioberry_et_al:LIPIcs.AofA.2024.25,
  author =	{Addario-Berry, Louigi and Morin, Pat and Neininger, Ralph},
  title =	{{Patricia’s Bad Distributions}},
  booktitle =	{35th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2024)},
  pages =	{25:1--25:8},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-329-4},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{302},
  editor =	{Mailler, C\'{e}cile and Wild, Sebastian},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.AofA.2024.25},
  URN =		{urn:nbn:de:0030-drops-204600},
  doi =		{10.4230/LIPIcs.AofA.2024.25},
  annote =	{Keywords: PATRICIA tree, random tree, height of tree, analysis of algorithms}
}
Document
Track B: Automata, Logic, Semantics, and Theory of Programming
FO Logic on Cellular Automata Orbits Equals MSO Logic

Authors: Guillaume Theyssier

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
We introduce an extension of classical cellular automata (CA) to arbitrary labeled graphs, and show that FO logic on CA orbits is equivalent to MSO logic. We deduce various results from that equivalence, including a characterization of finitely generated groups on which FO model checking for CA orbits is undecidable, and undecidability of satisfiability of a fixed FO property for CA over finite graphs. We also show concrete examples of FO formulas for CA orbits whose model checking problem is equivalent to the domino problem, or its seeded or recurring variants respectively, on any finitely generated group. For the recurring domino problem, we use an extension of the FO signature by a relation found in the well-known Garden of Eden theorem, but we also show a concrete FO formula without the extension and with one quantifier alternation whose model checking problem does not belong to the arithmetical hierarchy on group ℤ².

Cite as

Guillaume Theyssier. FO Logic on Cellular Automata Orbits Equals MSO Logic. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 154:1-154:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


@InProceedings{theyssier:LIPIcs.ICALP.2024.154,
  author =	{Theyssier, Guillaume},
  title =	{{FO Logic on Cellular Automata Orbits Equals MSO Logic}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{154:1--154:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.154},
  URN =		{urn:nbn:de:0030-drops-202972},
  doi =		{10.4230/LIPIcs.ICALP.2024.154},
  annote =	{Keywords: MSO logic, FO logic, cellular automata, domino problem, Cayley graphs}
}
Document
An Iterative Approach for Counting Reduced Ordered Binary Decision Diagrams

Authors: Julien Clément and Antoine Genitrini

Published in: LIPIcs, Volume 272, 48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023)


Abstract
For three decades binary decision diagrams, a data structure efficiently representing Boolean functions, have been widely used in many distinct contexts like model verification, machine learning, cryptography and also resolution of combinatorial problems. The most famous variant, called reduced ordered binary decision diagram (robdd for short), can be viewed as the result of a compaction procedure on the full decision tree. A useful property is that once an order over the Boolean variables is fixed, each Boolean function is represented by exactly one robdd. In this paper we aim at computing the exact distribution of the Boolean functions in k variables according to the robdd size, where the robdd size is equal to the number of decision nodes of the underlying directed acyclic graph (dag) structure. Recall the number of Boolean functions with k variables is equal to 2^{2^k}, which is of double exponential growth with respect to the number of variables. The maximal size of a robdd with k variables is M_k ≈ 2^k / k. Apart from the natural combinatorial explosion observed, another difficulty for computing the distribution according to size is to take into account dependencies within the dag structure of robdds. In this paper, we develop the first polynomial algorithm to derive the distribution of Boolean functions over k variables with respect to robdd size denoted by n. The algorithm computes the (enumerative) generating function of robdds with k variables up to size n. It performs O(k n⁴) arithmetical operations on integers and necessitates storing O((k+n) n²) integers with bit length O(n log n). Our new approach relies on a decomposition of robdds layer by layer and on an inclusion-exclusion argument.
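
For readers unfamiliar with the size parameter, the following Python sketch (illustrative only, not the paper's polynomial counting algorithm) computes the robdd size of a single Boolean function given by its truth table, via Shannon expansion with node sharing and elimination of redundant tests.

def robdd_size(truth_table, k):
    """Number of decision nodes of the ROBDD (for the fixed variable order
    x_0 < ... < x_{k-1}) of the function given by its truth table."""
    assert len(truth_table) == 2 ** k
    unique = {}                                # (var, low, high) -> node id (sharing)

    def build(bits, var):
        if all(bits):
            return "1"                         # constant true leaf
        if not any(bits):
            return "0"                         # constant false leaf
        half = len(bits) // 2
        low = build(bits[:half], var + 1)      # subfunction with x_var = 0
        high = build(bits[half:], var + 1)     # subfunction with x_var = 1
        if low == high:                        # redundant test: no node needed
            return low
        return unique.setdefault((var, low, high), len(unique))

    build(tuple(truth_table), 0)
    return len(unique)

parity = [bin(i).count("1") % 2 for i in range(2 ** 3)]
print(robdd_size(parity, 3))                   # 5: parity of k variables needs 2k - 1 nodes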

Cite as

Julien Clément and Antoine Genitrini. An Iterative Approach for Counting Reduced Ordered Binary Decision Diagrams. In 48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 272, pp. 36:1-36:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


@InProceedings{clement_et_al:LIPIcs.MFCS.2023.36,
  author =	{Cl\'{e}ment, Julien and Genitrini, Antoine},
  title =	{{An Iterative Approach for Counting Reduced Ordered Binary Decision Diagrams}},
  booktitle =	{48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023)},
  pages =	{36:1--36:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-292-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{272},
  editor =	{Leroux, J\'{e}r\^{o}me and Lombardy, Sylvain and Peleg, David},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2023.36},
  URN =		{urn:nbn:de:0030-drops-185702},
  doi =		{10.4230/LIPIcs.MFCS.2023.36},
  annote =	{Keywords: Boolean Function, Reduced Ordered Binary Decision Diagram (\{robdd\}), Enumerative Combinatorics, Directed Acyclic Graph}
}
Document
A Smart Contract Oracle for Approximating Real-World, Real Number Values

Authors: William George and Clément Lesaege

Published in: OASIcs, Volume 71, International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2019)


Abstract
A key challenge of smart contract systems is the fact that many useful contracts require access to information that does not natively live on the blockchain. While miners can verify the value of a hash or the validity of a digital signature, they cannot determine who won an election, whether there is a flood in Paris, or even what is the price of ether in US dollars, even though this information might be necessary to execute prediction market, insurance, or financial contracts respectively. A number of promising projects and research developments have provided a better understanding of how one might construct a decentralized, binary oracle - namely an oracle that can respond by one of two possibilities, typically "yes" or "no", even while not requiring the interaction of a trusted third party. In this work, we extend these ideas to construct a general-purpose, decentralized oracle that can estimate the value of a real-world quantity that is in a dense totally ordered set, such as R. In particular, this proposal can be used to estimate real number valued quantities, such as required for a price oracle. We will establish a number of desirable properties about this proposal. Particularly, we will see that the precision of the output is tunable to users' needs.
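
The step from a binary oracle to a real-valued one can be illustrated by a small, hypothetical Python sketch (not the decentralized protocol proposed in the paper): given a bracketing interval and a yes/no oracle answering "is the value at least t?", bisection narrows the interval down to any desired precision.

def estimate(binary_oracle, lo, hi, precision):
    """Return an interval of width <= precision containing the true value,
    assuming it lies in [lo, hi] and binary_oracle(t) truthfully answers
    whether the value is at least t."""
    while hi - lo > precision:
        mid = (lo + hi) / 2
        if binary_oracle(mid):
            lo = mid
        else:
            hi = mid
    return lo, hi

true_price = 1234.56                               # e.g. a price known only off-chain
oracle = lambda threshold: true_price >= threshold # stand-in for the binary oracle
print(estimate(oracle, 0.0, 10_000.0, 0.01))       # an interval of width <= 0.01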

Cite as

William George and Clément Lesaege. A Smart Contract Oracle for Approximating Real-World, Real Number Values. In International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2019). Open Access Series in Informatics (OASIcs), Volume 71, pp. 6:1-6:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


@InProceedings{george_et_al:OASIcs.Tokenomics.2019.6,
  author =	{George, William and Lesaege, Cl\'{e}ment},
  title =	{{A Smart Contract Oracle for Approximating Real-World, Real Number Values}},
  booktitle =	{International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2019)},
  pages =	{6:1--6:15},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-108-5},
  ISSN =	{2190-6807},
  year =	{2020},
  volume =	{71},
  editor =	{Danos, Vincent and Herlihy, Maurice and Potop-Butucaru, Maria and Prat, Julien and Tucci-Piergiovanni, Sara},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2019.6},
  URN =		{urn:nbn:de:0030-drops-119705},
  doi =		{10.4230/OASIcs.Tokenomics.2019.6},
  annote =	{Keywords: price oracle, Ethereum, blockchain}
}
Document
Dichotomic Selection on Words: A Probabilistic Analysis

Authors: Ali Akhavi, Julien Clément, Dimitri Darthenay, Loïck Lhote, and Brigitte Vallée

Published in: LIPIcs, Volume 128, 30th Annual Symposium on Combinatorial Pattern Matching (CPM 2019)


Abstract
The paper studies the behaviour of selection algorithms that are based on dichotomy principles. Given as input an ordered list L and a query element x not in L, they return the interval of the list L to which x belongs. We focus here on the case of words, where dichotomy principles lead to a selection algorithm designed by Crochemore, Hancart and Lecroq, which appears to be "quasi-optimal". We perform a probabilistic analysis of this algorithm that exhibits its quasi-optimality on average.
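
The selection task itself fits in a few lines of Python; the sketch below is a plain dichotomy on whole-word comparisons, not the quasi-optimal symbol-level algorithm of Crochemore, Hancart and Lecroq that is analyzed here.

from bisect import bisect_left

def dichotomic_select(L, x):
    """Return the pair of consecutive entries of the sorted list L whose
    interval contains x (None acts as a sentinel at either end)."""
    i = bisect_left(L, x)                     # number of words of L smaller than x
    left = L[i - 1] if i > 0 else None
    right = L[i] if i < len(L) else None
    return left, right

L = ["banana", "cherry", "grape", "mango", "peach"]
print(dichotomic_select(L, "kiwi"))           # ('grape', 'mango')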

Cite as

Ali Akhavi, Julien Clément, Dimitri Darthenay, Loïck Lhote, and Brigitte Vallée. Dichotomic Selection on Words: A Probabilistic Analysis. In 30th Annual Symposium on Combinatorial Pattern Matching (CPM 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 128, pp. 19:1-19:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


@InProceedings{akhavi_et_al:LIPIcs.CPM.2019.19,
  author =	{Akhavi, Ali and Cl\'{e}ment, Julien and Darthenay, Dimitri and Lhote, Lo\"{i}ck and Vall\'{e}e, Brigitte},
  title =	{{Dichotomic Selection on Words: A Probabilistic Analysis}},
  booktitle =	{30th Annual Symposium on Combinatorial Pattern Matching (CPM 2019)},
  pages =	{19:1--19:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-103-0},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{128},
  editor =	{Pisanti, Nadia and P. Pissis, Solon},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.CPM.2019.19},
  URN =		{urn:nbn:de:0030-drops-104903},
  doi =		{10.4230/LIPIcs.CPM.2019.19},
  annote =	{Keywords: dichotomic selection, text algorithms, analysis of algorithms, average case analysis of algorithms, trie, suffix array, lcp-array, information theory, numeration process, sources, entropy, coincidence, analytic combinatorics, depoissonization techniques}
}
Document
Asymptotic Distribution of Parameters in Random Maps

Authors: Olivier Bodini, Julien Courtiel, Sergey Dovgal, and Hsien-Kuei Hwang

Published in: LIPIcs, Volume 110, 29th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2018)


Abstract
We consider random rooted maps without regard to their genus, with fixed large number of edges, and address the problem of limiting distributions for six different parameters: vertices, leaves, loops, root edges, root isthmus, and root vertex degree. Each of these leads to a different limiting distribution, varying from (discrete) geometric and Poisson distributions to different continuous ones: Beta, normal, uniform, and an unusual distribution whose moments are characterised by a recursive triangular array.

Cite as

Olivier Bodini, Julien Courtiel, Sergey Dovgal, and Hsien-Kuei Hwang. Asymptotic Distribution of Parameters in Random Maps. In 29th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 110, pp. 13:1-13:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


@InProceedings{bodini_et_al:LIPIcs.AofA.2018.13,
  author =	{Bodini, Olivier and Courtiel, Julien and Dovgal, Sergey and Hwang, Hsien-Kuei},
  title =	{{Asymptotic Distribution of Parameters in Random Maps}},
  booktitle =	{29th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2018)},
  pages =	{13:1--13:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-078-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{110},
  editor =	{Fill, James Allen and Ward, Mark Daniel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.AofA.2018.13},
  URN =		{urn:nbn:de:0030-drops-89069},
  doi =		{10.4230/LIPIcs.AofA.2018.13},
  annote =	{Keywords: Random maps, Analytic combinatorics, Rooted Maps, Beta law, Limit laws, Patterns, Generating functions, Riccati equation}
}
Document
The Operator Approach to Entropy Games

Authors: Marianne Akian, Stéphane Gaubert, Julien Grand-Clément, and Jérémie Guillaud

Published in: LIPIcs, Volume 66, 34th Symposium on Theoretical Aspects of Computer Science (STACS 2017)


Abstract
Entropy games and matrix multiplication games have been recently introduced by Asarin et al. They model the situation in which one player (Despot) wishes to minimize the growth rate of a matrix product, whereas the other player (Tribune) wishes to maximize it. We develop an operator approach to entropy games. This allows us to show that entropy games can be cast as stochastic mean payoff games in which some action spaces are simplices and payments are given by a relative entropy (Kullback-Leibler divergence). In this way, we show that entropy games with a fixed number of states belonging to Despot can be solved in polynomial time. This approach also allows us to solve these games by a policy iteration algorithm, which we compare with the spectral simplex algorithm developed by Protasov.
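
To fix ideas about the quantity the players optimize, here is a minimal Python sketch for the degenerate case in which each player has a single available action: the matrix product then repeats one fixed nonnegative matrix, and its growth rate (the Perron eigenvalue) can be estimated by power iteration. This shows only the objective value, not the paper's operator approach or its policy iteration algorithm.

def growth_rate(A, iterations=200):
    """Estimate the growth rate (Perron eigenvalue) of repeated products of
    the nonnegative matrix A by power iteration with max-norm scaling."""
    n = len(A)
    x = [1.0] * n
    rate = 0.0
    for _ in range(iterations):
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        rate = max(y)                # converges to the Perron eigenvalue
        x = [v / rate for v in y]    # renormalize to avoid overflow
    return rate

A = [[1.0, 1.0],
     [1.0, 0.0]]
print(growth_rate(A))                # about 1.618..., the golden ratio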

Cite as

Marianne Akian, Stéphane Gaubert, Julien Grand-Clément, and Jérémie Guillaud. The Operator Approach to Entropy Games. In 34th Symposium on Theoretical Aspects of Computer Science (STACS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 66, pp. 6:1-6:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


@InProceedings{akian_et_al:LIPIcs.STACS.2017.6,
  author =	{Akian, Marianne and Gaubert, St\'{e}phane and Grand-Cl\'{e}ment, Julien and Guillaud, J\'{e}r\'{e}mie},
  title =	{{The Operator Approach to Entropy Games}},
  booktitle =	{34th Symposium on Theoretical Aspects of Computer Science (STACS 2017)},
  pages =	{6:1--6:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-028-6},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{66},
  editor =	{Vollmer, Heribert and Vall\'{e}e, Brigitte},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2017.6},
  URN =		{urn:nbn:de:0030-drops-70260},
  doi =		{10.4230/LIPIcs.STACS.2017.6},
  annote =	{Keywords: Stochastic games, Shapley operators, policy iteration, Perron eigenvalues, Risk sensitive control}
}
Document
Context-sensitive Parametric WCET Analysis

Authors: Clément Ballabriga, Julien Forget, and Giuseppe Lipari

Published in: OASIcs, Volume 47, 15th International Workshop on Worst-Case Execution Time Analysis (WCET 2015)


Abstract
In this paper, we propose a WCET analysis that focuses on two aspects. First, it supports context-sensitive hardware and software timing effects, meaning that it is sensitive to the execution history of the program and thus can account for effects like cache persistence, triangular loops, etc. Second, it supports the introduction of parameters in both the software model (e.g. parametric loop bounds) and the hardware model (e.g. number of cache misses). WCET computation by static analysis is traditionally handled by the Implicit Path Enumeration Technique (IPET), using an Integer Linear Program (ILP) that is difficult to resolve parametrically. We suggest an alternative tree-based approach. We define a context-sensitive CFG format to express these effects, and we provide an efficient method to process it, giving a parametric WCET formula. Experimental results show that this new method is significantly faster and more accurate than existing parametric approaches.
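
The tree-based flavour of the approach can be conveyed with a toy Python sketch; the constructors Block, Seq, Alt and Loop below are hypothetical and ignore the context-sensitive effects (cache persistence, etc.) handled by the paper. A WCET formula is composed bottom-up over the program structure, and the loop bound remains a parameter until it is instantiated.

def Block(cost):
    return lambda params: cost                 # a basic block with a fixed cost

def Seq(*children):
    return lambda params: sum(c(params) for c in children)

def Alt(then, orelse):
    return lambda params: max(then(params), orelse(params))

def Loop(bound, body):
    return lambda params: params[bound] * body(params)   # bound is a named parameter

# 5 cycles of setup, then a loop bounded by the parameter N whose body is a
# conditional (12 or 7 cycles) followed by 3 cycles of loop overhead.
program = Seq(Block(5), Loop("N", Seq(Alt(Block(12), Block(7)), Block(3))))

print(program({"N": 10}))    # 5 + 10 * (max(12, 7) + 3) = 155
print(program({"N": 100}))   # 1505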

Cite as

Clément Ballabriga, Julien Forget, and Giuseppe Lipari. Context-sensitive Parametric WCET Analysis. In 15th International Workshop on Worst-Case Execution Time Analysis (WCET 2015). Open Access Series in Informatics (OASIcs), Volume 47, pp. 55-64, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


@InProceedings{ballabriga_et_al:OASIcs.WCET.2015.55,
  author =	{Ballabriga, Cl\'{e}ment and Forget, Julien and Lipari, Giuseppe},
  title =	{{Context-sensitive Parametric WCET Analysis}},
  booktitle =	{15th International Workshop on Worst-Case Execution Time Analysis (WCET 2015)},
  pages =	{55--64},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-95-8},
  ISSN =	{2190-6807},
  year =	{2015},
  volume =	{47},
  editor =	{Cazorla, Francisco J.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2015.55},
  URN =		{urn:nbn:de:0030-drops-52569},
  doi =		{10.4230/OASIcs.WCET.2015.55},
  annote =	{Keywords: Parametric, WCET, Real-time, Static analysis}
}
Document
A general framework for the realistic analysis of sorting and searching algorithms. Application to some popular algorithms

Authors: Julien Clément, Thu Hien Nguyen Thi, and Brigitte Vallée

Published in: LIPIcs, Volume 20, 30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013)


Abstract
We describe a general framework for realistic analysis of sorting and searching algorithms, and we apply it to the average-case analysis of five basic algorithms: three sorting algorithms (QuickSort, InsertionSort, BubbleSort) and two selection algorithms (QuickMin and SelectionMin). Usually, the analysis deals with the mean number of key comparisons, but, here, we view keys as words produced by the same source, which are compared via their symbols in the lexicographic order. The "realistic" cost of the algorithm is now the total number of symbol comparisons performed by the algorithm, and, in this context, the average-case analysis aims to provide estimates for the mean number of symbol comparisons used by the algorithm. For sorting algorithms, and with respect to key comparisons, the average-case complexity of QuickSort is asymptotic to 2n log n, InsertionSort to n^2/4 and BubbleSort to n^2/2. With respect to symbol comparisons, we prove that their average-case complexity becomes Theta(n log^2 n), Theta(n^2), Theta(n^2 log n). For selection algorithms, and with respect to key comparisons, the average-case complexity of QuickMin is asymptotic to 2n and that of SelectionMin to n - 1. With respect to symbol comparisons, we prove that their average-case complexity remains Theta(n). In these five cases, we describe the dominant constants which exhibit the probabilistic behaviour of the source (namely, entropy, and various notions of coincidence) with respect to the algorithm.
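
The symbol-comparison cost model is easy to instrument; this Python sketch runs InsertionSort on words and tallies both the number of key comparisons and the number of symbol comparisons (one per symbol inspected, up to and including the first mismatch), illustrating the two measures contrasted above.

def insertion_sort_costs(words):
    """Sort words and return (sorted list, key comparisons, symbol comparisons)."""
    key_cmps = symbol_cmps = 0

    def less(u, v):
        nonlocal key_cmps, symbol_cmps
        key_cmps += 1
        for a, b in zip(u, v):
            symbol_cmps += 1               # one symbol comparison
            if a != b:
                return a < b
        return len(u) < len(v)             # one word is a prefix of the other

    a = list(words)
    for i in range(1, len(a)):
        j = i
        while j > 0 and less(a[j], a[j - 1]):
            a[j], a[j - 1] = a[j - 1], a[j]
            j -= 1
    return a, key_cmps, symbol_cmps

print(insertion_sort_costs(["baba", "abab", "abba", "aabb"]))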

Cite as

Julien Clément, Thu Hien Nguyen Thi, and Brigitte Vallée. A general framework for the realistic analysis of sorting and searching algorithms. Application to some popular algorithms. In 30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 20, pp. 598-609, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


@InProceedings{clement_et_al:LIPIcs.STACS.2013.598,
  author =	{Cl\'{e}ment, Julien and Nguyen Thi, Thu Hien and Vall\'{e}e, Brigitte},
  title =	{{A general framework for the realistic analysis of sorting and searching algorithms. Application to some popular algorithms}},
  booktitle =	{30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013)},
  pages =	{598--609},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-50-7},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{20},
  editor =	{Portier, Natacha and Wilke, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2013.598},
  URN =		{urn:nbn:de:0030-drops-39681},
  doi =		{10.4230/LIPIcs.STACS.2013.598},
  annote =	{Keywords: Probabilistic analysis of algorithms, Sorting and searching algorithms, Pattern matching, Permutations, Information theory, Rice formula, Asymptotic e}
}
Document
Reverse Engineering Prefix Tables

Authors: Julien Clement, Maxime Crochemore, and Giuseppina Rindone

Published in: LIPIcs, Volume 3, 26th International Symposium on Theoretical Aspects of Computer Science (2009)


Abstract
The Prefix table of a string reports, for each position, the maximal length of the prefixes of the string starting at that position. The Prefix table and its dual Suffix table are basic tools used in the design of the most efficient string-matching and pattern extraction algorithms. These tables can be computed in linear time independently of the alphabet size. We give an algorithmic characterisation of a Prefix table (it can be adapted to a Suffix table). Namely, the algorithm tests if an integer table of size n is the Prefix table of some word and, if successful, it constructs the lexicographically smallest string having it as a Prefix table. We show that the alphabet of the string can be bounded to log_2 n letters. The overall algorithm runs in O(n) time.
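
For reference, the object being characterised can be computed directly; this Python sketch builds the Prefix table of a word by a straightforward quadratic scan (linear-time algorithms exist, as noted above; the paper addresses the converse problem of deciding whether a given integer table is the Prefix table of some word).

def prefix_table(w):
    """pref[i] = length of the longest prefix of w starting at position i."""
    n = len(w)
    pref = [0] * n
    for i in range(n):
        k = 0
        while i + k < n and w[k] == w[i + k]:
            k += 1
        pref[i] = k
    return pref

print(prefix_table("abaababa"))   # [8, 0, 1, 3, 0, 3, 0, 1]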

Cite as

Julien Clement, Maxime Crochemore, and Giuseppina Rindone. Reverse Engineering Prefix Tables. In 26th International Symposium on Theoretical Aspects of Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 3, pp. 289-300, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)


@InProceedings{clement_et_al:LIPIcs.STACS.2009.1825,
  author =	{Clement, Julien and Crochemore, Maxime and Rindone, Giuseppina},
  title =	{{Reverse Engineering Prefix Tables}},
  booktitle =	{26th International Symposium on Theoretical Aspects of Computer Science},
  pages =	{289--300},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-09-5},
  ISSN =	{1868-8969},
  year =	{2009},
  volume =	{3},
  editor =	{Albers, Susanne and Marion, Jean-Yves},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2009.1825},
  URN =		{urn:nbn:de:0030-drops-18258},
  doi =		{10.4230/LIPIcs.STACS.2009.1825},
  annote =	{Keywords: Design and analysis of algorithms, Algorithms on strings, Pattern matching, String matching, Combinatorics on words, Prefix table, Suffix table}
}
