8 Search Results for "Clement, Julien"


Document
An Iterative Approach for Counting Reduced Ordered Binary Decision Diagrams

Authors: Julien Clément and Antoine Genitrini

Published in: LIPIcs, Volume 272, 48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023)


Abstract
For three decades, binary decision diagrams, a data structure efficiently representing Boolean functions, have been widely used in many distinct contexts like model verification, machine learning, cryptography and also resolution of combinatorial problems. The most famous variant, called reduced ordered binary decision diagram (robdd for short), can be viewed as the result of a compaction procedure on the full decision tree. A useful property is that once an order over the Boolean variables is fixed, each Boolean function is represented by exactly one robdd. In this paper we aim at computing the exact distribution of the Boolean functions in k variables according to the robdd size, where the robdd size is equal to the number of decision nodes of the underlying directed acyclic graph (dag) structure. Recall that the number of Boolean functions with k variables is equal to 2^{2^k}, which grows doubly exponentially with respect to the number of variables. The maximal size of a robdd with k variables is M_k ≈ 2^k / k. Apart from the natural combinatorial explosion observed, another difficulty for computing the distribution according to size is to take into account dependencies within the dag structure of robdds. In this paper, we develop the first polynomial algorithm to derive the distribution of Boolean functions over k variables with respect to robdd size denoted by n. The algorithm computes the (enumerative) generating function of robdds with k variables up to size n. It performs O(k n^4) arithmetical operations on integers and necessitates storing O((k+n) n^2) integers with bit length O(n log n). Our new approach relies on a decomposition of robdds layer by layer and on an inclusion-exclusion argument.
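For orientation, here is a minimal Python sketch (not the paper's counting algorithm, which never enumerates functions) that compacts the full decision tree of a single Boolean function, given as a truth table, into its robdd and reports its size, i.e. its number of decision nodes; the function names and the truth-table encoding are illustrative choices.

def robdd_size(truth_table, k):
    """Number of decision nodes of the ROBDD of a Boolean function.

    truth_table: tuple of 2^k truth values, entry i giving f(x_1..x_k)
    where x_1 is the most significant bit of i.  This is a brute-force
    compaction of the full decision tree (exponential in k), useful only
    for small k; the paper's algorithm counts robdds without enumerating
    functions at all.
    """
    nodes = {}                       # (var, low_id, high_id) -> node id
    TERM_0, TERM_1 = 0, 1            # terminal nodes are not decision nodes

    def build(var, sub):             # sub has length 2^(k - var)
        if len(sub) == 1:
            return TERM_1 if sub[0] else TERM_0
        half = len(sub) // 2
        low, high = build(var + 1, sub[:half]), build(var + 1, sub[half:])
        if low == high:              # elimination rule: skip a redundant test
            return low
        key = (var, low, high)
        if key not in nodes:         # merging rule: share identical subgraphs
            nodes[key] = len(nodes) + 2
        return nodes[key]

    build(0, tuple(truth_table))
    return len(nodes)

# Example: f(x1, x2) = x1 XOR x2 reaches the maximal robdd size 3 for k = 2.
print(robdd_size((0, 1, 1, 0), 2))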

Cite as

Julien Clément and Antoine Genitrini. An Iterative Approach for Counting Reduced Ordered Binary Decision Diagrams. In 48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 272, pp. 36:1-36:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{clement_et_al:LIPIcs.MFCS.2023.36,
  author =	{Cl\'{e}ment, Julien and Genitrini, Antoine},
  title =	{{An Iterative Approach for Counting Reduced Ordered Binary Decision Diagrams}},
  booktitle =	{48th International Symposium on Mathematical Foundations of Computer Science (MFCS 2023)},
  pages =	{36:1--36:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-292-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{272},
  editor =	{Leroux, J\'{e}r\^{o}me and Lombardy, Sylvain and Peleg, David},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2023.36},
  URN =		{urn:nbn:de:0030-drops-185702},
  doi =		{10.4230/LIPIcs.MFCS.2023.36},
  annote =	{Keywords: Boolean Function, Reduced Ordered Binary Decision Diagram (\{robdd\}), Enumerative Combinatorics, Directed Acyclic Graph}
}
Document
A Smart Contract Oracle for Approximating Real-World, Real Number Values

Authors: William George and Clément Lesaege

Published in: OASIcs, Volume 71, International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2019)


Abstract
A key challenge of smart contract systems is the fact that many useful contracts require access to information that does not natively live on the blockchain. While miners can verify the value of a hash or the validity of a digital signature, they cannot determine who won an election, whether there is a flood in Paris, or even the price of ether in US dollars, even though this information might be necessary to execute prediction market, insurance, or financial contracts respectively. A number of promising projects and research developments have provided a better understanding of how one might construct a decentralized, binary oracle - namely an oracle that can respond with one of two possibilities, typically "yes" or "no", while not requiring the interaction of a trusted third party. In this work, we extend these ideas to construct a general-purpose, decentralized oracle that can estimate the value of a real-world quantity lying in a dense totally ordered set, such as R. In particular, this proposal can be used to estimate real number valued quantities, such as those required for a price oracle. We establish a number of desirable properties of this proposal. In particular, we will see that the precision of the output is tunable to users' needs.
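The abstract does not spell out the mechanism; purely as an illustration of how a real-valued estimate can in principle be reduced to binary questions with tunable precision, here is a bisection sketch in Python. The ask_binary_oracle interface is hypothetical and this is not the protocol proposed in the paper.

def estimate_value(ask_binary_oracle, lo, hi, rounds):
    """Narrow [lo, hi] down to an estimate of an off-chain real value.

    ask_binary_oracle(question) stands for a hypothetical decentralized
    binary oracle returning True/False; each round halves the interval,
    so the final precision is (hi - lo) / 2**rounds, tunable by the caller.
    """
    for _ in range(rounds):
        mid = (lo + hi) / 2
        if ask_binary_oracle(f"Is the quantity greater than {mid}?"):
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Toy usage with a locally simulated "oracle" for a true value of 1234.56.
true_value = 1234.56
print(estimate_value(lambda q: true_value > float(q.split()[-1].rstrip('?')),
                     0.0, 10000.0, rounds=20))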

Cite as

William George and Clément Lesaege. A Smart Contract Oracle for Approximating Real-World, Real Number Values. In International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2019). Open Access Series in Informatics (OASIcs), Volume 71, pp. 6:1-6:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{george_et_al:OASIcs.Tokenomics.2019.6,
  author =	{George, William and Lesaege, Cl\'{e}ment},
  title =	{{A Smart Contract Oracle for Approximating Real-World, Real Number Values}},
  booktitle =	{International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2019)},
  pages =	{6:1--6:15},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-108-5},
  ISSN =	{2190-6807},
  year =	{2020},
  volume =	{71},
  editor =	{Danos, Vincent and Herlihy, Maurice and Potop-Butucaru, Maria and Prat, Julien and Tucci-Piergiovanni, Sara},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2019.6},
  URN =		{urn:nbn:de:0030-drops-119705},
  doi =		{10.4230/OASIcs.Tokenomics.2019.6},
  annote =	{Keywords: price oracle, Ethereum, blockchain}
}
Document
Dichotomic Selection on Words: A Probabilistic Analysis

Authors: Ali Akhavi, Julien Clément, Dimitri Darthenay, Loïck Lhote, and Brigitte Vallée

Published in: LIPIcs, Volume 128, 30th Annual Symposium on Combinatorial Pattern Matching (CPM 2019)


Abstract
The paper studies the behaviour of selection algorithms that are based on dichotomy principles. On an input formed by an ordered list L and a searched element x not in L, they return the interval of the list L to which the element x belongs. We focus here on the case of words, where dichotomy principles lead to a selection algorithm designed by Crochemore, Hancart and Lecroq, which appears to be "quasi-optimal". We perform a probabilistic analysis of this algorithm that exhibits its quasi-optimality on average.
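As a baseline for the algorithm being analysed, here is a naive interval-returning binary search on words in Python; the Crochemore-Hancart-Lecroq algorithm studied in the paper is more refined, roughly because it avoids re-reading symbols that earlier comparisons have already examined.

def dichotomic_selection(words, x):
    """Return (i, i + 1) such that words[i] < x < words[i + 1].

    words is sorted and conceptually padded with -infinity / +infinity
    sentinels at indices -1 and len(words); x is assumed not in words.
    This naive version recompares words from their first symbol at every
    step, which is what makes the symbol-comparison cost interesting.
    """
    lo, hi = -1, len(words)          # invariant: words[lo] < x < words[hi]
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if words[mid] < x:           # lexicographic, symbol by symbol
            lo = mid
        else:
            hi = mid
    return lo, hi

# Example: x = "banana" falls between "apple" (index 0) and "cherry" (index 1).
print(dichotomic_selection(["apple", "cherry", "grape", "peach"], "banana"))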

Cite as

Ali Akhavi, Julien Clément, Dimitri Darthenay, Loïck Lhote, and Brigitte Vallée. Dichotomic Selection on Words: A Probabilistic Analysis. In 30th Annual Symposium on Combinatorial Pattern Matching (CPM 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 128, pp. 19:1-19:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{akhavi_et_al:LIPIcs.CPM.2019.19,
  author =	{Akhavi, Ali and Cl\'{e}ment, Julien and Darthenay, Dimitri and Lhote, Lo\"{i}ck and Vall\'{e}e, Brigitte},
  title =	{{Dichotomic Selection on Words: A Probabilistic Analysis}},
  booktitle =	{30th Annual Symposium on Combinatorial Pattern Matching (CPM 2019)},
  pages =	{19:1--19:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-103-0},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{128},
  editor =	{Pisanti, Nadia and P. Pissis, Solon},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CPM.2019.19},
  URN =		{urn:nbn:de:0030-drops-104903},
  doi =		{10.4230/LIPIcs.CPM.2019.19},
  annote =	{Keywords: dichotomic selection, text algorithms, analysis of algorithms, average case analysis of algorithms, trie, suffix array, lcp-array, information theory, numeration process, sources, entropy, coincidence, analytic combinatorics, depoissonization techniques}
}
Document
Asymptotic Distribution of Parameters in Random Maps

Authors: Olivier Bodini, Julien Courtiel, Sergey Dovgal, and Hsien-Kuei Hwang

Published in: LIPIcs, Volume 110, 29th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2018)


Abstract
We consider random rooted maps without regard to their genus, with a fixed large number of edges, and address the problem of limiting distributions for six different parameters: vertices, leaves, loops, root edges, root isthmus, and root vertex degree. Each of these leads to a different limiting distribution, varying from (discrete) geometric and Poisson distributions to different continuous ones: Beta, normal, uniform, and an unusual distribution whose moments are characterised by a recursive triangular array.

Cite as

Olivier Bodini, Julien Courtiel, Sergey Dovgal, and Hsien-Kuei Hwang. Asymptotic Distribution of Parameters in Random Maps. In 29th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 110, pp. 13:1-13:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{bodini_et_al:LIPIcs.AofA.2018.13,
  author =	{Bodini, Olivier and Courtiel, Julien and Dovgal, Sergey and Hwang, Hsien-Kuei},
  title =	{{Asymptotic Distribution of Parameters in Random Maps}},
  booktitle =	{29th International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2018)},
  pages =	{13:1--13:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-078-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{110},
  editor =	{Fill, James Allen and Ward, Mark Daniel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.AofA.2018.13},
  URN =		{urn:nbn:de:0030-drops-89069},
  doi =		{10.4230/LIPIcs.AofA.2018.13},
  annote =	{Keywords: Random maps, Analytic combinatorics, Rooted Maps, Beta law, Limit laws, Patterns, Generating functions, Riccati equation}
}
Document
The Operator Approach to Entropy Games

Authors: Marianne Akian, Stéphane Gaubert, Julien Grand-Clément, and Jérémie Guillaud

Published in: LIPIcs, Volume 66, 34th Symposium on Theoretical Aspects of Computer Science (STACS 2017)


Abstract
Entropy games and matrix multiplication games have been recently introduced by Asarin et al. They model the situation in which one player (Despot) wishes to minimize the growth rate of a matrix product, whereas the other player (Tribune) wishes to maximize it. We develop an operator approach to entropy games. This allows us to show that entropy games can be cast as stochastic mean payoff games in which some action spaces are simplices and payments are given by a relative entropy (Kullback-Leibler divergence). In this way, we show that entropy games with a fixed number of states belonging to Despot can be solved in polynomial time. This approach also allows us to solve these games by a policy iteration algorithm, which we compare with the spectral simplex algorithm developed by Protasov.
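As a toy illustration of the reduction described (every state owned by Despot, Tribune choosing a distribution over successor states and paying a relative entropy), the inner maximisation over the simplex has the closed form max_p (<p, v> - KL(p||q)) = log Σ_j q_j e^{v_j}, so a Shapley-like operator can be iterated directly. The Python sketch below is a brute-force value iteration under these assumptions, not the paper's policy iteration algorithm; the weight rows are made up.

import math

def shapley_operator(v, despot_actions):
    """One step of a Shapley-like operator for a toy entropy game.

    despot_actions[s] is a list of nonnegative weight rows available to
    Despot in state s; Despot minimises, and the maximising Tribune's
    KL-penalised choice over the simplex reduces to log-sum-exp.
    """
    return [min(math.log(sum(q[j] * math.exp(v[j]) for j in range(len(v))))
                for q in rows)
            for rows in despot_actions]

def value_iteration(despot_actions, iters=500):
    """Approximate the value (log growth rate) as the escape rate of T.

    The operator is additively homogeneous, so we renormalise each step
    to keep the exponentials bounded and accumulate the shift; the mean
    shift approximates the value seen from state 0.
    """
    v = [0.0] * len(despot_actions)
    total = 0.0
    for _ in range(iters):
        v = shapley_operator(v, despot_actions)
        shift = v[0]
        v = [x - shift for x in v]
        total += shift
    return total / iters

# Two-state toy: in each state Despot picks one of two weight rows.
despot_actions = [
    [[2.0, 1.0], [0.5, 3.0]],   # rows available to Despot in state 0
    [[1.0, 1.0], [4.0, 0.5]],   # rows available to Despot in state 1
]
print("approximate value (log growth rate):", value_iteration(despot_actions))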

Cite as

Marianne Akian, Stéphane Gaubert, Julien Grand-Clément, and Jérémie Guillaud. The Operator Approach to Entropy Games. In 34th Symposium on Theoretical Aspects of Computer Science (STACS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 66, pp. 6:1-6:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{akian_et_al:LIPIcs.STACS.2017.6,
  author =	{Akian, Marianne and Gaubert, St\'{e}phane and Grand-Cl\'{e}ment, Julien and Guillaud, J\'{e}r\'{e}mie},
  title =	{{The Operator Approach to Entropy Games}},
  booktitle =	{34th Symposium on Theoretical Aspects of Computer Science (STACS 2017)},
  pages =	{6:1--6:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-028-6},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{66},
  editor =	{Vollmer, Heribert and Vall\'{e}e, Brigitte},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2017.6},
  URN =		{urn:nbn:de:0030-drops-70260},
  doi =		{10.4230/LIPIcs.STACS.2017.6},
  annote =	{Keywords: Stochastic games, Shapley operators, policy iteration, Perron eigenvalues, Risk sensitive control}
}
Document
Context-sensitive Parametric WCET Analysis

Authors: Clément Ballabriga, Julien Forget, and Giuseppe Lipari

Published in: OASIcs, Volume 47, 15th International Workshop on Worst-Case Execution Time Analysis (WCET 2015)


Abstract
In this paper, we propose a WCET analysis that focuses on two aspects. First, it supports context-sensitive hardware and software timing effects, meaning that it is sensitive to the execution history of the program and thus can account for effects like cache persistence, triangular loops, etc. Second, it supports the introduction of parameters in both the software model (e.g. parametric loop bounds) and the hardware model (e.g. number of cache misses). WCET computation by static analysis is traditionally handled by the Implicit Path Enumeration Technique (IPET), using an Integer Linear Program (ILP) that is difficult to resolve parametrically. We suggest an alternative tree-based approach. We define a context-sensitive CFG format to express these effects, and we provide an efficient method to process it, giving a parametric WCET formula. Experimental results show that this new method is significantly faster and more accurate than existing parametric approaches.
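As a rough sketch of the tree-based idea (a hypothetical miniature, not the paper's context-sensitive CFG format and without the hardware effects it models), here is a bottom-up evaluation in Python that produces a symbolic WCET formula with parametric loop bounds.

def wcet(node):
    """Build a symbolic WCET formula for a toy program tree.

    Nodes are tuples:
      ('block', cost)                  basic block with constant cost
      ('seq', child1, child2, ...)     sequential composition (sum)
      ('alt', then_tree, else_tree)    conditional (max of the branches)
      ('loop', bound, body_tree)       loop; bound may be a parameter name
    The result is a string formula; substituting parameter values is left
    to the caller.
    """
    kind = node[0]
    if kind == 'block':
        return str(node[1])
    if kind == 'seq':
        return ' + '.join(wcet(child) for child in node[1:])
    if kind == 'alt':
        return f"max({wcet(node[1])}, {wcet(node[2])})"
    if kind == 'loop':
        return f"{node[1]}*({wcet(node[2])})"
    raise ValueError(f"unknown node kind: {kind}")

# A loop with parametric bound N around a conditional, then a final block.
program = ('seq',
           ('loop', 'N', ('alt', ('block', 12), ('block', 7))),
           ('block', 3))
print(wcet(program))   # N*(max(12, 7)) + 3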

Cite as

Clément Ballabriga, Julien Forget, and Giuseppe Lipari. Context-sensitive Parametric WCET Analysis. In 15th International Workshop on Worst-Case Execution Time Analysis (WCET 2015). Open Access Series in Informatics (OASIcs), Volume 47, pp. 55-64, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


BibTeX

@InProceedings{ballabriga_et_al:OASIcs.WCET.2015.55,
  author =	{Ballabriga, Cl\'{e}ment and Forget, Julien and Lipari, Giuseppe},
  title =	{{Context-sensitive Parametric WCET Analysis}},
  booktitle =	{15th International Workshop on Worst-Case Execution Time Analysis (WCET 2015)},
  pages =	{55--64},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-939897-95-8},
  ISSN =	{2190-6807},
  year =	{2015},
  volume =	{47},
  editor =	{Cazorla, Francisco J.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.WCET.2015.55},
  URN =		{urn:nbn:de:0030-drops-52569},
  doi =		{10.4230/OASIcs.WCET.2015.55},
  annote =	{Keywords: Parametric, WCET, Real-time, Static analysis}
}
Document
A general framework for the realistic analysis of sorting and searching algorithms. Application to some popular algorithms

Authors: Julien Clément, Thu Hien Nguyen Thi, and Brigitte Vallée

Published in: LIPIcs, Volume 20, 30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013)


Abstract
We describe a general framework for the realistic analysis of sorting and searching algorithms, and we apply it to the average-case analysis of five basic algorithms: three sorting algorithms (QuickSort, InsertionSort, BubbleSort) and two selection algorithms (QuickMin and SelectionMin). Usually, the analysis deals with the mean number of key comparisons, but, here, we view keys as words produced by the same source, which are compared via their symbols in lexicographic order. The "realistic" cost of the algorithm is now the total number of symbol comparisons performed by the algorithm, and, in this context, the average-case analysis aims to provide estimates for the mean number of symbol comparisons used by the algorithm. For sorting algorithms, and with respect to key comparisons, the average-case complexity of QuickSort is asymptotic to 2n log n, of InsertionSort to n^2/4 and of BubbleSort to n^2/2. With respect to symbol comparisons, we prove that their average-case complexity becomes Theta(n log^2 n), Theta(n^2) and Theta(n^2 log n), respectively. For selection algorithms, and with respect to key comparisons, the average-case complexity of QuickMin is asymptotic to 2n and that of SelectionMin to n - 1. With respect to symbol comparisons, we prove that their average-case complexity remains Theta(n). In these five cases, we describe the dominant constants which exhibit the probabilistic behaviour of the source (namely, entropy, and various notions of coincidence) with respect to the algorithm.
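To make the distinction between key comparisons and symbol comparisons concrete, here is a small illustrative Python sketch (not part of the paper) that sorts words drawn from a memoryless binary source with a plain QuickSort while tallying every symbol comparison.

import random

def lex_compare(u, v, counter):
    """Lexicographic comparison that tallies individual symbol comparisons."""
    for a, b in zip(u, v):
        counter[0] += 1
        if a != b:
            return -1 if a < b else 1
    return (len(u) > len(v)) - (len(u) < len(v))

def quicksort(words, counter):
    """Plain QuickSort counting the symbol comparisons it performs."""
    if len(words) <= 1:
        return words
    pivot, rest = words[0], words[1:]
    left = [w for w in rest if lex_compare(w, pivot, counter) < 0]
    right = [w for w in rest if lex_compare(w, pivot, counter) >= 0]
    return quicksort(left, counter) + [pivot] + quicksort(right, counter)

# Words from a memoryless binary source: key comparisons are Theta(n log n)
# on average, while symbol comparisons are Theta(n log^2 n).
random.seed(0)
words = [''.join(random.choice('ab') for _ in range(40)) for _ in range(1000)]
counter = [0]
quicksort(words, counter)
print("symbol comparisons:", counter[0])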

Cite as

Julien Clément, Thu Hien Nguyen Thi, and Brigitte Vallée. A general framework for the realistic analysis of sorting and searching algorithms. Application to some popular algorithms. In 30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 20, pp. 598-609, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@InProceedings{clement_et_al:LIPIcs.STACS.2013.598,
  author =	{Cl\'{e}ment, Julien and Nguyen Thi, Thu Hien and Vall\'{e}e, Brigitte},
  title =	{{A general framework for the realistic analysis of sorting and searching algorithms. Application to some popular algorithms}},
  booktitle =	{30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013)},
  pages =	{598--609},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-50-7},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{20},
  editor =	{Portier, Natacha and Wilke, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2013.598},
  URN =		{urn:nbn:de:0030-drops-39681},
  doi =		{10.4230/LIPIcs.STACS.2013.598},
  annote =	{Keywords: Probabilistic analysis of algorithms, Sorting and searching algorithms, Pattern matching, Permutations, Information theory, Rice formula, Asymptotic e}
}
Document
Reverse Engineering Prefix Tables

Authors: Julien Clement, Maxime Crochemore, and Giuseppina Rindone

Published in: LIPIcs, Volume 3, 26th International Symposium on Theoretical Aspects of Computer Science (2009)


Abstract
The Prefix table of a string reports, for each position, the maximal length of a prefix of the string that also occurs starting at that position. The Prefix table and its dual Suffix table are basic tools used in the design of the most efficient string-matching and pattern extraction algorithms. These tables can be computed in linear time independently of the alphabet size. We give an algorithmic characterisation of a Prefix table (it can be adapted to a Suffix table). Namely, the algorithm tests whether an integer table of size n is the Prefix table of some word and, if successful, it constructs the lexicographically smallest string having it as a Prefix table. We show that the alphabet of the string can be bounded by log_2 n letters. The overall algorithm runs in O(n) time.
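For reference, the forward direction, computing the Prefix table of a given string in linear time, is standard; a minimal Python sketch in the style of the Z-algorithm is given below. The reverse-engineering direction studied in the paper, deciding whether an arbitrary integer table is a Prefix table and building a witness string, is the harder problem.

def prefix_table(s):
    """Prefix table: pref[i] = length of the longest common prefix of s
    and s[i:].  By convention pref[0] = len(s).  Runs in linear time,
    independently of the alphabet size (Z-algorithm style window reuse).
    """
    n = len(s)
    pref = [0] * n
    if n == 0:
        return pref
    pref[0] = n
    l, r = 0, 0                      # rightmost window with s[l:r] == s[:r-l]
    for i in range(1, n):
        if i < r:                    # reuse the value found inside the window
            pref[i] = min(r - i, pref[i - l])
        while i + pref[i] < n and s[pref[i]] == s[i + pref[i]]:
            pref[i] += 1             # extend the match symbol by symbol
        if i + pref[i] > r:
            l, r = i, i + pref[i]
    return pref

print(prefix_table("abaab"))   # [5, 0, 1, 2, 0]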

Cite as

Julien Clement, Maxime Crochemore, and Giuseppina Rindone. Reverse Engineering Prefix Tables. In 26th International Symposium on Theoretical Aspects of Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 3, pp. 289-300, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)


BibTeX

@InProceedings{clement_et_al:LIPIcs.STACS.2009.1825,
  author =	{Clement, Julien and Crochemore, Maxime and Rindone, Giuseppina},
  title =	{{Reverse Engineering Prefix Tables}},
  booktitle =	{26th International Symposium on Theoretical Aspects of Computer Science},
  pages =	{289--300},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-09-5},
  ISSN =	{1868-8969},
  year =	{2009},
  volume =	{3},
  editor =	{Albers, Susanne and Marion, Jean-Yves},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2009.1825},
  URN =		{urn:nbn:de:0030-drops-18258},
  doi =		{10.4230/LIPIcs.STACS.2009.1825},
  annote =	{Keywords: Design and analysis of algorithms, Algorithms on strings, Pattern matching, String matching, Combinatorics on words, Prefix table, Suffix table}
}
