51 Search Results for "Cégielski, Patrick"


Document
Computing Complexity Measures of Degenerate Graphs

Authors: Pål Grønås Drange, Patrick Greaves, Irene Muzi, and Felix Reidl

Published in: LIPIcs, Volume 285, 18th International Symposium on Parameterized and Exact Computation (IPEC 2023)


Abstract
We show that the VC-dimension of a graph can be computed in time n^{⌈log d+1⌉} d^{O(d)}, where d is the degeneracy of the input graph. The core idea of our algorithm is a data structure to efficiently query the number of vertices that see a specific subset of vertices inside of a (small) query set. The construction of this data structure takes time O(d 2^d n); afterwards, queries can be answered efficiently using fast Möbius inversion. This data structure turns out to be useful for a range of tasks, especially for finding bipartite patterns in degenerate graphs, and we outline an efficient algorithm for counting the number of times specific patterns occur in a graph. The largest factor in the running time of this algorithm is O(n^c), where c is a parameter of the pattern we call its left covering number. Concrete applications of this algorithm include counting the number of (non-induced) bicliques in linear time, the number of co-matchings in quadratic time, as well as a constant-factor approximation of the ladder index in linear time. Finally, we supplement our theoretical results with several implementations and run experiments on more than 200 real-world datasets - the largest of which has 8 million edges - where we obtain interesting insights into the VC-dimension of real-world networks.
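
As a rough illustration of the query mechanism sketched above (a hedged C++ sketch under our own encoding; the function name and the bitmask convention are assumptions, not the paper's implementation): once each vertex has been reduced to the bitmask of query-set members it is adjacent to, a superset-sum ("zeta") transform, the companion of the Möbius inversion mentioned in the abstract, converts exact counts into "sees at least S" counts for every S ⊆ Q using O(q · 2^q) additions.

#include <cstdint>
#include <vector>

// count has size 2^q; count[mask] = number of vertices whose neighbourhood inside the
// query set Q (|Q| = q) is exactly the subset encoded by mask. After the transform,
// count[mask] = number of vertices that see at least that subset.
std::vector<long long> vertices_seeing_at_least(std::vector<long long> count, int q) {
    for (int i = 0; i < q; ++i)
        for (std::uint32_t mask = 0; mask < (1u << q); ++mask)
            if (!(mask & (1u << i)))
                count[mask] += count[mask | (1u << i)];   // fold in counts of supersets
    return count;
}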

Cite as

Pål Grønås Drange, Patrick Greaves, Irene Muzi, and Felix Reidl. Computing Complexity Measures of Degenerate Graphs. In 18th International Symposium on Parameterized and Exact Computation (IPEC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 285, pp. 14:1-14:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{drange_et_al:LIPIcs.IPEC.2023.14,
  author =	{Drange, P\r{a}l Gr{\o}n\r{a}s and Greaves, Patrick and Muzi, Irene and Reidl, Felix},
  title =	{{Computing Complexity Measures of Degenerate Graphs}},
  booktitle =	{18th International Symposium on Parameterized and Exact Computation (IPEC 2023)},
  pages =	{14:1--14:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-305-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{285},
  editor =	{Misra, Neeldhara and Wahlstr\"{o}m, Magnus},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2023.14},
  URN =		{urn:nbn:de:0030-drops-194333},
  doi =		{10.4230/LIPIcs.IPEC.2023.14},
  annote =	{Keywords: vc-dimension, datastructure, degeneracy, enumerating}
}
Document
History-Deterministic Vector Addition Systems

Authors: Sougata Bose, David Purser, and Patrick Totzke

Published in: LIPIcs, Volume 279, 34th International Conference on Concurrency Theory (CONCUR 2023)


Abstract
We consider history-determinism, a restricted form of non-determinism, for Vector Addition Systems with States (VASS) when used as acceptors to recognise languages of finite words. History-determinism requires that the non-deterministic choices can be resolved on-the-fly, based on the past and without jeopardising acceptance of any possible continuation of the input word. Our results show that history-deterministic (HD) VASS sit strictly between deterministic and non-deterministic VASS regardless of the number of counters. We compare the relative expressiveness of HD systems, and closure properties of the induced language classes, with coverability and reachability semantics, and with and without ε-labelled transitions. Whereas in dimension 1, inclusion and regularity remain decidable, from dimension two onwards, HD-VASS with suitable resolver strategies are essentially able to simulate 2-counter Minsky machines, leading to several undecidability results: it is undecidable whether a VASS is history-deterministic, or if a language-equivalent history-deterministic VASS exists. Checking language inclusion between history-deterministic 2-VASS is also undecidable.
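
For readers unfamiliar with the model, the following sketch (our own simplified encoding, not the paper's formalism) simulates a 1-dimensional VASS acceptor on a finite word under coverability semantics: the counter must never drop below zero, and the word is accepted if some run ends in an accepting state. The set of configurations tracked in parallel here is precisely the nondeterminism that a history-deterministic resolver would have to collapse into a single run.

#include <set>
#include <string>
#include <utility>
#include <vector>

// A transition reads one letter, adds `update` to the counter, and moves to state `to`.
struct Transition { int from; char letter; int update; int to; };

bool accepts(const std::vector<Transition>& delta, const std::set<int>& accepting,
             int initial_state, const std::string& word) {
    std::set<std::pair<int, long long>> configs = {{initial_state, 0}};  // (state, counter)
    for (char a : word) {
        std::set<std::pair<int, long long>> next;
        for (const auto& [state, counter] : configs)
            for (const auto& t : delta)
                if (t.from == state && t.letter == a && counter + t.update >= 0)
                    next.insert({t.to, counter + t.update});
        configs = std::move(next);
    }
    for (const auto& [state, counter] : configs)
        if (accepting.count(state)) return true;   // coverability: an accepting state reached
    return false;
}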

Cite as

Sougata Bose, David Purser, and Patrick Totzke. History-Deterministic Vector Addition Systems. In 34th International Conference on Concurrency Theory (CONCUR 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 279, pp. 18:1-18:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bose_et_al:LIPIcs.CONCUR.2023.18,
  author =	{Bose, Sougata and Purser, David and Totzke, Patrick},
  title =	{{History-Deterministic Vector Addition Systems}},
  booktitle =	{34th International Conference on Concurrency Theory (CONCUR 2023)},
  pages =	{18:1--18:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-299-0},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{279},
  editor =	{P\'{e}rez, Guillermo A. and Raskin, Jean-Fran\c{c}ois},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CONCUR.2023.18},
  URN =		{urn:nbn:de:0030-drops-190120},
  doi =		{10.4230/LIPIcs.CONCUR.2023.18},
  annote =	{Keywords: Vector Addition Systems, History-determinism, Good-for Games}
}
Document
Quantum Algorithm for Stochastic Optimal Stopping Problems with Applications in Finance

Authors: João F. Doriguello, Alessandro Luongo, Jinge Bao, Patrick Rebentrost, and Miklos Santha

Published in: LIPIcs, Volume 232, 17th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2022)


Abstract
The famous least squares Monte Carlo (LSM) algorithm combines linear least square regression with Monte Carlo simulation to approximately solve problems in stochastic optimal stopping theory. In this work, we propose a quantum LSM based on quantum access to a stochastic process, on quantum circuits for computing the optimal stopping times, and on quantum techniques for Monte Carlo. For this algorithm, we elucidate the intricate interplay of function approximation and quantum algorithms for Monte Carlo. Our algorithm achieves a nearly quadratic speedup in the runtime compared to the LSM algorithm under some mild assumptions. Specifically, our quantum algorithm can be applied to American option pricing and we analyze a case study for the common situation of Brownian motion and geometric Brownian motion processes.
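
For context, the classical (non-quantum) LSM baseline that the quantum algorithm accelerates can be sketched as follows; this is a hedged illustration of the Longstaff-Schwartz method for an American put under geometric Brownian motion, with assumed parameter names and a quadratic regression basis, not code from the paper.

#include <algorithm>
#include <array>
#include <cmath>
#include <random>
#include <vector>

// Solve the 3x3 normal equations A * b = y by Cramer's rule.
std::array<double, 3> solve3(std::array<std::array<double, 3>, 3> A,
                             const std::array<double, 3>& y) {
    auto det = [](const std::array<std::array<double, 3>, 3>& M) {
        return M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
             - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
             + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]);
    };
    const double d = det(A);
    std::array<double, 3> b{};
    for (int c = 0; c < 3; ++c) {
        auto Ac = A;
        for (int r = 0; r < 3; ++r) Ac[r][c] = y[r];
        b[c] = det(Ac) / d;
    }
    return b;
}

double lsm_american_put(double S0, double K, double r, double sigma,
                        double T, int steps, int paths, unsigned seed = 42) {
    std::mt19937_64 gen(seed);
    std::normal_distribution<double> gauss(0.0, 1.0);
    const double dt = T / steps, disc = std::exp(-r * dt);

    // Simulate the stochastic process: one GBM path per row.
    std::vector<std::vector<double>> S(paths, std::vector<double>(steps + 1, S0));
    for (auto& path : S)
        for (int t = 1; t <= steps; ++t)
            path[t] = path[t - 1] * std::exp((r - 0.5 * sigma * sigma) * dt
                                             + sigma * std::sqrt(dt) * gauss(gen));

    // Value of each path if held to maturity, then backward induction.
    std::vector<double> value(paths);
    for (int p = 0; p < paths; ++p) value[p] = std::max(K - S[p][steps], 0.0);

    for (int t = steps - 1; t >= 1; --t) {
        // Regress discounted continuation values on {1, S, S^2}, in-the-money paths only.
        std::array<std::array<double, 3>, 3> A{};
        std::array<double, 3> y{};
        int in_the_money = 0;
        for (int p = 0; p < paths; ++p) {
            const double x = S[p][t];
            if (K - x <= 0.0) continue;
            ++in_the_money;
            const double phi[3] = {1.0, x, x * x};
            const double target = disc * value[p];
            for (int i = 0; i < 3; ++i) {
                y[i] += phi[i] * target;
                for (int j = 0; j < 3; ++j) A[i][j] += phi[i] * phi[j];
            }
        }
        if (in_the_money < 3) {                        // too little data to regress: just discount
            for (double& v : value) v *= disc;
            continue;
        }
        const auto b = solve3(A, y);
        for (int p = 0; p < paths; ++p) {
            const double x = S[p][t], exercise = std::max(K - x, 0.0);
            const double continuation = b[0] + b[1] * x + b[2] * x * x;
            value[p] = (exercise > 0.0 && exercise > continuation)
                           ? exercise           // optimal stopping: exercise now
                           : disc * value[p];   // keep holding, discount one step
        }
    }
    double sum = 0.0;
    for (double v : value) sum += v;
    return disc * sum / paths;                         // discount from t = 1 back to t = 0
}

A call such as lsm_american_put(100, 100, 0.05, 0.2, 1.0, 50, 100000) then gives a Monte Carlo estimate of the option value; the quantum algorithm in the paper targets the Monte Carlo estimation step inside this loop.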

Cite as

João F. Doriguello, Alessandro Luongo, Jinge Bao, Patrick Rebentrost, and Miklos Santha. Quantum Algorithm for Stochastic Optimal Stopping Problems with Applications in Finance. In 17th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 232, pp. 2:1-2:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{doriguello_et_al:LIPIcs.TQC.2022.2,
  author =	{Doriguello, Jo\~{a}o F. and Luongo, Alessandro and Bao, Jinge and Rebentrost, Patrick and Santha, Miklos},
  title =	{{Quantum Algorithm for Stochastic Optimal Stopping Problems with Applications in Finance}},
  booktitle =	{17th Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2022)},
  pages =	{2:1--2:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-237-2},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{232},
  editor =	{Le Gall, Fran\c{c}ois and Morimae, Tomoyuki},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.TQC.2022.2},
  URN =		{urn:nbn:de:0030-drops-165091},
  doi =		{10.4230/LIPIcs.TQC.2022.2},
  annote =	{Keywords: Quantum computation complexity, optimal stopping time, stochastic processes, American options, quantum finance}
}
Document
Is Smaller Always Better? - Evaluating Video Compression Techniques for Simulation Ensembles

Authors: Patrick Ruediger, Christoph Garth, Hans Hagen, and Heike Leitte

Published in: OASIcs, Volume 89, 2nd International Conference of the DFG International Research Training Group 2057 – Physical Modeling for Virtual Manufacturing (iPMVM 2020)


Abstract
We evaluate the applicability of video compression techniques for compressing the visualization image databases that are often used for in situ visualization. Considering practical implementation aspects, we identify the relevant compression parameters and evaluate video compression for several test cases, involving several data sets and visualization methods, using three different video codecs. To quantify the benefits and drawbacks of video compression, we employ metrics for image quality, compression rate, and performance. The experiments provide insight into parameter choices that work well in the considered cases.
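
Two of the metrics mentioned above are simple enough to state in code; the sketch below (illustrative only, assuming 8-bit grayscale frames of equal size, not the study's evaluation harness) computes the PSNR of a decoded frame against the uncompressed original and the raw-to-compressed size ratio.

#include <cmath>
#include <cstddef>
#include <cstdint>
#include <vector>

// Peak signal-to-noise ratio in dB for 8-bit data; higher is better, infinite if identical.
double psnr(const std::vector<std::uint8_t>& original,
            const std::vector<std::uint8_t>& decoded) {
    double mse = 0.0;
    for (std::size_t i = 0; i < original.size(); ++i) {
        const double d = double(original[i]) - double(decoded[i]);
        mse += d * d;
    }
    mse /= double(original.size());
    return 10.0 * std::log10(255.0 * 255.0 / mse);
}

// Compression rate as a size ratio, e.g. 20.0 means the video is 20x smaller than raw.
double compression_ratio(std::size_t raw_bytes, std::size_t compressed_bytes) {
    return double(raw_bytes) / double(compressed_bytes);
}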

Cite as

Patrick Ruediger, Christoph Garth, Hans Hagen, and Heike Leitte. Is Smaller Always Better? - Evaluating Video Compression Techniques for Simulation Ensembles. In 2nd International Conference of the DFG International Research Training Group 2057 – Physical Modeling for Virtual Manufacturing (iPMVM 2020). Open Access Series in Informatics (OASIcs), Volume 89, pp. 10:1-10:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{ruediger_et_al:OASIcs.iPMVM.2020.10,
  author =	{Ruediger, Patrick and Garth, Christoph and Hagen, Hans and Leitte, Heike},
  title =	{{Is Smaller Always Better? - Evaluating Video Compression Techniques for Simulation Ensembles}},
  booktitle =	{2nd International Conference of the DFG International Research Training Group 2057 – Physical Modeling for Virtual Manufacturing (iPMVM 2020)},
  pages =	{10:1--10:18},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-183-2},
  ISSN =	{2190-6807},
  year =	{2021},
  volume =	{89},
  editor =	{Garth, Christoph and Aurich, Jan C. and Linke, Barbara and M\"{u}ller, Ralf and Ravani, Bahram and Weber, Gunther H. and Kirsch, Benjamin},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.iPMVM.2020.10},
  URN =		{urn:nbn:de:0030-drops-137591},
  doi =		{10.4230/OASIcs.iPMVM.2020.10},
  annote =	{Keywords: Image Database, CinemaDB, Video Compression, Evaluation, Benchmark, In-situ}
}
Document
The Quantum Supremacy Tsirelson Inequality

Authors: William Kretschmer

Published in: LIPIcs, Volume 185, 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)


Abstract
A leading proposal for verifying near-term quantum supremacy experiments on noisy random quantum circuits is linear cross-entropy benchmarking. For a quantum circuit C on n qubits and a sample z ∈ {0,1}ⁿ, the benchmark involves computing |⟨z|C|0ⁿ⟩|², i.e. the probability of measuring z from the output distribution of C on the all zeros input. Under a strong conjecture about the classical hardness of estimating output probabilities of quantum circuits, no polynomial-time classical algorithm given C can output a string z such that |⟨z|C|0ⁿ⟩|² is substantially larger than 1/(2ⁿ) (Aaronson and Gunn, 2019). On the other hand, for a random quantum circuit C, sampling z from the output distribution of C achieves |⟨z|C|0ⁿ⟩|² ≈ 2/(2ⁿ) on average (Arute et al., 2019). In analogy with the Tsirelson inequality from quantum nonlocal correlations, we ask: can a polynomial-time quantum algorithm do substantially better than 2/(2ⁿ)? We study this question in the query (or black box) model, where the quantum algorithm is given oracle access to C. We show that, for any ε ≥ 1/poly(n), outputting a sample z such that |⟨z|C|0ⁿ⟩|² ≥ (2 + ε)/2ⁿ on average requires at least Ω((2^{n/4})/poly(n)) queries to C, but not more than O(2^{n/3}) queries to C, if C is either a Haar-random n-qubit unitary, or a canonical state preparation oracle for a Haar-random n-qubit state. We also show that when C samples from the Fourier distribution of a random Boolean function, the naive algorithm that samples from C is the optimal 1-query algorithm for maximizing |⟨z|C|0ⁿ⟩|² on average.
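
The 1/2ⁿ versus 2/2ⁿ gap can be reproduced with a small numerical experiment; the sketch below (an assumption-laden illustration, not the paper's construction) models a Haar-random state by normalised i.i.d. complex Gaussian amplitudes, samples from its output distribution, and averages the benchmark value.

#include <cmath>
#include <complex>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    const int n = 10, dim = 1 << n, samples = 20000;
    std::mt19937_64 gen(1);
    std::normal_distribution<double> gauss(0.0, 1.0);

    // Random state whose output probabilities follow the Porter-Thomas distribution.
    std::vector<std::complex<double>> psi(dim);
    double norm2 = 0.0;
    for (auto& a : psi) { a = {gauss(gen), gauss(gen)}; norm2 += std::norm(a); }
    std::vector<double> p(dim);
    for (int z = 0; z < dim; ++z) p[z] = std::norm(psi[z]) / norm2;

    // Sample z from the output distribution and average the measured probability.
    std::discrete_distribution<int> sampler(p.begin(), p.end());
    double xeb = 0.0;
    for (int s = 0; s < samples; ++s) xeb += p[sampler(gen)];
    xeb /= samples;

    std::printf("average probability: %.3g  (1/2^n = %.3g, 2/2^n = %.3g)\n",
                xeb, 1.0 / dim, 2.0 / dim);
}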

Cite as

William Kretschmer. The Quantum Supremacy Tsirelson Inequality. In 12th Innovations in Theoretical Computer Science Conference (ITCS 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 185, pp. 13:1-13:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{kretschmer:LIPIcs.ITCS.2021.13,
  author =	{Kretschmer, William},
  title =	{{The Quantum Supremacy Tsirelson Inequality}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{13:1--13:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{Lee, James R.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2021.13},
  URN =		{urn:nbn:de:0030-drops-135524},
  doi =		{10.4230/LIPIcs.ITCS.2021.13},
  annote =	{Keywords: quantum supremacy, quantum query complexity, random circuit sampling}
}
Document
Extending the Centerpoint Theorem to Multiple Points

Authors: Alexander Pilz and Patrick Schnider

Published in: LIPIcs, Volume 123, 29th International Symposium on Algorithms and Computation (ISAAC 2018)


Abstract
The centerpoint theorem is a well-known and widely used result in discrete geometry. It states that for any point set P of n points in R^d, there is a point c, not necessarily from P, such that each halfspace containing c contains at least n/(d+1) points of P. Such a point c is called a centerpoint, and it can be viewed as a generalization of a median to higher dimensions. In other words, a centerpoint can be interpreted as a good representative for the point set P. But what if we allow more than one representative? For example, in one-dimensional data sets, certain quantiles are often chosen as representatives instead of the median. We present a possible extension of the concept of quantiles to higher dimensions. The idea is to find a set Q of (few) points such that every halfspace that contains one point of Q contains a large fraction of the points of P and every halfspace that contains more of Q contains an even larger fraction of P. This setting is comparable to the well-studied concepts of weak epsilon-nets and weak epsilon-approximations, where it is stronger than the former but weaker than the latter. We show that for any point set of size n in R^d and for any positive alpha_1,...,alpha_k where alpha_1 <= alpha_2 <= ... <= alpha_k and for every i,j with i+j <= k+1 we have that (d-1)alpha_k+alpha_i+alpha_j <= 1, we can find Q of size k such that each halfspace containing j points of Q contains at least alpha_j n points of P. For two-dimensional point sets we further show that for every alpha and beta with alpha <= beta and alpha+beta <= 2/3 we can find Q with |Q|=3 such that each halfplane containing one point of Q contains at least alpha n of the points of P and each halfplane containing all of Q contains at least beta n points of P. All these results generalize to the setting where P is any mass distribution. For the case where P is a point set in R^2 and |Q|=2, we provide algorithms to find such points in time O(n log^3 n).
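
The depth notion behind the centerpoint theorem can be made concrete with a small brute-force routine; the sketch below (illustrative only, O(n^2), assuming points in general position) computes the Tukey depth of a candidate point c in the plane, so a centerpoint is any c whose depth is at least n/3.

#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

struct Pt { double x, y; };

// Tukey depth of c: the smallest number of points of P contained in a closed halfplane
// that contains c. Candidate halfplane boundaries pass through c; we test the normals
// of the lines through c and each data point, nudged by a tiny angle so the boundary
// avoids the data points themselves.
std::size_t tukey_depth(const Pt& c, const std::vector<Pt>& P) {
    const double half_pi = std::acos(0.0), eps = 1e-9;
    std::size_t best = P.size();
    for (const Pt& p : P) {
        const double base = std::atan2(p.y - c.y, p.x - c.x);
        for (double ang : {base + half_pi + eps, base + half_pi - eps,
                           base - half_pi + eps, base - half_pi - eps}) {
            const double ux = std::cos(ang), uy = std::sin(ang);
            std::size_t count = 0;
            for (const Pt& q : P)
                if (ux * (q.x - c.x) + uy * (q.y - c.y) >= 0.0) ++count;
            best = std::min(best, count);
        }
    }
    return best;
}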

Cite as

Alexander Pilz and Patrick Schnider. Extending the Centerpoint Theorem to Multiple Points. In 29th International Symposium on Algorithms and Computation (ISAAC 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 123, pp. 53:1-53:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{pilz_et_al:LIPIcs.ISAAC.2018.53,
  author =	{Pilz, Alexander and Schnider, Patrick},
  title =	{{Extending the Centerpoint Theorem to Multiple Points}},
  booktitle =	{29th International Symposium on Algorithms and Computation (ISAAC 2018)},
  pages =	{53:1--53:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-094-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{123},
  editor =	{Hsu, Wen-Lian and Lee, Der-Tsai and Liao, Chung-Shou},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2018.53},
  URN =		{urn:nbn:de:0030-drops-100019},
  doi =		{10.4230/LIPIcs.ISAAC.2018.53},
  annote =	{Keywords: centerpoint, point sets, Tukey depth}
}
Document
C++ const and Immutability: An Empirical Study of Writes-Through-const

Authors: Jon Eyolfson and Patrick Lam

Published in: LIPIcs, Volume 56, 30th European Conference on Object-Oriented Programming (ECOOP 2016)


Abstract
The ability to specify immutability in a programming language is a powerful tool for developers, enabling them to better understand and more safely transform their code without fearing unintended changes to program state. The C++ programming language allows developers to specify a form of immutability using the const keyword. In this work, we characterize the meaning of the C++ const qualifier and present the ConstSanitizer tool, which dynamically verifies a stricter form of immutability than that defined in C++: it identifies const uses that are not consistent with transitive immutability, that write to mutable fields, or that write to formerly-const objects whose const-ness has been cast away. We evaluate a set of 7 C++ benchmark programs to find writes-through-const, establish root causes for how they fail to respect our stricter definition of immutability, and assign attributes to each write (namely: synchronized, not visible, buffer/cache, delayed initialization, and incorrect). ConstSanitizer finds 17 archetypes for writes in these programs which do not respect our version of immutability. Over half of these seem unnecessary to us. Our classification and observations of behaviour in practice contribute to the understanding of a widely-used C++ language feature.
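
The three kinds of writes-through-const discussed above have simple shapes in code; the snippet below is our own minimal illustration of each archetype (not an example taken from the paper). Plain C++ accepts all three, yet each one mutates state reachable from a const object, which is what a stricter, transitive notion of immutability flags.

#include <cstddef>
#include <cstring>

struct Buffer {
    char* data;                 // owned heap storage
    mutable int hits = 0;
    std::size_t cached_len = 0;

    // (1) Not transitively immutable: the method is const, yet it writes through the
    //     pointer member and so mutates state the object owns.
    void clear() const { data[0] = '\0'; }

    // (2) Write to a mutable field from a const method (e.g. a statistics counter).
    std::size_t length() const { ++hits; return std::strlen(data); }

    // (3) Const-ness cast away, then written through.
    void recompute_length() const {
        const_cast<Buffer*>(this)->cached_len = std::strlen(data);
    }
};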

Cite as

Jon Eyolfson and Patrick Lam. C++ const and Immutability: An Empirical Study of Writes-Through-const. In 30th European Conference on Object-Oriented Programming (ECOOP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 56, pp. 8:1-8:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{eyolfson_et_al:LIPIcs.ECOOP.2016.8,
  author =	{Eyolfson, Jon and Lam, Patrick},
  title =	{{C++ const and Immutability: An Empirical Study of Writes-Through-const}},
  booktitle =	{30th European Conference on Object-Oriented Programming (ECOOP 2016)},
  pages =	{8:1--8:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-014-9},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{56},
  editor =	{Krishnamurthi, Shriram and Lerner, Benjamin S.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ECOOP.2016.8},
  URN =		{urn:nbn:de:0030-drops-61024},
  doi =		{10.4230/LIPIcs.ECOOP.2016.8},
  annote =	{Keywords: empirical study, dynamic analysis, immutability}
}
Document
C++ const and Immutability: An Empirical Study of Writes-Through-const (Artifact)

Authors: Jon Eyolfson and Patrick Lam

Published in: DARTS, Volume 2, Issue 1, Special Issue of the 30th European Conference on Object-Oriented Programming (ECOOP 2016)


Abstract
This artifact is based on ConstSanitizer, a dynamic program analysis tool that detects deep immutability violations through const qualifiers. Our tool instruments any code compiled by clang with the -fsanitize=const flag. Our implementation includes both instrumentation of LLVM code and a runtime library to support our analysis. The provided package includes our tool and all experiments used in our companion paper. Instructions are also provided.

Cite as

Jon Eyolfson and Patrick Lam. C++ const and Immutability: An Empirical Study of Writes-Through-const (Artifact). In Special Issue of the 30th European Conference on Object-Oriented Programming (ECOOP 2016). Dagstuhl Artifacts Series (DARTS), Volume 2, Issue 1, pp. 3:1-3:2, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@Article{eyolfson_et_al:DARTS.2.1.3,
  author =	{Eyolfson, Jon and Lam, Patrick},
  title =	{{C++ const and Immutability: An Empirical Study of Writes-Through-const (Artifact)}},
  pages =	{3:1--3:2},
  journal =	{Dagstuhl Artifacts Series},
  ISSN =	{2509-8195},
  year =	{2016},
  volume =	{2},
  number =	{1},
  editor =	{Eyolfson, Jon and Lam, Patrick},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DARTS.2.1.3},
  URN =		{urn:nbn:de:0030-drops-61249},
  doi =		{10.4230/DARTS.2.1.3},
  annote =	{Keywords: empirical study, dynamic analysis, immutability}
}
Document
Complete Volume
LIPIcs, Volume 16, CSL'12, Complete Volume

Authors: Patrick Cégielski and Arnaud Durand

Published in: LIPIcs, Volume 16, Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL (2012)


Abstract
LIPIcs, Volume 16, CSL'12, Complete Volume

Cite as

Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL. Leibniz International Proceedings in Informatics (LIPIcs), Volume 16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


BibTeX

@Proceedings{cegielski_et_al:LIPIcs.CSL.2012,
  title =	{{LIPIcs, Volume 16, CSL'12, Complete Volume}},
  booktitle =	{Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-42-2},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{16},
  editor =	{C\'{e}gielski, Patrick and Durand, Arnaud},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2012},
  URN =		{urn:nbn:de:0030-drops-41106},
  doi =		{10.4230/LIPIcs.CSL.2012},
  annote =	{Keywords: Conference Proceedings; Software; Theory of Computation; Graph Theory; Probability and Statistics; Artificial Intelligence}
}
Document
Front Matter
Frontmatter, Table of Contents, Preface, Conference Organization

Authors: Patrick Cégielski and Arnaud Durand

Published in: LIPIcs, Volume 16, Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL (2012)


Abstract
Frontmatter, Table of Contents, Preface, Conference Organization

Cite as

Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL. Leibniz International Proceedings in Informatics (LIPIcs), Volume 16, pp. i-xiv, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{cegielski_et_al:LIPIcs.CSL.2012.i,
  author =	{C\'{e}gielski, Patrick and Durand, Arnaud},
  title =	{{Frontmatter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL},
  pages =	{i--xiv},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-42-2},
  ISSN =	{1868-8969},
  year =	{2012},
  volume =	{16},
  editor =	{C\'{e}gielski, Patrick and Durand, Arnaud},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2012.i},
  URN =		{urn:nbn:de:0030-drops-36559},
  doi =		{10.4230/LIPIcs.CSL.2012.i},
  annote =	{Keywords: Frontmatter, Table of Contents, Preface, Conference Organization}
}
Document
The Ackermann Award 2012

Authors: Thierry Coquand, Anuj Dawar, and Damian Niwinski

Published in: LIPIcs, Volume 16, Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL (2012)


Abstract
Report on the Ackermann Award 2012.

Cite as

Thierry Coquand, Anuj Dawar, and Damian Niwinski. The Ackermann Award 2012. In Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL. Leibniz International Proceedings in Informatics (LIPIcs), Volume 16, pp. 1-5, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{coquand_et_al:LIPIcs.CSL.2012.1,
  author =	{Coquand, Thierry and Dawar, Anuj and Niwinski, Damian},
  title =	{{The Ackermann Award 2012}},
  booktitle =	{Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL},
  pages =	{1--5},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-42-2},
  ISSN =	{1868-8969},
  year =	{2012},
  volume =	{16},
  editor =	{C\'{e}gielski, Patrick and Durand, Arnaud},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2012.1},
  URN =		{urn:nbn:de:0030-drops-36575},
  doi =		{10.4230/LIPIcs.CSL.2012.1},
  annote =	{Keywords: Ackermann Award 2012}
}
Document
Invited Talk
Sharing Distributed Knowledge on the Web (Invited Talk)

Authors: Serge Abiteboul

Published in: LIPIcs, Volume 16, Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL (2012)


Abstract
To share information, we propose to see the Web as a knowledge base consisting of distributed logical facts and rules. Our objective is to help Web users find information, as well as control their own, using automated reasoning over this knowledge base towards improving the quality of service and of data. For this, we introduce Webdamlog, a Datalog-style language with rule delegation. We mention the implementation of a system to support this language as well as standard communications and security protocols.

Cite as

Serge Abiteboul. Sharing Distributed Knowledge on the Web (Invited Talk). In Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL. Leibniz International Proceedings in Informatics (LIPIcs), Volume 16, pp. 6-8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{abiteboul:LIPIcs.CSL.2012.6,
  author =	{Abiteboul, Serge},
  title =	{{Sharing Distributed Knowledge on the Web}},
  booktitle =	{Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL},
  pages =	{6--8},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-42-2},
  ISSN =	{1868-8969},
  year =	{2012},
  volume =	{16},
  editor =	{C\'{e}gielski, Patrick and Durand, Arnaud},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2012.6},
  URN =		{urn:nbn:de:0030-drops-36584},
  doi =		{10.4230/LIPIcs.CSL.2012.6},
  annote =	{Keywords: Knowledge base, distributed data, world wide web, deduction}
}
Document
Invited Talk
Connecting Complexity Classes, Weak Formal Theories, and Propositional Proof Systems (Invited Talk)

Authors: Stephen A. Cook

Published in: LIPIcs, Volume 16, Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL (2012)


Abstract
This is a survey talk explaining the connection between the three items mentioned in the title.

Cite as

Stephen A. Cook. Connecting Complexity Classes, Weak Formal Theories, and Propositional Proof Systems (Invited Talk). In Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL. Leibniz International Proceedings in Informatics (LIPIcs), Volume 16, pp. 9-11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{cook:LIPIcs.CSL.2012.9,
  author =	{Cook, Stephen A.},
  title =	{{Connecting Complexity Classes, Weak Formal Theories, and Propositional Proof Systems}},
  booktitle =	{Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL},
  pages =	{9--11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-42-2},
  ISSN =	{1868-8969},
  year =	{2012},
  volume =	{16},
  editor =	{C\'{e}gielski, Patrick and Durand, Arnaud},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2012.9},
  URN =		{urn:nbn:de:0030-drops-36594},
  doi =		{10.4230/LIPIcs.CSL.2012.9},
  annote =	{Keywords: Complexity Classes, Weak Formal Theories, Propositional Proof Systems}
}
Document
Invited Talk
Satisfiability: where Theory meets Practice (Invited Talk)

Authors: Inês Lynce

Published in: LIPIcs, Volume 16, Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL (2012)


Abstract
Propositional Satisfiability (SAT) is a keystone in the history of computer science. SAT was the first problem shown to be NP-complete, by Stephen Cook in 1971. More than 40 years later, SAT is a lively research field where theory and practice intermix naturally. In this talk, we overview the use of SAT in practical domains, where SAT is understood in a broad sense, i.e. including SAT extensions such as Maximum Satisfiability (MaxSAT), Pseudo-Boolean Optimization (PBO) and Quantified Boolean Formulas (QBF).

Cite as

Inês Lynce. Satisfiability: where Theory meets Practice (Invited Talk). In Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL. Leibniz International Proceedings in Informatics (LIPIcs), Volume 16, pp. 12-13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{lynce:LIPIcs.CSL.2012.12,
  author =	{Lynce, In\^{e}s},
  title =	{{Satisfiability: where Theory meets Practice}},
  booktitle =	{Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL},
  pages =	{12--13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-42-2},
  ISSN =	{1868-8969},
  year =	{2012},
  volume =	{16},
  editor =	{C\'{e}gielski, Patrick and Durand, Arnaud},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2012.12},
  URN =		{urn:nbn:de:0030-drops-36600},
  doi =		{10.4230/LIPIcs.CSL.2012.12},
  annote =	{Keywords: Propositional Satisfiability, SAT solvers, Applications}
}
Document
Invited Talk
Definability and Complexity of Graph Parameters (Invited Talk)

Authors: Johann A. Makowsky

Published in: LIPIcs, Volume 16, Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL (2012)


Abstract
In this talk we survey definability and complexity results of graph parameters which take values in some ring or field R.

Cite as

Johann A. Makowsky. Definability and Complexity of Graph Parameters (Invited Talk). In Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL. Leibniz International Proceedings in Informatics (LIPIcs), Volume 16, pp. 14-15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{makowsky:LIPIcs.CSL.2012.14,
  author =	{Makowsky, Johann A.},
  title =	{{Definability and Complexity of Graph Parameters}},
  booktitle =	{Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL},
  pages =	{14--15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-42-2},
  ISSN =	{1868-8969},
  year =	{2012},
  volume =	{16},
  editor =	{C\'{e}gielski, Patrick and Durand, Arnaud},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2012.14},
  URN =		{urn:nbn:de:0030-drops-36612},
  doi =		{10.4230/LIPIcs.CSL.2012.14},
  annote =	{Keywords: Model theory, finite model theory, graph invariants}
}
  • Refine by Author
  • 2 Cégielski, Patrick
  • 2 Dawar, Anuj
  • 2 Durand, Arnaud
  • 2 Eyolfson, Jon
  • 2 Grädel, Erich
  • Show More...

  • Refine by Classification
  • 1 Applied computing → Engineering
  • 1 Applied computing → Physics
  • 1 Computing methodologies → Image compression
  • 1 Mathematics of computing → Graph algorithms
  • 1 Mathematics of computing → Markov-chain Monte Carlo methods
  • Show More...

  • Refine by Keyword
  • 2 Classical Logic
  • 2 Model theory
  • 2 Temporal Logic
  • 2 dynamic analysis
  • 2 empirical study
  • Show More...

  • Refine by Type
  • 51 document

  • Refine by Publication Year
  • 41 2012
  • 2 2016
  • 2 2021
  • 2 2023
  • 1 2009
  • Show More...
