9 Search Results for "Díaz, Josep"


Document
Invited Talk
Meaningfulness and Genericity in a Subsuming Framework (Invited Talk)

Authors: Delia Kesner, Victor Arrial, and Giulio Guerrieri

Published in: LIPIcs, Volume 299, 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)


Abstract
This paper studies the notion of meaningfulness for a unifying framework called dBang-calculus, which subsumes both call-by-name (dCBN) and call-by-value (dCBV). We first define meaningfulness in dBang and then characterize it by means of typability and inhabitation in an associated non-idempotent intersection type system previously appearing in the literature. We validate the proposed notion of meaningfulness by showing two properties: (1) consistency of the smallest theory, called ℋ, equating all meaningless terms, and (2) genericity, stating that meaningless subterms have no bearing on the significance of meaningful terms. The theory ℋ is also shown to have a unique consistent and maximal extension ℋ*, which coincides with a well-known notion of observational equivalence. Last but not least, we show that the notions of meaningfulness and genericity in the literature for dCBN and dCBV are subsumed by the corresponding ones proposed here for the dBang-calculus.
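A minimal illustration (not the paper's dBang-calculus): in the plain λ-calculus, a term such as the identity is meaningful in the sense that weak-head reduction reaches a result, while the classic term Ω = (λx.xx)(λx.xx) reduces forever and is the archetypal meaningless term. The Python sketch below implements a weak-head call-by-name step function to make this divide concrete; the term encoding and the fuel bound are illustrative assumptions, not part of the paper.

from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: object

@dataclass(frozen=True)
class App:
    fun: object
    arg: object

def subst(term, name, value):
    # naive substitution: safe here because the substituted values are closed terms
    if isinstance(term, Var):
        return value if term.name == name else term
    if isinstance(term, Lam):
        return term if term.param == name else Lam(term.param, subst(term.body, name, value))
    return App(subst(term.fun, name, value), subst(term.arg, name, value))

def cbn_step(term):
    # one weak-head call-by-name step, or None if the term is already a weak-head result
    if isinstance(term, App):
        if isinstance(term.fun, Lam):
            return subst(term.fun.body, term.fun.param, term.arg)   # beta step at the head
        inner = cbn_step(term.fun)
        return None if inner is None else App(inner, term.arg)
    return None

def reaches_result(term, fuel=100):
    # True if weak-head reduction stops within `fuel` steps
    for _ in range(fuel):
        nxt = cbn_step(term)
        if nxt is None:
            return True
        term = nxt
    return False

identity = Lam("x", Var("x"))
delta = Lam("x", App(Var("x"), Var("x")))
omega = App(delta, delta)                       # the archetypal meaningless term

print(reaches_result(App(identity, identity)))  # True: meaningful, reduces to a value
print(reaches_result(omega))                    # False: Omega loops forever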

Cite as

Delia Kesner, Victor Arrial, and Giulio Guerrieri. Meaningfulness and Genericity in a Subsuming Framework (Invited Talk). In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 1:1-1:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{kesner_et_al:LIPIcs.FSCD.2024.1,
  author =	{Kesner, Delia and Arrial, Victor and Guerrieri, Giulio},
  title =	{{Meaningfulness and Genericity in a Subsuming Framework}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{1:1--1:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.1},
  URN =		{urn:nbn:de:0030-drops-203305},
  doi =		{10.4230/LIPIcs.FSCD.2024.1},
  annote =	{Keywords: Lambda calculus, Solvability, Meaningfulness, Inhabitation, Genericity}
}
Document
Mirroring Call-By-Need, or Values Acting Silly

Authors: Beniamino Accattoli and Adrienne Lancelot

Published in: LIPIcs, Volume 299, 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)


Abstract
Call-by-need evaluation for the λ-calculus can be seen as merging the best of call-by-name and call-by-value, namely the wise erasing behaviour of the former and the wise duplicating behaviour of the latter. To better understand how duplication and erasure can be combined, we design a degenerate calculus, dubbed call-by-silly, that is symmetric to call-by-need in that it merges the worst of call-by-name and call-by-value, namely silly duplications by-name and silly erasures by-value. We validate the design of the call-by-silly calculus via rewriting properties and multi types. In particular, we mirror the main theorem about call-by-need, that is, its operational equivalence with call-by-name, by showing that call-by-silly and call-by-value induce the same contextual equivalence. This fact shows the blindness with respect to efficiency of call-by-value contextual equivalence. We also define a call-by-silly strategy and measure its length via tight multi types. Lastly, we prove that the call-by-silly strategy computes evaluation sequences of maximal length in the calculus.
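A small concrete rendering of the duplication/erasure trade-off described above, sketched in Python with thunks standing in for unevaluated arguments (this is not the paper's calculus): call-by-name re-evaluates a duplicated argument, while call-by-value evaluates an argument that is then erased.

calls = 0                      # counts how many times the "expensive" argument is run

def work():
    global calls
    calls += 1
    return 21

def uses_twice_by_name(thunk):
    # the argument is duplicated: call-by-name evaluates the thunk on each use
    return thunk() + thunk()

def ignores_by_value(value):
    # the argument is erased: call-by-value has already evaluated it for nothing
    return 0

calls = 0
uses_twice_by_name(work)       # pass the unevaluated thunk
print("by-name, argument used twice:", calls, "evaluation(s)")   # 2

calls = 0
ignores_by_value(work())       # evaluate the argument up front
print("by-value, argument erased:   ", calls, "evaluation(s)")   # 1

# call-by-need memoises the thunk (1 evaluation when used twice, 0 when erased);
# the call-by-silly calculus, by design, pays the worse cost in both situations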

Cite as

Beniamino Accattoli and Adrienne Lancelot. Mirroring Call-By-Need, or Values Acting Silly. In 9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 299, pp. 23:1-23:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{accattoli_et_al:LIPIcs.FSCD.2024.23,
  author =	{Accattoli, Beniamino and Lancelot, Adrienne},
  title =	{{Mirroring Call-By-Need, or Values Acting Silly}},
  booktitle =	{9th International Conference on Formal Structures for Computation and Deduction (FSCD 2024)},
  pages =	{23:1--23:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-323-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{299},
  editor =	{Rehof, Jakob},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2024.23},
  URN =		{urn:nbn:de:0030-drops-203527},
  doi =		{10.4230/LIPIcs.FSCD.2024.23},
  annote =	{Keywords: Lambda calculus, intersection types, call-by-value, call-by-need}
}
Document
Track A: Algorithms, Complexity and Games
The k-Opt Algorithm for the Traveling Salesman Problem Has Exponential Running Time for k ≥ 5

Authors: Sophia Heimann, Hung P. Hoang, and Stefan Hougardy

Published in: LIPIcs, Volume 297, 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)


Abstract
The k-Opt algorithm is a local search algorithm for the Traveling Salesman Problem. Starting with an initial tour, it iteratively replaces at most k edges in the tour with the same number of edges to obtain a better tour. Krentel (FOCS 1989) showed that the Traveling Salesman Problem with the k-Opt neighborhood is complete for the class PLS (polynomial time local search) and that the k-Opt algorithm can have exponential running time for any pivot rule. However, his proof requires k ≫ 1000 and has a substantial gap. We show the two properties above for a much smaller value of k, addressing an open question by Monien, Dumrauf, and Tscheuschner (ICALP 2010). In particular, we prove the PLS-completeness for k ≥ 17 and the exponential running time for k ≥ 5.
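For readers who want to see the local search itself, here is a hedged Python sketch of the smallest member of the family, 2-opt, with a first-improvement pivot rule on random Euclidean instances. The instances, pivot rule, and stopping criterion are illustrative and unrelated to the constructions used in the PLS-hardness and exponential-running-time proofs.

import math
import random

def tour_length(points, tour):
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(points, tour):
    # repeatedly apply improving 2-opt moves until none remains (first improvement)
    improved = True
    while improved:
        improved = False
        n = len(tour)
        for i in range(n - 1):
            for j in range(i + 2, n if i > 0 else n - 1):
                a, b = points[tour[i]], points[tour[i + 1]]
                c, d = points[tour[j]], points[tour[(j + 1) % n]]
                # replacing edges (a,b),(c,d) by (a,c),(b,d) reverses tour[i+1..j]
                if math.dist(a, c) + math.dist(b, d) < math.dist(a, b) + math.dist(c, d) - 1e-12:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(60)]
tour = list(range(len(pts)))
print("initial tour length:", round(tour_length(pts, tour), 3))
print("after 2-opt        :", round(tour_length(pts, two_opt(pts, tour)), 3))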

Cite as

Sophia Heimann, Hung P. Hoang, and Stefan Hougardy. The k-Opt Algorithm for the Traveling Salesman Problem Has Exponential Running Time for k ≥ 5. In 51st International Colloquium on Automata, Languages, and Programming (ICALP 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 297, pp. 84:1-84:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{heimann_et_al:LIPIcs.ICALP.2024.84,
  author =	{Heimann, Sophia and Hoang, Hung P. and Hougardy, Stefan},
  title =	{{The k-Opt Algorithm for the Traveling Salesman Problem Has Exponential Running Time for k ≥ 5}},
  booktitle =	{51st International Colloquium on Automata, Languages, and Programming (ICALP 2024)},
  pages =	{84:1--84:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-322-5},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{297},
  editor =	{Bringmann, Karl and Grohe, Martin and Puppis, Gabriele and Svensson, Ola},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2024.84},
  URN =		{urn:nbn:de:0030-drops-202270},
  doi =		{10.4230/LIPIcs.ICALP.2024.84},
  annote =	{Keywords: Traveling Salesman Problem, k-Opt algorithm, PLS-completeness}
}
Document
Current and Future Challenges in Knowledge Representation and Reasoning (Dagstuhl Perspectives Workshop 22282)

Authors: James P. Delgrande, Birte Glimm, Thomas Meyer, Miroslaw Truszczynski, and Frank Wolter

Published in: Dagstuhl Manifestos, Volume 10, Issue 1 (2024)


Abstract
Knowledge Representation and Reasoning is a central, longstanding, and active area of Artificial Intelligence. Over the years it has evolved significantly; more recently it has been challenged and complemented by research in areas such as machine learning and reasoning under uncertainty. In July 2022, a Dagstuhl Perspectives workshop was held on Knowledge Representation and Reasoning. The goal of the workshop was to describe the state of the art in the field, including its relation with other areas, its shortcomings and strengths, together with recommendations for future progress. We developed this manifesto based on the presentations, panels, working groups, and discussions that took place at the Dagstuhl Workshop. It is a declaration of our views on Knowledge Representation: its origins, goals, milestones, and current foci; its relation to other disciplines, especially to Artificial Intelligence; and on its challenges, along with key priorities for the next decade.

Cite as

James P. Delgrande, Birte Glimm, Thomas Meyer, Miroslaw Truszczynski, and Frank Wolter. Current and Future Challenges in Knowledge Representation and Reasoning (Dagstuhl Perspectives Workshop 22282). In Dagstuhl Manifestos, Volume 10, Issue 1, pp. 1-61, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@Article{delgrande_et_al:DagMan.10.1.1,
  author =	{Delgrande, James P. and Glimm, Birte and Meyer, Thomas and Truszczynski, Miroslaw and Wolter, Frank},
  title =	{{Current and Future Challenges in Knowledge Representation and Reasoning (Dagstuhl Perspectives Workshop 22282)}},
  pages =	{1--61},
  journal =	{Dagstuhl Manifestos},
  ISSN =	{2193-2433},
  year =	{2024},
  volume =	{10},
  number =	{1},
  editor =	{Delgrande, James P. and Glimm, Birte and Meyer, Thomas and Truszczynski, Miroslaw and Wolter, Frank},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/DagMan.10.1.1},
  URN =		{urn:nbn:de:0030-drops-201403},
  doi =		{10.4230/DagMan.10.1.1},
  annote =	{Keywords: Knowledge representation and reasoning, Applications of logics, Declarative representations, Formal logic}
}
Document
Track A: Algorithms, Complexity and Games
Improved Reconstruction of Random Geometric Graphs

Authors: Varsha Dani, Josep Díaz, Thomas P. Hayes, and Cristopher Moore

Published in: LIPIcs, Volume 229, 49th International Colloquium on Automata, Languages, and Programming (ICALP 2022)


Abstract
Embedding graphs in a geographical or latent space, i.e. inferring locations for vertices in Euclidean space or on a smooth manifold or submanifold, is a common task in network analysis, statistical inference, and graph visualization. We consider the classic model of random geometric graphs where n points are scattered uniformly in a square of area n, and two points have an edge between them if and only if their Euclidean distance is less than r. The reconstruction problem then consists of inferring the vertex positions, up to the symmetries of the square, given only the adjacency matrix of the resulting graph. We give an algorithm that, if r = n^α for α > 0, with high probability reconstructs the vertex positions with a maximum error of O(n^β) where β = 1/2-(4/3)α, until α ≥ 3/8 where β = 0 and the error becomes O(√{log n}). This improves over earlier results, which were unable to reconstruct with error less than r. Our method estimates Euclidean distances using a hybrid of graph distances and short-range estimates based on the number of common neighbors. We extend our results to the surface of the sphere in ℝ³ and to hypercubes in any constant dimension.
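One ingredient of such reconstructions can be illustrated directly: for two nearby vertices, the number of common neighbours concentrates around the area of the lens where their two radius-r discs overlap, and that area is a decreasing function of their distance, so it can be inverted numerically. The Python sketch below does this on a freshly sampled random geometric graph; the parameters, the bisection inversion, and the choice of vertex pair are illustrative assumptions, and this is not the paper's full algorithm (which also uses graph distances and controls boundary effects).

import math
import random

random.seed(1)
n = 20000                                   # n points scattered in a square of area n
side = math.sqrt(n)
r = 8.0                                     # connection radius
pts = [(random.uniform(0, side), random.uniform(0, side)) for _ in range(n)]

def lens_area(d, r):
    # area of the intersection of two discs of radius r whose centres are d apart
    if d >= 2 * r:
        return 0.0
    return 2 * r * r * math.acos(d / (2 * r)) - (d / 2) * math.sqrt(4 * r * r - d * d)

def estimate_distance(common_neighbours, r):
    # invert lens_area by bisection; the point density here is n / area = 1, so the
    # expected number of common neighbours at distance d is roughly lens_area(d, r)
    lo, hi = 0.0, 2 * r
    for _ in range(60):
        mid = (lo + hi) / 2
        if lens_area(mid, r) > common_neighbours:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

centre = (side / 2, side / 2)
i = min(range(n), key=lambda k: math.dist(pts[k], centre))     # a vertex far from the boundary
j = min((k for k in range(n) if k != i),
        key=lambda k: abs(math.dist(pts[i], pts[k]) - r / 2))  # a vertex at distance about r/2
common = sum(1 for k in range(n) if k not in (i, j)
             and math.dist(pts[k], pts[i]) < r and math.dist(pts[k], pts[j]) < r)
print(f"true distance {math.dist(pts[i], pts[j]):.2f}, "
      f"estimate from {common} common neighbours: {estimate_distance(common, r):.2f}")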

Cite as

Varsha Dani, Josep Díaz, Thomas P. Hayes, and Cristopher Moore. Improved Reconstruction of Random Geometric Graphs. In 49th International Colloquium on Automata, Languages, and Programming (ICALP 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 229, pp. 48:1-48:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{dani_et_al:LIPIcs.ICALP.2022.48,
  author =	{Dani, Varsha and D{\'\i}az, Josep and Hayes, Thomas P. and Moore, Cristopher},
  title =	{{Improved Reconstruction of Random Geometric Graphs}},
  booktitle =	{49th International Colloquium on Automata, Languages, and Programming (ICALP 2022)},
  pages =	{48:1--48:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-235-8},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{229},
  editor =	{Boja\'{n}czyk, Miko{\l}aj and Merelli, Emanuela and Woodruff, David P.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2022.48},
  URN =		{urn:nbn:de:0030-drops-163897},
  doi =		{10.4230/LIPIcs.ICALP.2022.48},
  annote =	{Keywords: Reconstruction algorithm, distances in RGG, d-dimensional hypercube, 3 dimensional sphere}
}
Document
RANDOM
The Expected Number of Maximal Points of the Convolution of Two 2-D Distributions

Authors: Josep Diaz and Mordecai Golin

Published in: LIPIcs, Volume 145, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2019)


Abstract
The maximal points in a set S are those that are not dominated by any other point in S. Such points arise in multiple application settings and are called by a variety of different names, e.g., maxima, Pareto optima, skylines. Their ubiquity has inspired a large literature on the expected number of maxima in a set S of n points chosen IID from some distribution. Most such results assume that the underlying distribution is uniform over some spatial region and strongly use this uniformity in their analysis. This research was initially motivated by the question of how this expected number changes if the input distribution is perturbed by random noise. More specifically, let B_p denote the uniform distribution over the 2-dimensional unit ball in the L_p metric. Let δB_q denote the 2-dimensional L_q-ball of radius δ, and let B_p + δB_q be the convolution of the two distributions, i.e., a point v in B_p is reported with an error chosen from δB_q. The question is how the expected number of maxima changes as a function of δ. Although the original motivation is for small δ, the problem is well defined for any δ and our analysis treats the general case. More specifically, we study, as a function of n and δ, the expected number of maximal points when the n points in S are chosen IID from distributions of the type B_p + δB_q where p,q ∈ {1,2,∞} for δ > 0, and also of the type B_∞ + δB_q where q ∈ [1,∞) for δ > 0. For fixed p,q we show that this function changes "smoothly" as a function of δ, but that this smooth behavior sometimes transitions unexpectedly between different growth behaviors.
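The quantity under study is easy to sample. The hedged Python sketch below draws n IID points from the uniform distribution on the L_2 unit disc, perturbs each by independent uniform noise from a disc of radius δ (the convolution B_2 + δB_2, one of the cases covered above), and counts the maximal points with a standard sweep; the parameters are illustrative, and the paper's contribution is the analytic behavior of this expectation, not simulation.

import math
import random

def uniform_disc(radius):
    # uniform point in the L2 disc of the given radius (rejection sampling)
    while True:
        x, y = random.uniform(-radius, radius), random.uniform(-radius, radius)
        if x * x + y * y <= radius * radius:
            return x, y

def count_maxima(points):
    # a point is maximal if no other point has both a larger x and a larger y
    best_y = -math.inf
    count = 0
    for x, y in sorted(points, key=lambda p: (-p[0], -p[1])):   # sweep by decreasing x
        if y > best_y:
            count += 1
            best_y = y
    return count

random.seed(2)
n, delta, trials = 2000, 0.1, 20
avg = 0.0
for _ in range(trials):
    pts = []
    for _ in range(n):
        px, py = uniform_disc(1.0)
        ex, ey = uniform_disc(delta)
        pts.append((px + ex, py + ey))
    avg += count_maxima(pts) / trials
print(f"average number of maximal points for n={n}, delta={delta}: {avg:.1f}")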

Cite as

Josep Diaz and Mordecai Golin. The Expected Number of Maximal Points of the Convolution of Two 2-D Distributions. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 145, pp. 35:1-35:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{diaz_et_al:LIPIcs.APPROX-RANDOM.2019.35,
  author =	{Diaz, Josep and Golin, Mordecai},
  title =	{{The Expected Number of Maximal Points of the Convolution of Two 2-D Distributions}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2019)},
  pages =	{35:1--35:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-125-2},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{145},
  editor =	{Achlioptas, Dimitris and V\'{e}gh, L\'{a}szl\'{o} A.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX-RANDOM.2019.35},
  URN =		{urn:nbn:de:0030-drops-112501},
  doi =		{10.4230/LIPIcs.APPROX-RANDOM.2019.35},
  annote =	{Keywords: maximal points, probabilistic geometry, perturbations, Minkowski sum}
}
Document
Track A: Algorithms, Complexity and Games
Algorithmically Efficient Syntactic Characterization of Possibility Domains

Authors: Josep Díaz, Lefteris Kirousis, Sofia Kokonezi, and John Livieratos

Published in: LIPIcs, Volume 132, 46th International Colloquium on Automata, Languages, and Programming (ICALP 2019)


Abstract
We call domain any arbitrary subset of a Cartesian power of the set {0,1} when we think of it as reflecting abstract rationality restrictions on vectors of two-valued judgments on a number of issues. In Computational Social Choice Theory, and in particular in the theory of judgment aggregation, a domain is called a possibility domain if it admits a non-dictatorial aggregator, i.e. if for some k there exists a unanimous (idempotent) function F: D^k → D which is not a projection function. We prove that a domain is a possibility domain if and only if there is a propositional formula of a certain syntactic form, sometimes called an integrity constraint, whose set of satisfying truth assignments, or models, comprises the domain. We call possibility integrity constraints the formulas of the specific syntactic type we define. Given a possibility domain D, we show how to construct a possibility integrity constraint for D efficiently, i.e., in polynomial time in the size of the domain. We also show how to distinguish formulas that are possibility integrity constraints in linear time in the size of the input formula. Finally, we prove the analogous results for local possibility domains, i.e. domains that admit an aggregator which is not a projection function even when restricted to any given issue. Our result falls in the realm of classical results that give syntactic characterizations of logical relations that have certain closure properties, e.g. the result that logical relations component-wise closed under logical AND are precisely the models of Horn formulas. However, our techniques draw from results in judgment aggregation theory as well as from results about propositional formulas and logical relations.
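The closure-property results mentioned at the end have a very concrete flavour. As a hedged illustration (not the paper's polynomial-time construction of possibility integrity constraints), the Python sketch below tests whether a Boolean domain is closed under component-wise AND, which by the classical result cited above is equivalent to the domain being the model set of a Horn formula; the example domains are made up for the demonstration.

from itertools import product

def closed_under(domain, op):
    # is the domain closed under applying op component-wise to every pair of its elements?
    dom = set(domain)
    return all(tuple(op(a, b) for a, b in zip(u, v)) in dom
               for u, v in product(dom, repeat=2))

AND = lambda a, b: a & b

and_closed = {(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 1)}   # closed under component-wise AND
xor_domain = {(1, 0), (0, 1)}                               # models of x XOR y

print(closed_under(and_closed, AND))   # True  -> the model set of some Horn formula
print(closed_under(xor_domain, AND))   # False -> (1,0) AND (0,1) = (0,0) is missing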

Cite as

Josep Díaz, Lefteris Kirousis, Sofia Kokonezi, and John Livieratos. Algorithmically Efficient Syntactic Characterization of Possibility Domains. In 46th International Colloquium on Automata, Languages, and Programming (ICALP 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 132, pp. 50:1-50:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{diaz_et_al:LIPIcs.ICALP.2019.50,
  author =	{D{\'\i}az, Josep and Kirousis, Lefteris and Kokonezi, Sofia and Livieratos, John},
  title =	{{Algorithmically Efficient Syntactic Characterization of Possibility Domains}},
  booktitle =	{46th International Colloquium on Automata, Languages, and Programming (ICALP 2019)},
  pages =	{50:1--50:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-109-2},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{132},
  editor =	{Baier, Christel and Chatzigiannakis, Ioannis and Flocchini, Paola and Leonardi, Stefano},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2019.50},
  URN =		{urn:nbn:de:0030-drops-106269},
  doi =		{10.4230/LIPIcs.ICALP.2019.50},
  annote =	{Keywords: collective decision making, computational social choice, judgment aggregation, logical relations, algorithm complexity}
}
Document
Absorption Time of the Moran Process

Authors: Josep Díaz, Leslie Ann Goldberg, David Richerby, and Maria Serna

Published in: LIPIcs, Volume 28, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2014)


Abstract
The Moran process models the spread of mutations in populations on graphs. We investigate the absorption time of the process, which is the time taken for a mutation introduced at a randomly chosen vertex to either spread to the whole population or to become extinct. It is known that the expected absorption time for an advantageous mutation is polynomial on an n-vertex undirected graph, which allows the behaviour of the process on undirected graphs to be analysed using the Markov chain Monte Carlo method. We show that this does not extend to directed graphs by exhibiting an infinite family of directed graphs for which the expected absorption time is exponential in the number of vertices. However, for regular directed graphs, we give an Ω(n log n) lower bound on the expected absorption time and an explicit quadratic upper bound. We exhibit families of graphs matching these bounds and give improved bounds for other families of graphs, based on the isoperimetric number. Our results are obtained via stochastic dominations which we demonstrate by establishing a coupling in a related continuous-time model. The coupling also implies several natural domination results regarding the fixation probability of the original (discrete-time) process, resolving a conjecture of Shakarian, Roos and Johnson.
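For intuition, the process itself is straightforward to simulate. The hedged Python sketch below runs the discrete-time Moran process with fitness advantage r on an undirected graph and records the empirical absorption time; the graph, fitness value, and trial count are illustrative, and the paper's results are analytic bounds on the expectation over graph families, not simulations.

import random

def moran_absorption_time(adj, r=1.5, rng=random):
    # adj: adjacency lists of a connected graph; returns the number of steps until absorption
    n = len(adj)
    mutant = [False] * n
    mutant[rng.randrange(n)] = True           # the mutation appears at a uniform random vertex
    count, steps = 1, 0
    while 0 < count < n:
        # pick a reproducing vertex with probability proportional to fitness
        weights = [r if mutant[v] else 1.0 for v in range(n)]
        v = rng.choices(range(n), weights=weights, k=1)[0]
        u = rng.choice(adj[v])                # the offspring replaces a uniform random neighbour
        if mutant[u] != mutant[v]:
            count += 1 if mutant[v] else -1
            mutant[u] = mutant[v]
        steps += 1
    return steps

# complete graph on 50 vertices as a simple undirected test case
n = 50
complete = [[u for u in range(n) if u != v] for v in range(n)]
random.seed(3)
times = [moran_absorption_time(complete) for _ in range(20)]
print("mean absorption time on K_50:", sum(times) / len(times))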

Cite as

Josep Díaz, Leslie Ann Goldberg, David Richerby, and Maria Serna. Absorption Time of the Moran Process. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2014). Leibniz International Proceedings in Informatics (LIPIcs), Volume 28, pp. 630-642, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)


BibTeX

@InProceedings{diaz_et_al:LIPIcs.APPROX-RANDOM.2014.630,
  author =	{D{\'\i}az, Josep and Goldberg, Leslie Ann and Richerby, David and Serna, Maria},
  title =	{{Absorption Time of the Moran Process}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2014)},
  pages =	{630--642},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-74-3},
  ISSN =	{1868-8969},
  year =	{2014},
  volume =	{28},
  editor =	{Jansen, Klaus and Rolim, Jos\'{e} and Devanur, Nikhil R. and Moore, Cristopher},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX-RANDOM.2014.630},
  URN =		{urn:nbn:de:0030-drops-47279},
  doi =		{10.4230/LIPIcs.APPROX-RANDOM.2014.630},
  annote =	{Keywords: Moran Process}
}
Document
A new upper bound for 3-SAT

Authors: Josep Diaz, Lefteris Kirousis, Dieter Mitsche, and Xavier Perez-Gimenez

Published in: LIPIcs, Volume 2, IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (2008)


Abstract
We show that a randomly chosen 3-CNF formula over n variables with clauses-to-variables ratio at least 4.4898 is asymptotically almost surely unsatisfiable. The previous best such bound, due to Dubois in 1999, was 4.506. The first such bound, independently discovered by many groups of researchers since 1983, was 5.19. Several decreasing values between 5.19 and 4.506 were published in the years between. The probabilistic techniques we use for the proof are, we believe, of independent interest.
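The bound can be put in context with a small experiment. The hedged Python sketch below samples random 3-CNF formulas at several clauses-to-variables ratios and tests satisfiability by exhaustive search, which is feasible only for small n, where the transition is still blurred; for large n the satisfiable fraction is believed to drop sharply near ratio 4.27, below the 4.4898 upper bound proved here. The sampler and parameters are illustrative assumptions.

import itertools
import random

def random_3cnf(n_vars, ratio, rng):
    # m clauses, each on 3 distinct variables with independent random signs
    m = round(ratio * n_vars)
    clauses = []
    for _ in range(m):
        vs = rng.sample(range(1, n_vars + 1), 3)
        clauses.append(tuple(v if rng.random() < 0.5 else -v for v in vs))
    return clauses

def satisfiable(clauses, n_vars):
    # brute force over all 2^n assignments; only sensible for small n
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause) for clause in clauses):
            return True
    return False

rng = random.Random(4)
n_vars, trials = 12, 30
for ratio in (3.0, 4.0, 4.5, 5.0):
    sat = sum(satisfiable(random_3cnf(n_vars, ratio, rng), n_vars) for _ in range(trials))
    print(f"ratio {ratio}: {sat}/{trials} random formulas satisfiable")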

Cite as

Josep Diaz, Lefteris Kirousis, Dieter Mitsche, and Xavier Perez-Gimenez. A new upper bound for 3-SAT. In IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 2, pp. 163-174, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2008)


BibTeX

@InProceedings{diaz_et_al:LIPIcs.FSTTCS.2008.1750,
  author =	{Diaz, Josep and Kirousis, Lefteris and Mitsche, Dieter and Perez-Gimenez, Xavier},
  title =	{{A new upper bound for 3-SAT}},
  booktitle =	{IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science},
  pages =	{163--174},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-08-8},
  ISSN =	{1868-8969},
  year =	{2008},
  volume =	{2},
  editor =	{Hariharan, Ramesh and Mukund, Madhavan and Vinay, V},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2008.1750},
  URN =		{urn:nbn:de:0030-drops-17507},
  doi =		{10.4230/LIPIcs.FSTTCS.2008.1750},
  annote =	{Keywords: Satisfiability, 3-SAT, random, threshold}
}
  • Refine by Author
  • 3 Díaz, Josep
  • 2 Diaz, Josep
  • 2 Kirousis, Lefteris
  • 1 Accattoli, Beniamino
  • 1 Arrial, Victor

  • Refine by Classification
  • 2 Theory of computation → Randomness, geometry and discrete structures
  • 1 Computing methodologies → Artificial intelligence
  • 1 Computing methodologies → Knowledge representation and reasoning
  • 1 Information systems → Information integration
  • 1 Theory of computation → Approximation algorithms analysis

  • Refine by Keyword
  • 2 Lambda calculus
  • 1 3 dimensional sphere
  • 1 3-SAT
  • 1 Applications of logics
  • 1 Declarative representations

  • Refine by Type
  • 9 document

  • Refine by Publication Year
  • 4 2024
  • 2 2019
  • 1 2008
  • 1 2014
  • 1 2022