eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
0
0
10.4230/LIPIcs.ICLP.2010
article
LIPIcs, Volume 7, ICLP'10, Complete Volume
Hermenegildo, Manuel
Schaub, Torsten
LIPIcs, Volume 7, ICLP'10, Complete Volume
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010/LIPIcs.ICLP.2010.pdf
Logic Programming, Concurrent Programming, Distribution, Maintenance, and Enhancement, Language Classifications, Language Constructs and Features, Software/Program Verification, Models of Computation, Modes of Computation, Semantics of Programming Languages, Mathematical Logic
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
i
x
10.4230/LIPIcs.ICLP.2010.i
article
Titlepage, Table of Contents, Conference Organization
Hermenegildo, Manuel
Schaub, Torsten
Frontmatter including titlepage, table of contents and conference organization.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.i/LIPIcs.ICLP.2010.i.pdf
Titlepage
Table of Contents
Conference Organization
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
11
14
10.4230/LIPIcs.ICLP.2010.XI
article
Introduction to the Technical Communications of the 26th International Conference on Logic Programming
Hermenegildo, Manuel
Schaub, Torsten
Introduction to the Technical Communications of the 26th International Conference on Logic Programming.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.XI/LIPIcs.ICLP.2010.XI.pdf
Introduction
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
1
1
10.4230/LIPIcs.ICLP.2010.1
article
Datalog for Enterprise Software: from Industrial Applications to Research (Invited Talk)
Aref, Molham
LogicBlox is a platform for the rapid development of enterprise applications in the domains of decision automation, analytics, and planning. Although the LogicBlox platform embodies several components and technology decisions (e.g., an emphasis on software-as-a-service), the key substrate and glue is an implementation of the Datalog language. All application development on the LogicBlox platform is done declaratively in Datalog: the language is used to query large data sets, but also to develop web and desktop GUIs (with the help of pre-defined libraries), to interface with solvers, statistics tools, and optimizers for complex analytics solutions, and to express the overall business logic of the application.
The goal of this talk is to present both the business case for Datalog and the fruitful interaction of research and industrial applications in the LogicBlox context.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.1/LIPIcs.ICLP.2010.1.pdf
Datalog
decision automation
analytics
planning
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
2
3
10.4230/LIPIcs.ICLP.2010.2
article
A Logical Paradigm for Systems Biology (Invited Talk)
Fages, François
Biologists use diagrams to represent complex systems of interaction between molecular species. These graphical notations encompass two types of information: interactions (e.g. protein complexation, modification, binding to a gene, etc.) and regulations (of an interaction or a transcription). Based on these structures, mathematical models can be developed by equipping such molecular interaction networks with kinetic expressions leading to quantitative models of mainly two kinds: ordinary differential equations for a continuous interpretation of the kinetics and continuous-time Markov chains for a stochastic interpretation of the kinetics.
Since 2002, we have been investigating the transposition of programming concepts and tools to the analysis of living processes at the cellular level.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.2/LIPIcs.ICLP.2010.2.pdf
temporal logic
model-checking
systems biology
hybrid systems
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
4
13
10.4230/LIPIcs.ICLP.2010.4
article
Runtime Addition of Integrity Constraints in an Abductive Proof Procedure
Alberti, Marco
Gavanelli, Marco
Lamma, Evelina
Abductive Logic Programming (ALP) is a computationally founded representation of abductive reasoning. In most ALP frameworks, integrity constraints express domain-specific logical relationships that abductive answers are required to satisfy.
Integrity constraints are usually known a priori. However, in some applications (such as interactive abductive logic programming, multi-agent interactions, contracting) it makes sense to relax this assumption, in order to let the abductive reasoning start with incomplete knowledge of integrity constraints, and to continue without restarting when new integrity constraints become known.
In this paper, we propose a declarative semantics for abductive logic programming with addition of integrity constraints during the abductive reasoning process, an operational instantiation (with formal termination, soundness and completeness properties) and an implementation of such a framework based on the SCIFF language and proof procedure.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.4/LIPIcs.ICLP.2010.4.pdf
Abduction
semantics
interactive computation
proof procedure
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
14
23
10.4230/LIPIcs.ICLP.2010.14
article
Learning Domain-Specific Heuristics for Answer Set Solvers
Balduccini, Marcello
In spite of the recent improvements in the performance of Answer Set Programming (ASP) solvers, when the search space is sufficiently large, it is still possible for the search algorithm to mistakenly focus on areas of the search space that contain no solutions or very few. When that happens, performance degrades substantially, even to the point that the solver may need to be terminated before returning an answer. This prospect is a concern when one is considering using such a solver in an industrial setting, where users typically expect consistent performance. To overcome this problem, in this paper we propose a technique that allows learning domain-specific heuristics for ASP solvers. The learning is done off-line, on representative instances from the target domain, and the learned heuristics are then used for choice-point selection. In our experiments, the introduction of domain-specific heuristics improved performance on hard instances by up to 3 orders of magnitude (and 2 on average), nearly completely eliminating the cases in which the solver had to be terminated because the wait for an answer had become unacceptable.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.14/LIPIcs.ICLP.2010.14.pdf
Answer set programming
solvers
domain-specific heuristics
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
24
33
10.4230/LIPIcs.ICLP.2010.24
article
HEX Programs with Action Atoms
Basol, Selen
Erdem, Ozan
Fink, Michael
Ianni, Giovambattista
HEX programs were originally introduced as a general framework for extending declarative logic programming, under the stable model semantics, with the possibility of bidirectionally accessing external sources of knowledge and/or computation. The original framework, however, does not deal satisfactorily with stateful external environments: the possibility of predictably influencing external environments has thus not yet been considered explicitly. This paper lifts HEX programs to ACTHEX programs: ACTHEX programs introduce the notion of action atoms, which are associated with corresponding functions capable of actually changing the state of external environments. The execution of specific sequences of action atoms can be declaratively programmed. Furthermore, ACTHEX programs allow for selecting preferred actions, building on weights and corresponding cost functions. We introduce the syntax and semantics of ACTHEX programs, which can successfully be exploited as a general-purpose language for the declarative implementation of executable specifications; we illustrate this by encodings of knowledge base updates, action languages, and an agent programming language. A system capable of executing ACTHEX programs has been implemented and is publicly available.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.24/LIPIcs.ICLP.2010.24.pdf
Answer Set Programming
Logic programming interoperability
Action languages
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
34
43
10.4230/LIPIcs.ICLP.2010.34
article
Communicating Answer Set Programs
Bauters, Kim
Janssen, Jeroen
Schockaert, Steven
Vermeir, Dirk
De Cock, Martine
Answer set programming is a form of declarative programming that has proven very successful in succinctly formulating and solving complex problems. Although mechanisms for representing and reasoning with the combined answer set programs of multiple agents have already been proposed, the actual gain in expressivity when adding communication has not been thoroughly studied. We show that allowing simple programs to talk to each other results in the same expressivity as adding negation-as-failure. Furthermore, we show that the ability to focus on one program in a network of simple programs results in the same expressivity as adding disjunction in the head of the rules.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.34/LIPIcs.ICLP.2010.34.pdf
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
44
53
10.4230/LIPIcs.ICLP.2010.44
article
Implementation Alternatives for Bottom-Up Evaluation
Brass, Stefan
Bottom-up evaluation is a central part of query evaluation / program execution in deductive databases. It is used after a source code optimization like magic sets or SLDmagic that ensures that only facts relevant for the query can be derived. Then bottom-up evaluation simply performs the iteration of the standard T_P operator to compute the minimal model. However, there are different ways to implement bottom-up evaluation efficiently. Since this is most critical for the performance of a deductive database system, and since performance is critical for the acceptance of deductive database technology, this question deserves a thorough analysis. In this paper we start this work by discussing several different implementation alternatives. In particular, we propose a new implementation of bottom-up evaluation called the "Push-Method".
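The T_P iteration mentioned in this abstract can be illustrated with a minimal naive-evaluation sketch. This is generic bottom-up evaluation in Python with a hypothetical tuple encoding of facts and rules, not the paper's Push-Method:

```python
# Naive bottom-up T_P iteration for a Datalog program.
# Facts are tuples like ("edge", 1, 2); a rule is (head, [body atoms]),
# where strings starting with an uppercase letter are variables.
# Hypothetical illustration only, not the Push-Method of the paper.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def match(atom, fact, subst):
    """Try to match a body atom against a ground fact under subst."""
    if atom[0] != fact[0] or len(atom) != len(fact):
        return None
    s = dict(subst)
    for a, f in zip(atom[1:], fact[1:]):
        if is_var(a):
            if a in s and s[a] != f:
                return None
            s[a] = f
        elif a != f:
            return None
    return s

def tp_fixpoint(facts, rules):
    """Iterate the immediate-consequence operator until nothing new is derived."""
    db = set(facts)
    while True:
        new = set()
        for head, body in rules:
            substs = [{}]
            for atom in body:  # join the body atoms left to right
                substs = [s2 for s in substs for f in db
                          for s2 in [match(atom, f, s)] if s2 is not None]
            for s in substs:
                ground = (head[0],) + tuple(s.get(t, t) for t in head[1:])
                if ground not in db:
                    new.add(ground)
        if not new:
            return db
        db |= new

# Transitive closure: path(X,Y) :- edge(X,Y).
#                     path(X,Z) :- edge(X,Y), path(Y,Z).
facts = {("edge", 1, 2), ("edge", 2, 3)}
rules = [(("path", "X", "Y"), [("edge", "X", "Y")]),
         (("path", "X", "Z"), [("edge", "X", "Y"), ("path", "Y", "Z")])]
model = tp_fixpoint(facts, rules)
```

The fixpoint contains the two edges plus the three derivable path facts; efficient implementations, as the paper discusses, avoid rederiving known facts on every round.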
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.44/LIPIcs.ICLP.2010.44.pdf
Deductive databases
bottom-up evaluation
implementation
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
54
63
10.4230/LIPIcs.ICLP.2010.54
article
Inductive Logic Programming as Abductive Search
Corapi, Domenico
Russo, Alessandra
Lupu, Emil
We present a novel approach to non-monotonic ILP and its implementation called TAL (Top-directed Abductive Learning). TAL overcomes some of the completeness problems of ILP systems based on Inverse Entailment and is the first top-down ILP system that allows background theories and hypotheses to be normal logic programs. The approach relies on mapping an ILP problem into an equivalent ALP one. This enables the use of established ALP proof procedures and the specification of richer language bias with integrity constraints. The mapping provides a principled search space for an ILP problem, over which an abductive search is used to compute inductive solutions.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.54/LIPIcs.ICLP.2010.54.pdf
Inductive Logic Programming
Abductive Logic Programming
Non-monotonic Reasoning
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
64
73
10.4230/LIPIcs.ICLP.2010.64
article
Efficient Solving of Time-dependent Answer Set Programs
Fayruzov, Timur
Janssen, Jeroen
Vermeir, Dirk
Cornelis, Chris
De Cock, Martine
Answer set programs with time predicates are useful to model systems whose properties depend on time, like for example gene regulatory networks. A state of such a system at time point t then corresponds to the literals of an answer set that are grounded with time constant t. An important task when modelling time-dependent systems is to find steady states from which the system's behaviour does not change anymore. This task is complicated by the fact that it is typically not known in advance at what time steps these steady states occur. A brute force approach of estimating a time upper bound t_max and grounding and solving the program w.r.t. that upper bound leads to a suboptimal solving time when the estimate is too low or too high. In this paper we propose a more efficient algorithm for solving Markovian programs, which are time-dependent programs for which the next state depends only on the previous state. Instead of solving these Markovian programs for a long time interval {0,...,t_max}, we successively find answer sets of parts of the grounded program. Our approach guarantees the discovery of all steady states and cycles while avoiding unnecessary extra work.
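The successive-solving idea in this abstract can be illustrated, under strong simplifying assumptions, by iterating a deterministic successor function until a fixpoint or a revisited state is found; `step` below is a hypothetical stand-in for solving one time slice of the grounded Markovian program:

```python
# Advance a (here deterministic) Markovian system one step at a time,
# stopping at a steady state (fixpoint) or a cycle (revisited state),
# instead of fixing an upper bound t_max in advance.
# `step` is a hypothetical stand-in for solving one time slice.

def run_until_stable(initial, step, limit=10**6):
    """Return (trajectory, kind) where kind is 'steady' or 'cycle'."""
    seen = {initial: 0}
    trajectory = [initial]
    state = initial
    for _ in range(limit):
        nxt = step(state)
        if nxt == state:
            return trajectory, "steady"  # fixpoint reached
        if nxt in seen:
            return trajectory, "cycle"   # state revisited: a cycle
        seen[nxt] = len(trajectory)
        trajectory.append(nxt)
        state = nxt
    raise RuntimeError("no steady state or cycle within limit")

# Toy gene-network-like dynamics on two boolean genes that copy
# each other's previous value: yields a steady state or a 2-cycle.
step = lambda s: (s[1], s[0])
traj, kind = run_until_stable((True, False), step)
```

Because every visited state is recorded, the loop is guaranteed to report either a steady state or a cycle without ever grounding beyond the time step actually reached.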
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.64/LIPIcs.ICLP.2010.64.pdf
Answer set programming
time-dependent programs
gene regulation networks
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
74
83
10.4230/LIPIcs.ICLP.2010.74
article
Improving the Efficiency of Gibbs Sampling for Probabilistic Logical Models by Means of Program Specialization
Fierens, Daan
There is currently a large interest in probabilistic logical models. A popular algorithm for approximate probabilistic inference with such models is Gibbs sampling. From a computational perspective, Gibbs sampling boils down to repeatedly executing certain queries on a knowledge base composed of a static part and a dynamic part. The larger the static part, the more redundancy there is in these repeated calls. This is problematic since inefficient Gibbs sampling yields poor approximations.
We show how to apply program specialization to make Gibbs sampling more efficient. Concretely, we develop an algorithm that specializes the definitions of the query-predicates with respect to the static part of the knowledge base. In experiments on real-world benchmarks we obtain speedups of up to an order of magnitude.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.74/LIPIcs.ICLP.2010.74.pdf
Probabilistic logical models
probabilistic logic programming
program specialization
Gibbs sampling
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
84
93
10.4230/LIPIcs.ICLP.2010.84
article
Focused Proof Search for Linear Logic in the Calculus of Structures
Guenot, Nicolas
The proof-theoretic approach to logic programming has benefited from the introduction of focused proof systems, through the reduction of and control over non-determinism they provide when searching for proofs in the sequent calculus. However, this technique was not available in the calculus of structures, known for inducing even more non-determinism than other logical formalisms. This work in progress aims at translating the notion of focusing into the presentation of linear logic in this setting, and at using some of its specific features, such as deep application of rules and fine granularity, to improve proof search procedures. The starting point for this research line is the multiplicative fragment of linear logic, for which a simple focused proof system can be built.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.84/LIPIcs.ICLP.2010.84.pdf
Proof theory
focusing
proof search
deep inference
linear logic
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
94
103
10.4230/LIPIcs.ICLP.2010.94
article
Sampler Programs: The Stable Model Semantics of Abstract Constraint Programs Revisited
Janhunen, Tomi
Abstract constraint atoms provide a general framework for the study of aggregates utilized in answer set programming. Such primitives suitably increase the expressive power of rules and enable more concise representation of various domains as answer set programs. However, it is non-trivial to generalize the stable model semantics for programs involving arbitrary abstract constraint atoms. For instance, a nondeterministic variant of the immediate consequence operator is needed, or the definition of stable models cannot be stated directly using primitives of logic programs. In this paper, we propose sampler programs as a relaxation of abstract constraint programs that better lend themselves to the program transformation involved in the definition of stable models. Consequently, the declarative nature of stable models can be restored for sampler programs, and abstract constraint programs are also covered if decomposed into sampler programs. Moreover, we study the relationships of the classes of programs involved and provide a characterization in terms of abstract but essentially deterministic computations. This result indicates that all nondeterminism related to abstract constraint atoms can be resolved at the level of the program reduct when sampler programs are used as the intermediate representation.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.94/LIPIcs.ICLP.2010.94.pdf
Stable models
abstract constraints
program reduction
translation
choice rules
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
104
113
10.4230/LIPIcs.ICLP.2010.104
article
A Framework for Verification and Debugging of Resource Usage Properties: Resource Usage Verification
Lopez-Garcia, Pedro
Darmawan, Luthfi
Bueno, Francisco
We present a framework for (static) verification of general resource usage program properties. The framework extends the criteria of correctness to conformance of a program to a specification expressing non-functional global properties, such as upper and lower bounds on execution time, memory, energy, or user-defined resources, given as functions on input data sizes. A given specification can include both lower and upper bound resource usage functions, i.e., it can express intervals within which the resource usage is supposed to lie. We have defined an abstract semantics for resource usage properties and operations to compare the (approximated) intended semantics of a program (i.e., the specification) with approximated semantics inferred by static analysis. These operations include the comparison of arithmetic functions (e.g., polynomial, exponential or logarithmic functions). A novel aspect of our framework is that the static checking of assertions generates answers that include conditions under which a given specification can be proved or disproved. For example, these conditions can express intervals of input data sizes such that a given specification can be proved for some intervals but disproved for others. We have implemented our techniques within the Ciao/CiaoPP system in a natural way, so that the novel resource usage verification blends in with the CiaoPP framework that unifies static verification and static debugging (as well as run-time verification and unit testing).
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.104/LIPIcs.ICLP.2010.104.pdf
Program Verification and Debugging
Cost Analysis
Resource Usage Analysis
Complexity Analysis
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
114
123
10.4230/LIPIcs.ICLP.2010.114
article
Contractibility and Contractible Approximations of Soft Global Constraints
Maher, Michael J.
We study contractibility and its approximation for two very general classes of soft global constraints. We introduce a general formulation of decomposition-based soft constraints and provide a sufficient condition for contractibility and an approach to approximation. For edit-based soft constraints, we establish that the tightest contractible approximation cannot be expressed in edit-based terms, in general.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.114/LIPIcs.ICLP.2010.114.pdf
Constraint logic programming
global constraints
open constraints
soft constraints
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
124
133
10.4230/LIPIcs.ICLP.2010.124
article
Dedicated Tabling for a Probabilistic Setting
Mantadelis, Theofrastos
Janssens, Gerda
ProbLog is a probabilistic framework that extends Prolog with probabilistic facts. To compute the probability of a query, the complete SLD proof tree of the query is collected as a sum of products. ProbLog applies advanced techniques to make this feasible and to compute the correct probability. Tabling is a well-known technique to avoid repeated subcomputations and to terminate loops. We investigate how tabling can be used in ProbLog. The challenge is that we have to reconcile tabling with the advanced ProbLog techniques. While standard tabling collects only the answers for the calls, we do need the SLD proof tree. Finally we discuss how to deal with loops in our probabilistic framework. By avoiding repeated subcomputations, our tabling approach not only improves the execution time of ProbLog programs, but also decreases the memory consumption accordingly. We obtain promising results for ProbLog programs using exact probability inference.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.124/LIPIcs.ICLP.2010.124.pdf
Tabling
Loop Detection
Probabilistic Logical Programming
ProbLog
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
134
143
10.4230/LIPIcs.ICLP.2010.134
article
Tight Semantics for Logic Programs
Pereira, Luis Moniz
Pinto, Alexandre Miguel
We define the Tight Semantics (TS), a new semantics for all NLPs complying with the requirements of: 2-valued semantics; preservation of the models of SM; guarantee of model existence, even in the face of Odd Loops Over Negation (OLONs) or infinite chains; relevance; cumulativity; and compliance with the Well-Founded Model.
When complete models are unnecessary, and top-down querying (à la Prolog) is desired, TS provides the 2-valued option that guarantees model existence, as a result of its relevance property. Top-down querying with abduction by need is rendered available too by TS. The user need not pay the price of computing whole models, nor that of generating all possible abductions, only to filter irrelevant ones subsequently.
A TS model of an NLP P is any minimal model M of P that further satisfies P^ (the program remainder of P) in that each loop in P^ has a minimal model (MM) contained in M, whilst respecting the constraints imposed by the MMs of the other loops so constrained too.
The applications afforded by TS are all those of Stable Models, which it generalizes, plus those permitting to solve OLONs for model existence, plus those employing OLONs for productively obtaining problem solutions, not just filtering them (like Integrity Constraints).
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.134/LIPIcs.ICLP.2010.134.pdf
Normal Logic Programs
Relevance
Cumulativity
Stable Models
Well-Founded Semantics
Program Remainder
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
144
153
10.4230/LIPIcs.ICLP.2010.144
article
From Relational Specifications to Logic Programs
Near, Joseph P.
This paper presents a compiler from expressive, relational specifications to logic programs. Specifically, the compiler translates the Imperative Alloy specification language to Prolog. Imperative Alloy is a declarative, relational specification language based on first-order logic and extended with imperative constructs; Alloy specifications are traditionally not executable. In spite of this theoretical limitation, the compiler produces useful prototype implementations for many specifications.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.144/LIPIcs.ICLP.2010.144.pdf
logic programming
specification languages
executable specifications
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
154
161
10.4230/LIPIcs.ICLP.2010.154
article
Methods and Methodologies for Developing Answer-Set Programs - Project Description
Oetsch, Johannes
Pührer, Jörg
Tompits, Hans
Answer-set programming (ASP) is a well-known formalism for declarative problem solving, enjoying a continuously increasing number of diverse applications. However, arguably one of the main challenges for a wider acceptance of ASP is the need of tools, methods, and methodologies that support the actual programming process. In this paper, we review the main goals of a project, funded by the Austrian Science Fund (FWF), which aims to address this aspect in a systematic manner. The project is planned for a duration of three years and started in September 2009. Generally, the focus of research will be on methodologies for systematic program development, program testing, and debugging. In particular, in working on these areas, special emphasis shall be given to the ability of the developed techniques to respect the declarative nature of ASP. To support a sufficient level of usability, solutions are planned to be compatible not only with the core language of ASP but also with important extensions thereof that are commonly used and realised in various answer-set solvers. Ultimately, the methods resulting from the project shall form the basis of an integrated development environment (IDE) for ASP that is envisaged to combine straightforward as well as advanced techniques, realising a convenient tool for developing answer-set programs.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.154/LIPIcs.ICLP.2010.154.pdf
Answer-set programming
program development
testing
debugging
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
162
171
10.4230/LIPIcs.ICLP.2010.162
article
Tabling and Answer Subsumption for Reasoning on Logic Programs with Annotated Disjunctions
Riguzzi, Fabrizio
Swift, Terrance
The paper presents the algorithm "Probabilistic Inference with Tabling and Answer subsumption" (PITA) for computing the probability of queries from Logic Programs with Annotated Disjunctions. PITA is based on a program transformation technique that adds an extra argument to every atom. PITA uses tabling for saving intermediate results and answer subsumption for combining different answers for the same subgoal. PITA has been implemented in XSB and compared with the ProbLog, cplint and CVE systems. The results show that in almost all cases, PITA is able to solve larger problems and is faster than competing algorithms.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.162/LIPIcs.ICLP.2010.162.pdf
Probabilistic Logic Programming
Tabling
Answer Subsumption
Logic Programs with Annotated Disjunction
Program Transformation
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
172
181
10.4230/LIPIcs.ICLP.2010.172
article
Subsumer: A Prolog theta-subsumption engine
Santos, Jose
Muggleton, Stephen
State-of-the-art theta-subsumption engines like Django (C) and Resumer2 (Java) are implemented in imperative languages. Since theta-subsumption is inherently a logic problem, in this paper we explore how to efficiently implement it in Prolog.
Theta-subsumption is an important problem in computational logic and is particularly relevant to the Inductive Logic Programming (ILP) community, as it is at the core of the hypothesis coverage test, which is often the bottleneck of an ILP system. Also, since most of those systems are implemented in Prolog, they can immediately take advantage of a Prolog-based theta-subsumption engine.
We present a relatively simple (~1000 lines in Prolog) but efficient and general theta-subsumption engine, Subsumer. Crucial to Subsumer's performance is the dynamic and recursive decomposition of a clause in sets of independent components. Also important are ideas borrowed from constraint programming that empower Subsumer to efficiently work on clauses with up to several thousand literals and several dozen distinct variables.
Using the notoriously challenging Phase Transition dataset, we show that, in terms of CPU time, Subsumer clearly outperforms the Django subsumption engine and is competitive with the more sophisticated, state-of-the-art Resumer2. Furthermore, Subsumer's memory requirements are only a small fraction of those engines' and it can handle arbitrary Prolog clauses, whereas Django and Resumer2 can only handle Datalog clauses.
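The theta-subsumption test at the core of this work can be sketched as a backtracking search for a matching substitution. The Python sketch below uses a hypothetical tuple encoding of literals and illustrates only the basic problem, not Subsumer's component decomposition or constraint-programming techniques:

```python
# Theta-subsumption: clause C subsumes clause D if some substitution
# theta maps every literal of C onto a literal of D. Literals are
# tuples; uppercase strings are variables. Hypothetical illustration
# of the problem Subsumer solves, not Subsumer itself.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def extend(lit, target, theta):
    """Extend theta so that lit maps onto the ground literal target."""
    if lit[0] != target[0] or len(lit) != len(target):
        return None
    th = dict(theta)
    for a, b in zip(lit[1:], target[1:]):
        if is_var(a):
            if a in th and th[a] != b:
                return None
            th[a] = b
        elif a != b:
            return None
    return th

def subsumes(c, d, theta=None):
    """Backtracking search for theta mapping all literals of c into d."""
    theta = theta or {}
    if not c:
        return True
    lit, rest = c[0], c[1:]
    for target in d:  # try every literal of d as the image of lit
        th = extend(lit, target, theta)
        if th is not None and subsumes(rest, d, th):
            return True
    return False

# p(X,Y), p(Y,Z) subsumes p(a,b), p(b,c) via {X: a, Y: b, Z: c}.
C = [("p", "X", "Y"), ("p", "Y", "Z")]
D = [("p", "a", "b"), ("p", "b", "c")]
ok = subsumes(C, D)
```

This naive search is exponential in the worst case; the clause decomposition and constraint-propagation ideas the abstract mentions are exactly what makes engines like Subsumer practical on clauses with thousands of literals.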
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.172/LIPIcs.ICLP.2010.172.pdf
Theta-subsumption
Prolog
Inductive Logic Programming
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
182
191
10.4230/LIPIcs.ICLP.2010.182
article
Using Generalized Annotated Programs to Solve Social Network Optimization Problems
Shakarian, Paulo
Subrahmanian, V.S.
Sapino, Maria Luisa
Reasoning about social networks (labeled, directed, weighted graphs) is becoming increasingly important, and there are now models of how certain phenomena (e.g. adoption of products/services by consumers, spread of a given disease) "diffuse" through the network. Some of these diffusion models can be expressed via generalized annotated programs (GAPs). In this paper, we consider the following problem: suppose we have a given goal to achieve (e.g. maximize the expected number of adoptees of a product or minimize the spread of a disease) and suppose we have limited resources to use in trying to achieve the goal (e.g. give out a few free plans, provide medication to key people in the social network) - how should these resources be used so that we optimize a given objective function related to the goal? We define a class of social network optimization problems (SNOPs) that supports this type of reasoning. We formalize and study the complexity of SNOPs and show how they can be used in conjunction with existing economic and disease diffusion models.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.182/LIPIcs.ICLP.2010.182.pdf
Annotated logic programming
optimization queries
social networks
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
192
201
10.4230/LIPIcs.ICLP.2010.192
article
Abductive Inference in Probabilistic Logic Programs
Simari, Gerardo
Subrahmanian, V.S.
Action-probabilistic logic programs (ap-programs) are a class of probabilistic logic programs that have been extensively used during the last few years for modeling behaviors of entities. Rules in ap-programs have the form "If the environment in which entity E operates satisfies certain conditions, then the probability that E will take some action A is between L and U". Given an ap-program, we are interested in trying to change the environment, subject to some constraints, so that the probability that entity E takes some action (or combination of actions) is maximized. This is called the Basic Probabilistic Logic Abduction Problem (Basic PLAP). We first formally define and study the complexity of Basic PLAP and then provide an exact (exponential) algorithm to solve it, followed by more efficient algorithms for specific subclasses of the problem. We also develop appropriate heuristics to solve Basic PLAP efficiently.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.192/LIPIcs.ICLP.2010.192.pdf
Probabilistic Logic Programming
Imprecise Probabilities
Abductive Inference
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
202
211
10.4230/LIPIcs.ICLP.2010.202
article
Circumscription and Projection as Primitives of Logic Programming
Wernhard, Christoph
We pursue a representation of logic programs as classical first-order sentences. Different semantics for logic programs can then be expressed by the way in which they are wrapped into - semantically defined - operators for circumscription and projection. (Projection is a generalization of second-order quantification.) We demonstrate this for the stable model semantics, Clark's completion and a three-valued semantics based on the Fitting operator. To represent the latter, we utilize the polarity sensitivity of projection, in contrast to second-order quantification, and a variant of circumscription that allows expressing predicate minimization in parallel with maximization. In accord with the aim of an integrated view of different logic-based representation techniques, the material is worked out on the basis of first-order logic with a Herbrand semantics.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.202/LIPIcs.ICLP.2010.202.pdf
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
212
221
10.4230/LIPIcs.ICLP.2010.212
article
Timed Definite Clause Omega-Grammars
Saeedloei, Neda
Gupta, Gopal
We propose timed context-free grammars (TCFGs) and show how parsers for such grammars can be developed using definite clause grammars (DCGs) coupled with constraints over reals (CLP(R)). Timed context-free grammars describe timed context-free languages (TCFLs). We next extend timed context-free grammars to timed context-free omega-grammars (omega-TCFGs for brevity) and incorporate co-inductive logic programming in DCGs to obtain parsers for them. Timed context-free omega-grammars describe timed context-free languages containing infinite-sized words, and are a generalization of timed omega-regular languages recognized by timed automata. We show a practical application of omega-TCFGs to the well-known generalized railroad crossing problem.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.212/LIPIcs.ICLP.2010.212.pdf
Constraint Logic Programming over reals
Co-induction
Context-Free Grammars
Omega-Grammars
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
222
225
10.4230/LIPIcs.ICLP.2010.222
article
Towards a Parallel Virtual Machine for Functional Logic Programming
Alqaddoumi, Abdulla
Functional logic programming is a multi-paradigm style of programming that combines the best features of functional programming and logic programming. Functional programming provides mechanisms for demand-driven evaluation, higher-order functions and polymorphic typing. Logic programming deals with non-determinism, partial information and constraints. Both programming paradigms fall under the umbrella of declarative programming. For the most part, the current implementations of functional logic languages belong to one of two categories: (1) Implementations that include the logic programming features in a functional language. (2) Implementations that extend logic languages with functional programming features. In this paper we describe ongoing research efforts to build a parallel virtual machine that performs functional logic computations. The virtual machine will tackle several issues that other implementations do not: (1) Sharing of sub-terms among different terms, especially when such sub-terms are evaluated to more than one value (non-determinism). (2) Exploitation of all forms of parallelism present in computations. The evaluation strategy used to evaluate functional logic terms is needed narrowing, which is a sound and complete strategy.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.222/LIPIcs.ICLP.2010.222.pdf
Functional logic programming
term rewriting system
non-determinism
needed narrowing
and-parallelism
or-parallelism
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
226
235
10.4230/LIPIcs.ICLP.2010.226
article
Dynamic Magic Sets for Disjunctive Datalog Programs
Alviano, Mario
Answer set programming (ASP) is a powerful formalism for knowledge representation and common sense reasoning that allows disjunction in rule heads and nonmonotonic negation in bodies. Magic Sets are a technique for optimizing query answering over logic programs and have been originally defined for standard Datalog, that is, ASP without disjunction and negation. Essentially, the input program is rewritten in order to identify a subset of the program instantiation which is sufficient for answering the query.
Dynamic Magic Sets (DMS) are an extension of this technique to ASP. The optimization provided by DMS can be exploited also during the nondeterministic phase of ASP systems. In particular, after some assumptions have been made during the computation, parts of the program may become irrelevant to a query (because of these assumptions). This allows for dynamic pruning of the search space, which may result in exponential performance gains.
DMS has been implemented in the dlv system and experimental results confirm the effectiveness of the technique.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.226/LIPIcs.ICLP.2010.226.pdf
Answer set programming
decidability
magic sets
disjunctive logic programs
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
236
240
10.4230/LIPIcs.ICLP.2010.236
article
Bisimilarity in Concurrent Constraint Programming
Aristizabal P., Andres A.
In this doctoral work we aim at developing a new approach to labelled semantics and equivalences for Concurrent Constraint Programming (CCP) which will enable a broader capture of processes' behavioural equivalence. Moreover, we work towards exploiting the strong connection between first-order logic and CCP, which will allow us to represent logical formulae in terms of CCP processes and verify their logical equivalence by means of our notion of bisimilarity. Finally, following the lines of the Concurrency Workbench, we plan to implement a CCP Workbench based on our theoretical structure.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.236/LIPIcs.ICLP.2010.236.pdf
Concurrent Constraint Programming
Concurrency
Behavioural Equivalence
Bisimilarity
Process Calculi
Operational Semantics
Labelled Semantics
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
241
247
10.4230/LIPIcs.ICLP.2010.241
article
Program analysis for code duplication in logic programs
Dandois, Celine
In this PhD project, we deal with the issue of code duplication in logic programs. In particular, semantic duplication or redundancy is generally viewed as a possible seed of inconvenience in all phases of the program lifecycle, from development to maintenance. The core of this research is the elaboration of a theory of semantic duplication and of an automated program analysis capable of detecting such duplication, which could steer, to some extent, automatic refactoring of program code.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.241/LIPIcs.ICLP.2010.241.pdf
Logic programming
program comprehension
static program analysis
code duplication
code clone
software engineering
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
248
254
10.4230/LIPIcs.ICLP.2010.248
article
Program Analysis to Support Concurrent Programming in Declarative Languages
Demeyer, Romain
In recent years, processor manufacturers have been focusing on parallel architectures in order to increase performance.
This shift in hardware evolution is provoking a fundamental turn towards concurrency in software development.
Unfortunately, developing concurrent programs which are correct and efficient is hard, as the underlying programming model is much more complex than it is for simple sequential programs.
The goal of this research is to study and to develop program analysis to support and improve concurrent software development in declarative languages. The characteristics of these languages offer opportunities, as they are good candidates for building concurrent applications, while their simple and uniform data representation, together with a small and formally defined semantics, makes them well-adapted to automatic program analysis techniques. In our work, we focus primarily on developing static analysis techniques for detecting race conditions at the application level in Mercury and Prolog programs. A further step is to derive (semi-)automatically the location and the granularity of the critical sections using a data-centric approach.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.248/LIPIcs.ICLP.2010.248.pdf
Program Analysis
Concurrent Programming
Logic Languages
Abstract Interpretation
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
255
264
10.4230/LIPIcs.ICLP.2010.255
article
Constraint Answer Set Programming Systems
Drescher, Christian
We present an integration of answer set programming and constraint processing as an interesting approach to constraint logic programming.
Although our research is in a very early stage, we motivate constraint answer set programming and report on related work, our research objectives, preliminary results we achieved, and future work.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.255/LIPIcs.ICLP.2010.255.pdf
Answer set programming
constraint logic programming
constraint processing
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
265
269
10.4230/LIPIcs.ICLP.2010.265
article
Towards a General Argumentation System based on Answer-Set Programming
Gaggl, Sarah Alice
Within the last years, especially since the work proposed by Dung in 1995, argumentation has emerged as a central issue in Artificial Intelligence. With so-called argumentation frameworks (AFs) it is possible to represent statements (arguments) together with a binary attack relation between them.
The conflicts between the statements are solved on a semantical level by selecting acceptable sets of arguments. An increasing amount of data requires an automated computation of such solutions.
Logic Programming, in particular Answer-Set Programming (ASP), has turned out to be adequate for solving problems associated with such AFs.
In this work we use ASP to design a sophisticated system for the evaluation of several types of argumentation frameworks.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.265/LIPIcs.ICLP.2010.265.pdf
Argumentation
Implementation
Answer-Set Programming
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
270
276
10.4230/LIPIcs.ICLP.2010.270
article
Models for Trustworthy Service and Process Oriented Systems
Lopez, Hugo A.
Service and process-oriented systems promise to provide more effective business and work processes and more flexible and adaptable enterprise IT systems. However, the technologies and standards are still young and unstable, making research in their theoretical foundations increasingly important.
Our studies focus on two dichotomies: the global/local views of service interactions, and their imperative/declarative specification.
A global view of service interactions describes a process as a protocol for interactions, e.g. a UML sequence diagram or a WS-CDL choreography. A local view describes the system as a set of processes, e.g. specified as pi-calculus or WS-BPEL processes, implementing each participant in the process. While the global view is what is usually provided as specification, the local view is a necessary step towards a distributed implementation.
If processes are defined imperatively, the control flow is defined explicitly, e.g. as a sequence or flow graph of interactions/commands. In a declarative approach processes are described as a collection of conditions they should fulfill in order to be considered correct. The two approaches have evolved rather independently from each other. Our thesis is that we can provide a theoretical framework based on typed concurrent process and concurrent constraint calculi for the specification, analysis and verification of service and process oriented system designs which bridges the global and local view and combines the imperative and declarative specification approaches, and can be employed to increase the trust in the developed systems.
This article describes our main motivations, results and future research directions.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.270/LIPIcs.ICLP.2010.270.pdf
Concurrent Constraint Calculi
Session Types
Logic
Service and Process oriented computing
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
277
280
10.4230/LIPIcs.ICLP.2010.277
article
Design and Implementation of a Concurrent Logic Programming Language with Linear Logic Constraints
Martinez, Thierry
My thesis aims at designing a practical language as close as possible to the linear concurrent constraint (LCC) theory. The main contribution is a new operational semantics which behaves as an angelic scheduler with a tractable algorithmic complexity. This operational semantics is sound and complete with respect to the logical semantics and allows the construction of a rich language over a very simple kernel.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.277/LIPIcs.ICLP.2010.277.pdf
Concurrent logic programming
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
281
285
10.4230/LIPIcs.ICLP.2010.281
article
Higher-order Logic Learning and lambda-Progol
Pahlavi, Niels
We present our research on Higher-order Logic Learning (HOLL), which consists of adapting First-order Logic Learning (FOLL), like Inductive Logic Programming (ILP), to a Higher-order Logic (HOL) context. We describe a first working implementation of lambda-Progol, a HOLL system adapting the ILP system Progol and the HOL formalism lambda-Prolog. We compare lambda-Progol and Progol on the learning of recursive theories, showing that HOLL can, in these cases, outperform FOLL.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.281/LIPIcs.ICLP.2010.281.pdf
Inductive Logic Programming
Progol
Higher-order Logic
Higher-order Logic Learning
lambda-Prolog
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
286
288
10.4230/LIPIcs.ICLP.2010.286
article
Local Branching in a Constraint Programming Framework
Parisini, Fabio
We argue that integrating local branching in CP merges the advantages of the intensification and diversification mechanisms specific to local search methods, with constraint propagation that speeds up the neighborhood exploration by removing infeasible variable value assignments.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.286/LIPIcs.ICLP.2010.286.pdf
Local Branching
LDS
Local Search
Tree Search
Constraint Programming
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
289
293
10.4230/LIPIcs.ICLP.2010.289
article
Logic Programming Foundations of Cyber-Physical Systems
Saeedloei, Neda
Cyber-physical systems (CPS) are becoming ubiquitous. Almost every device today has a controller that reads inputs through sensors, does some processing and then performs actions through actuators. These controllers are discrete digital systems whose inputs are continuous physical quantities and whose outputs control physical (analog) devices. Thus, CPS involve both digital and analog data. In addition, CPS are assumed to run forever, and many CPS may run concurrently with each other. We will develop techniques for faithfully and elegantly modeling CPS. Our approach is based on using constraint logic programming over reals, co-induction, and coroutining.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.289/LIPIcs.ICLP.2010.289.pdf
Cyber-Physical Systems
Constraint Logic Programming over reals
Co-induction
Coroutining
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
294
299
10.4230/LIPIcs.ICLP.2010.294
article
Realizing the Dependently Typed Lambda Calculus
Snow, Zachary
Dependently typed lambda calculi such as the Edinburgh Logical Framework (LF) can encode relationships between terms in types and can naturally capture correspondences between formulas and their proofs. Such calculi can also be given a logic programming interpretation: the Twelf system is based on such an interpretation of LF. We have considered whether a conventional logic programming language can also provide the benefits of a Twelf-like system for encoding type and term dependencies through dependent typing, and whether it can do so in an efficient manner.
In particular, we have developed a simple mapping from LF specifications to a set of formulas in the higher-order hereditary Harrop (hohh) language that relates derivations and proof search between the two frameworks. We have shown that this encoding can be improved by exploiting knowledge of the well-formedness of the original LF specifications to elide much redundant type-checking information. The resulting logic program has a structure that closely follows the original specification, thereby allowing LF specifications to be viewed as meta-programs that generate hohh programs. We have proven that this mapping is correct, and, using the Teyjus implementation of lambda-Prolog, we have shown that our translation provides an efficient means for executing LF specifications, complementing the ability the Twelf system provides for reasoning about them. In addition, the translation offers new avenues for reasoning about such specifications, via reasoning over the generated hohh programs.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.294/LIPIcs.ICLP.2010.294.pdf
Logical frameworks
logic programming
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
300
302
10.4230/LIPIcs.ICLP.2010.300
article
Structured Interactive Musical Scores
Toro-Bermudez, Mauricio
Interactive Scores is a formalism for the design and performance of interactive scenarios that provides temporal relations (TRs) among the objects of the scenario. We can model TRs among objects in Time Stream Petri nets, but it is difficult to represent global constraints. This can be done explicitly in the Non-deterministic Timed Concurrent Constraint (ntcc) calculus. We want to formalize a heterogeneous system that controls in one subsystem the concurrent execution of the objects using ntcc, and audio and video processing in the other. We also plan to develop an automatic verifier for ntcc.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.300/LIPIcs.ICLP.2010.300.pdf
Ntcc
ccp
interactive scores
temporal relations
faust
ntccrt
heterogeneous systems
automatic verification
eng
Schloss Dagstuhl – Leibniz-Zentrum für Informatik
Leibniz International Proceedings in Informatics
1868-8969
2010-06-25
7
303
305
10.4230/LIPIcs.ICLP.2010.303
article
Cutting-Edge Timing Analysis Techniques
Zwirchmayr, Jakob
This text gives an overview of my current research in timing analysis at the Vienna University of Technology. A short introduction to the topic is followed by the description of an approach relying on CLP: the implicit path enumeration technique (IPET). This technique is also used in a tool developed at the Institute of Computer Languages (TuBound). Current timing analysis tools suffer from a few flaws that are worth further investigation in order to achieve better results than current state-of-the-art timing analysis tools.
https://drops.dagstuhl.de/storage/00lipics/lipics-vol007-iclp2010/LIPIcs.ICLP.2010.303/LIPIcs.ICLP.2010.303.pdf
Verification
timing analysis
hard real-time systems
static analysis
worst-case execution time
loop-invariants
nested loop
symbolic computation