License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/OASIcs.ICLP.2018.6
URN: urn:nbn:de:0030-drops-98722
URL: https://drops.dagstuhl.de/opus/volltexte/2018/9872/


Côrte-Real, Joana; Dries, Anton; Dutra, Inês; Rocha, Ricardo

Improving Candidate Quality of Probabilistic Logic Models

PDF: OASIcs-ICLP-2018-6.pdf (0.5 MB)


Abstract

Many real-world phenomena exhibit both relational structure and uncertainty. Probabilistic Inductive Logic Programming (PILP) uses Inductive Logic Programming (ILP) extended with probabilistic facts to produce meaningful and interpretable models for such phenomena. This combination of First Order Logic (FOL) theories and uncertainty makes PILP a well-suited tool for knowledge representation and extraction. However, this flexibility comes with a problem inherited from ILP: the search space grows exponentially, so often only a subset of all possible models can be explored with limited resources. Furthermore, the probabilistic evaluation of FOL theories, performed by the underlying probabilistic logic language and its solver, is also computationally demanding. This work introduces a prediction-based pruning strategy, which can reduce the search space based on the probabilistic evaluation of models. It comprises a safe pruning criterion, which guarantees that the optimal model is not pruned away, as well as two more aggressive alternative criteria that do not provide this guarantee. Experiments performed on three benchmarks from different areas show that prediction pruning is effective in (i) maintaining predictive accuracy across all criteria and experimental settings; (ii) reducing execution time under some of the more aggressive criteria, compared to using no pruning; and (iii) selecting better candidate models in limited-resource settings, again compared to using no pruning.
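To make the idea of a safe pruning criterion concrete, here is a minimal, self-contained sketch (not the authors' implementation) of a best-first search over candidate models that discards a candidate only when an optimistic upper bound on any of its refinements cannot exceed the best score found so far. The names `refine`, `score`, and `upper_bound`, and the toy weighted-bit domain at the end, are hypothetical illustrations, not part of the paper.

```python
import heapq

def search(initial, refine, score, upper_bound, max_expansions=1000):
    """Best-first search with safe pruning: a candidate is kept in the
    frontier only if its optimistic bound can still beat the incumbent."""
    best, best_score = initial, score(initial)
    frontier = [(-best_score, 0, initial)]  # max-heap via negated scores
    counter = 1  # tie-breaker so heapq never compares candidates directly
    expansions = 0
    while frontier and expansions < max_expansions:
        _, _, cand = heapq.heappop(frontier)
        expansions += 1
        for child in refine(cand):
            s = score(child)
            if s > best_score:
                best, best_score = child, s
            # Safe criterion: prune `child` only if even its most
            # optimistic refinement cannot exceed the current best.
            if upper_bound(child) > best_score:
                heapq.heappush(frontier, (-s, counter, child))
                counter += 1
    return best, best_score

# Toy usage: candidates are bit-prefixes, the score is a weighted sum,
# and the bound adds every still-selectable positive weight.
WEIGHTS = [3, -2, 5]

def refine(c):
    if len(c) < len(WEIGHTS):
        yield c + (0,)
        yield c + (1,)

def score(c):
    return sum(w * b for w, b in zip(WEIGHTS, c))

def upper_bound(c):
    return score(c) + sum(w for w in WEIGHTS[len(c):] if w > 0)

best, best_score = search((), refine, score, upper_bound)
```

A more aggressive criterion, as in the paper's two unsafe variants, would replace the bound test with a cheaper heuristic (e.g. requiring the child's own score to be within some margin of the incumbent), trading the optimality guarantee for a smaller frontier.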

BibTeX - Entry

@InProceedings{crtereal_et_al:OASIcs:2018:9872,
  author =	{Joana C{\^o}rte-Real and Anton Dries and In{\^e}s Dutra and Ricardo Rocha},
  title =	{{Improving Candidate Quality of Probabilistic Logic Models}},
  booktitle =	{Technical Communications of the 34th International Conference on Logic Programming (ICLP 2018)},
  pages =	{6:1--6:14},
  series =	{OpenAccess Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-090-3},
  ISSN =	{2190-6807},
  year =	{2018},
  volume =	{64},
  editor =	{Alessandro Dal Pal{\`u} and Paul Tarau and Neda Saeedloei and Paul Fodor},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{http://drops.dagstuhl.de/opus/volltexte/2018/9872},
  URN =		{urn:nbn:de:0030-drops-98722},
  doi =		{10.4230/OASIcs.ICLP.2018.6},
  annote =	{Keywords: Relational Machine Learning, Probabilistic Inductive Logic Programming, Search Space Pruning, Model Quality, Experiments}
}

Keywords: Relational Machine Learning, Probabilistic Inductive Logic Programming, Search Space Pruning, Model Quality, Experiments
Collection: Technical Communications of the 34th International Conference on Logic Programming (ICLP 2018)
Issue Date: 2018
Date of publication: 19.11.2018

