Relational Knowledge Extraction from Attribute-Value Learners

Authors Manoel V. M. França, Artur S. D. Garcez, Gerson Zaverucha




File

OASIcs.ICCSW.2013.35.pdf
  • Filesize: 0.64 MB
  • 8 pages

Document Identifiers
  • DOI: 10.4230/OASIcs.ICCSW.2013.35

Author Details

Manoel V. M. França
Artur S. D. Garcez
Gerson Zaverucha

Cite As

Manoel V. M. França, Artur S. D. Garcez, and Gerson Zaverucha. Relational Knowledge Extraction from Attribute-Value Learners. In 2013 Imperial College Computing Student Workshop. Open Access Series in Informatics (OASIcs), Volume 35, pp. 35-42, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013) https://doi.org/10.4230/OASIcs.ICCSW.2013.35

Abstract

Bottom Clause Propositionalization (BCP) is a recent propositionalization method that enables fast relational learning. Propositional learners can use BCP to obtain accuracy results comparable with Inductive Logic Programming (ILP) learners. However, unlike with ILP learners, what has been learned cannot normally be represented in first-order logic. In this paper, we propose an approach and introduce a novel algorithm for the extraction of first-order rules from propositional rule learners when the data have been propositionalized with BCP. A theorem then shows that the extracted first-order rules are consistent with their propositional counterparts. The algorithm was evaluated using the rule learner RIPPER, although it can be applied to any propositional rule learner. Initial results show that the accuracies of both RIPPER and the extracted first-order rules are comparable to those obtained by Aleph (a traditional ILP system), while our approach is considerably faster (achieving speed-ups of over an order of magnitude) and generates a compact rule set with at least the same representational power as standard ILP learners.
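To illustrate the general idea described in the abstract, the sketch below shows, in Python, how a propositional rule could be mapped back to a first-order clause once each BCP feature is identified with a bottom-clause literal. This is only a hypothetical, minimal sketch under assumed data structures (literals as strings, a rule as a set of feature indices); it is not the paper's algorithm, which comes with a consistency theorem, and the eastbound-trains example is a classic ILP illustration, not necessarily a dataset used in the paper.

```python
# Minimal sketch (not the authors' code): mapping a propositional rule learned on
# BCP features back to a first-order clause.
# Assumptions: each body literal of the bottom clause is one boolean feature,
# and a propositional rule is represented by the set of feature indices it tests.

def bcp_features(bottom_clause_body):
    """Map each body literal of a bottom clause to one boolean feature (index -> literal)."""
    return {i: lit for i, lit in enumerate(bottom_clause_body)}

def extract_first_order_rule(head, feature_map, rule_feature_indices):
    """Rebuild a first-order clause from the features used by a propositional rule."""
    body = [feature_map[i] for i in sorted(rule_feature_indices)]
    return f"{head} :- " + ", ".join(body) + "."

# Hypothetical example in the style of the eastbound/westbound trains problem.
bottom_body = ["has_car(T, C)", "short(C)", "closed(C)"]
features = bcp_features(bottom_body)
# Suppose the propositional learner (e.g. RIPPER) produced a rule testing features 0 and 2.
print(extract_first_order_rule("eastbound(T)", features, {0, 2}))
# eastbound(T) :- has_car(T, C), closed(C).
```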

Keywords
  • Relational Learning
  • Propositionalization
  • Knowledge Extraction
