Knowledge Extraction with Interval Temporal Logic Decision Trees

Authors: Guido Sciavicco, Ionel Eduard Stan




File

LIPIcs.TIME.2020.9.pdf
  • Filesize: 497 kB
  • 16 pages

Document Identifiers
  • DOI: 10.4230/LIPIcs.TIME.2020.9

Author Details

Guido Sciavicco
  • Department of Mathematics and Computer Science, University of Ferrara, Italy
Ionel Eduard Stan
  • Department of Mathematics and Computer Science, University of Ferrara, Italy
  • Department of Mathematical, Physical, and Computer Sciences, University of Parma, Italy

Acknowledgements

Computational resources were provided by the University of Udine, Italy, through the PRID project Efforts in the uNderstanding of Complex interActing SystEms (ENCASE). The authors also acknowledge the partial support of the Italian INdAM-GNCS project Strategic Reasoning and Automated Synthesis of Multi-Agent Systems.

Cite As

Guido Sciavicco and Ionel Eduard Stan. Knowledge Extraction with Interval Temporal Logic Decision Trees. In 27th International Symposium on Temporal Representation and Reasoning (TIME 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 178, pp. 9:1-9:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)
https://doi.org/10.4230/LIPIcs.TIME.2020.9

Abstract

Multivariate temporal (or time) series classification is, in a sense, the temporal generalization of (numeric) classification: every instance is described by multiple time series instead of multiple values. Symbolic classification is the machine learning strategy for extracting explicit knowledge from a data set, and the symbolic classification of multivariate temporal series requires the design, implementation, and testing of ad-hoc machine learning algorithms, such as algorithms for the extraction of temporal versions of decision trees. One of the best-known algorithms for decision tree extraction from categorical data is Quinlan's ID3, which was later extended to deal with numerical attributes, resulting in an algorithm known as C4.5, implemented in many open-source data mining libraries, including Weka, whose implementation of C4.5 is called J48. ID3 was recently generalized to deal with temporal data in the form of timelines, which can be seen as discrete (categorical) versions of multivariate time series; this generalization, based on the interval temporal logic HS, is known as Temporal ID3. In this paper we introduce Temporal C4.5, which allows the extraction of temporal decision trees from undiscretized multivariate time series, describe its implementation, called Temporal J48, and discuss the outcome of a set of experiments with the latter on a collection of public data sets, comparing the results with those obtained by other, classical, multivariate time series classification methods.
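To make the kind of condition tested by such interval temporal decision trees more concrete, below is a minimal, illustrative Java sketch (not the authors' Temporal J48 code; all class and method names, such as IntervalSplitSketch and testMeetsP, are hypothetical) of an HS-style decision-node test of the form ⟨A⟩ p, i.e., "there exists an interval met by the current one on which proposition p holds", evaluated over a toy timeline of labelled intervals.

```java
// Illustrative sketch only, not the authors' Temporal J48 implementation;
// all names (IntervalSplitSketch, LabelledInterval, testMeetsP) are hypothetical.
import java.util.List;

public class IntervalSplitSketch {

    // A strict interval [begin, end] over discrete time points, with begin < end.
    record Interval(int begin, int end) {}

    // A labelled interval: proposition 'prop' holds over 'interval' in the timeline.
    record LabelledInterval(Interval interval, String prop) {}

    // Allen's "meets" relation: j begins exactly where i ends (HS modality <A>).
    static boolean meets(Interval i, Interval j) {
        return i.end() == j.begin();
    }

    // A decision-node test of the form <A> p on a reference interval:
    // does some interval met by the reference interval satisfy proposition p?
    static boolean testMeetsP(Interval reference, String p, List<LabelledInterval> timeline) {
        return timeline.stream()
                .anyMatch(li -> li.prop().equals(p) && meets(reference, li.interval()));
    }

    public static void main(String[] args) {
        // Toy timeline: "fever" holds on [0,3], "cough" holds on [3,6].
        List<LabelledInterval> timeline = List.of(
                new LabelledInterval(new Interval(0, 3), "fever"),
                new LabelledInterval(new Interval(3, 6), "cough"));

        Interval reference = new Interval(0, 3);

        System.out.println(testMeetsP(reference, "cough", timeline)); // true:  [3,6] meets [0,3] and satisfies "cough"
        System.out.println(testMeetsP(reference, "fever", timeline)); // false: no "fever" interval starts at time 3
    }
}
```

In an ID3/C4.5-style learner, candidate tests of this form would be scored (e.g., by information gain) and the best one placed at each node; the sketch only shows how a single test is evaluated.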

Subject Classification

ACM Subject Classification
  • Theory of computation
  • Theory of computation → Logic
Keywords
  • Interval Temporal Logic
  • Decision Trees
  • Explainable AI
  • Time Series


References

  1. J.F. Allen. Maintaining knowledge about temporal intervals. Communications of the ACM, 26(11):832-843, 1983.
  2. A. Bagnall, H.A. Dau, J. Lines, M. Flynn, J. Large, A. Bostrom, P. Southam, and E. Keogh. The UEA multivariate time series classification archive, 2018. URL: http://arxiv.org/abs/1811.00075.
  3. A. Bagnall, J. Lines, A. Bostrom, J. Large, and E. Keogh. The great time series classification bake off: a review and experimental evaluation of recent algorithmic advances. Data Mining and Knowledge Discovery, 31(3):606-660, 2017.
  4. A. Brunello, E. Marzano, A. Montanari, and G. Sciavicco. J48SS: A novel decision tree approach for the handling of sequential and time series data. Computers, 8(1):21, 2019.
  5. A. Brunello, G. Sciavicco, and I.E. Stan. Interval temporal logic decision tree learning. In Proc. of the 16th European Conference on Logics in Artificial Intelligence (JELIA), volume 11468 of Lecture Notes in Computer Science, pages 778-793. Springer, 2019.
  6. T.M. Cover and P.E. Hart. Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1):21-27, 1967.
  7. B.D. Fulcher and N.S. Jones. Highly comparative feature-based time-series classification. IEEE Transactions on Knowledge and Data Engineering, 26(12):3026-3037, 2014.
  8. J.Y. Halpern and Y. Shoham. A propositional modal logic of time intervals. Journal of the ACM, 38(4):935-962, 1991.
  9. L. Hyafil and R.L. Rivest. Constructing optimal binary decision trees is NP-complete. Information Processing Letters, 5(1):15-17, 1976.
  10. J.R. Quinlan. Induction of decision trees. Machine Learning, 1:81-106, 1986.
  11. J.R. Quinlan. C4.5: Programs for Machine Learning. Morgan Kaufmann, 1993.
  12. J.R. Quinlan. Simplifying decision trees. International Journal of Human-Computer Studies, 51(2):497-510, 1999.
  13. G. Sciavicco, I.E. Stan, and A. Vaccari. Towards a general method for logical rule extraction from time series. In Proc. of the 8th International Work-Conference on the Interplay Between Natural and Artificial Computation (IWINAC), volume 11487 of Lecture Notes in Computer Science, pages 3-12. Springer, 2019.
  14. M. Shokoohi-Yekta, J. Wang, and E. Keogh. On the non-trivial generalization of dynamic time warping to the multi-dimensional case. In Proc. of the 15th SIAM International Conference on Data Mining, pages 289-297, 2015.
  15. I.H. Witten, E. Frank, and M.A. Hall. Data Mining: Practical Machine Learning Tools and Techniques. Morgan Kaufmann, 4th edition, 2017.
  16. J. Wu, L.H. Yao, and B. Liu. An overview on feature-based classification algorithms for multivariate time series. In Proc. of the 3rd IEEE International Conference on Cloud Computing and Big Data Analysis, pages 32-38, 2018.