Neural-Symbolic Temporal Decision Trees for Multivariate Time Series Classification

Authors: Giovanni Pagliarini, Simone Scaboro, Giuseppe Serra, Guido Sciavicco, Ionel Eduard Stan




File

LIPIcs.TIME.2022.13.pdf
  • Filesize: 0.69 MB
  • 15 pages

Author Details

Giovanni Pagliarini
  • Department of Mathematics and Computer Science, University of Ferrara, Italy
  • Department of Mathematics, Physics, and Computer Science, University of Parma, Italy
Simone Scaboro
  • Department of Mathematics, Physics, and Computer Science, University of Udine, Italy
Giuseppe Serra
  • Department of Mathematics, Physics, and Computer Science, University of Udine, Italy
Guido Sciavicco
  • Department of Mathematics and Computer Science, University of Ferrara, Italy
Ionel Eduard Stan
  • Department of Mathematics and Computer Science, University of Ferrara, Italy
  • Department of Mathematics, Physics, and Computer Science, University of Parma, Italy

Cite As

Giovanni Pagliarini, Simone Scaboro, Giuseppe Serra, Guido Sciavicco, and Ionel Eduard Stan. Neural-Symbolic Temporal Decision Trees for Multivariate Time Series Classification. In 29th International Symposium on Temporal Representation and Reasoning (TIME 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 247, pp. 13:1-13:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)
https://doi.org/10.4230/LIPIcs.TIME.2022.13

Abstract

Multivariate time series classification is a widely known problem, and its applications are ubiquitous. Due to their strong generalization capability, neural networks have proven to be very powerful for the task, but their applicability is often limited by their intrinsic black-box nature. Recently, temporal decision trees have been shown to be a serious alternative to neural networks for the same task in terms of classification performance, while attaining higher levels of transparency and interpretability. In this work, we propose an initial approach to neural-symbolic temporal decision trees, that is, a hybrid method that leverages both the ability of neural networks to capture temporal patterns and the flexibility of temporal decision trees to make decisions on intervals based on (possibly externally computed) temporal features. Although based on a proof-of-concept implementation, neural-symbolic temporal decision trees show promising results in our experiments on public datasets.
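To make the hybrid idea concrete, the following is a minimal, speculative sketch of a single decision node that thresholds a neural temporal feature computed on an interval of a multivariate time series. It is not the authors' implementation: the names neural_feature and IntervalSplit, the fixed random projection standing in for a trained network, and the toy data are all illustrative assumptions.

# Minimal sketch (assumed names and toy data): a tree node that tests a
# neural feature computed on an interval of a multivariate time series.
import numpy as np

def neural_feature(window: np.ndarray) -> float:
    """Stand-in for a trained network: a fixed random projection followed by
    a nonlinearity, applied to a (variables x interval_length) window."""
    rng = np.random.default_rng(0)                 # fixed seed, so the "weights" stay constant
    w = rng.standard_normal(window.shape[1])       # one weight per time step of the interval
    return float(np.tanh(window @ w).mean())       # pool variables into a single scalar feature

class IntervalSplit:
    """Decision node asking: is the feature on interval [a, b) above a threshold?"""
    def __init__(self, a: int, b: int, threshold: float):
        self.a, self.b, self.threshold = a, b, threshold

    def test(self, series: np.ndarray) -> bool:
        # series has shape (n_variables, series_length); slice out the interval
        return neural_feature(series[:, self.a:self.b]) > self.threshold

# Toy usage: route a 3-variable series of length 20 through one split.
series = np.sin(np.linspace(0, 4, 60)).reshape(3, 20)
node = IntervalSplit(a=5, b=15, threshold=0.0)
print("class A" if node.test(series) else "class B")

In a full tree, many such interval tests would be arranged hierarchically, with the split intervals and thresholds chosen during induction; the sketch only illustrates how a neural feature extractor can serve as the test at a single symbolic node.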

Subject Classification

ACM Subject Classification
  • Theory of computation → Theory and algorithms for application domains
Keywords
  • Machine learning
  • neural-symbolic
  • temporal logic
  • hybrid temporal decision trees

