Towards an Automated Test Bench Environment for Prolog Systems

Authors: Ricardo Gonçalves, Miguel Areias, and Ricardo Rocha



Cite As

Ricardo Gonçalves, Miguel Areias, and Ricardo Rocha. Towards an Automated Test Bench Environment for Prolog Systems. In 6th Symposium on Languages, Applications and Technologies (SLATE 2017). Open Access Series in Informatics (OASIcs), Volume 56, pp. 2:1-2:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)
https://doi.org/10.4230/OASIcs.SLATE.2017.2

Abstract

Software testing and benchmarking are key components of the software development process. Nowadays, a good practice in large software projects is the Continuous Integration (CI) software development technique. The key idea of CI is to have developers integrate their work as they produce it, instead of performing the integration only at the end of each software module. In this paper, we extend previous work on a benchmark suite for the Yap Prolog system and propose a fully automated test bench environment for Prolog systems, named Yet Another Prolog Test Bench Environment (YAPTBE), aimed at assisting developers in the development and continuous integration of Prolog systems. YAPTBE is based on a cloud computing architecture and relies on the Jenkins framework and on a set of new Jenkins plugins to manage the underlying infrastructure. We present the key design and implementation aspects of YAPTBE and show its most important features, such as its graphical user interface and the automated process that builds and runs Prolog systems and benchmarks.
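
As a rough illustration of the kind of automated run described above, the following minimal sketch times a single Prolog benchmark goal using the standard statistics/2 and format/2 built-ins. The fib/2 workload and the run_benchmark/2 wrapper are illustrative assumptions made for this sketch, not part of YAPTBE's actual interface.

    % Minimal sketch of a benchmark driver; run by loading this file in a
    % Prolog system such as YAP or SWI-Prolog. fib/2 and run_benchmark/2
    % are illustrative only.
    :- initialization(main).

    % Naive Fibonacci, a classic CPU-bound Prolog benchmark workload.
    fib(0, 0).
    fib(1, 1).
    fib(N, F) :-
        N > 1,
        N1 is N - 1,
        N2 is N - 2,
        fib(N1, F1),
        fib(N2, F2),
        F is F1 + F2.

    % Run a goal once and report its runtime in milliseconds.
    run_benchmark(Goal, TimeMs) :-
        statistics(runtime, [Start|_]),
        ( call(Goal) -> true ; true ),
        statistics(runtime, [End|_]),
        TimeMs is End - Start.

    main :-
        run_benchmark(fib(24, F), TimeMs),
        format('fib(24) = ~d (~d ms)~n', [F, TimeMs]),
        halt.

A test bench environment could then compare the printed output against a stored expected result to check correctness and aggregate the reported timings across different Prolog systems.
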
Keywords
  • Software Engineering
  • Program Correctness
  • Benchmarking
  • Prolog
