Using Property-Based Testing to Generate Feedback for C Programming Exercises

Authors: Pedro Vasconcelos, Rita P. Ribeiro




Document Identifiers
  • DOI: 10.4230/OASIcs.ICPEC.2020.28

Author Details

Pedro Vasconcelos
  • Computer Science Department, Faculty of Science, University of Porto, Portugal
  • LIACC, Porto, Portugal
Rita P. Ribeiro
  • Computer Science Department, Faculty of Science, University of Porto, Portugal
  • LIAAD-INESC TEC, Porto, Portugal

Cite As

Pedro Vasconcelos and Rita P. Ribeiro. Using Property-Based Testing to Generate Feedback for C Programming Exercises. In First International Computer Programming Education Conference (ICPEC 2020). Open Access Series in Informatics (OASIcs), Volume 81, pp. 28:1-28:10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020) https://doi.org/10.4230/OASIcs.ICPEC.2020.28

Abstract

This paper reports on the use of property-based testing for providing feedback on C programming exercises. Test cases are generated automatically from properties specified in a test script; this not only makes it possible to run many tests (and thus potentially find more mistakes), but also allows failing test cases to be simplified automatically.
We present experimental validation gathered from an introductory C programming course during the fall semester of 2018, showing significant positive correlations between obtaining feedback during the semester and students' results in the final exam. We also discuss some limitations regarding feedback for undefined behavior in the C language.
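
Illustrative Example

The abstract describes test scripts in which properties drive random test generation and failing cases are simplified automatically. As a hedged illustration of that idea, the sketch below uses Haskell QuickCheck (one of the paper's keywords) to check a C submission against a reference model; the exercise (a count_vowels function), the model, and the FFI linking are assumptions made for this sketch, not material taken from the paper.

    {-# LANGUAGE ForeignFunctionInterface #-}
    -- Minimal sketch of a property-based test for a C exercise.
    -- The exercise (count_vowels), the reference model and the FFI
    -- linking are illustrative assumptions, not the paper's scripts.
    module Main where

    import Test.QuickCheck
    import Foreign.C.String (CString, withCString)
    import Foreign.C.Types (CInt)

    -- Student submission, compiled and linked with this script:
    --   int count_vowels(const char *s);
    foreign import ccall "count_vowels"
      c_count_vowels :: CString -> IO CInt

    -- Reference model of the intended behaviour.
    modelCount :: String -> Int
    modelCount = length . filter (`elem` "aeiouAEIOU")

    -- Property: the submission agrees with the model on any printable
    -- ASCII string (avoiding embedded NUL characters).  QuickCheck
    -- generates random inputs and, on failure, shrinks the
    -- counterexample before reporting it.
    prop_matchesModel :: Property
    prop_matchesModel =
      forAllShrink (listOf (choose (' ', '~'))) shrink $ \s ->
        ioProperty $ do
          r <- withCString s c_count_vowels
          return (fromIntegral r === modelCount s)

    main :: IO ()
    main = quickCheck prop_matchesModel

Running quickCheck executes 100 randomly generated tests by default; when a test fails, the shrunk counterexample provides a small input that can be shown to the student, which corresponds to the automatic simplification of failed test cases mentioned in the abstract.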

Subject Classification

ACM Subject Classification
  • Social and professional topics → Student assessment
  • Software and its engineering → Software testing and debugging
  • Software and its engineering → Domain specific languages
Keywords
  • property-based testing
  • C language
  • Haskell language
  • teaching programming

