Can I Code? User Experience of an Assessment Platform for Programming Assignments (Short Paper)

Authors: Anne Münzner, Nadja Bruckmoser, and Alexander Meschtscherjakov




Document Identifiers
  • DOI: 10.4230/OASIcs.ICPEC.2021.18

Author Details

Anne Münzner
  • Center for Human-Computer Interaction, University of Salzburg, Austria
Nadja Bruckmoser
  • University of Salzburg, Austria
Alexander Meschtscherjakov
  • Center for Human-Computer Interaction, University of Salzburg, Austria

Cite As

Anne Münzner, Nadja Bruckmoser, and Alexander Meschtscherjakov. Can I Code? User Experience of an Assessment Platform for Programming Assignments (Short Paper). In Second International Computer Programming Education Conference (ICPEC 2021). Open Access Series in Informatics (OASIcs), Volume 91, pp. 18:1-18:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021).
https://doi.org/10.4230/OASIcs.ICPEC.2021.18

Abstract

Learning a programming language is difficult and poses numerous obstacles for university students, but also for their lecturers. Assessment tools for programming assignments can support both groups in this process. Such tools should be adapted to the needs of beginners and inexperienced students, yet remain helpful in long-term use. We utilised an adapted version of the Artemis system as an assessment platform for first-year computer science students in an introductory programming course. To examine the students' user experience (UX) over the semester, we conducted a three-stage online questionnaire study (N=42). We found that UX evolves over the semester and that the requirements placed on the platform, as well as the problems encountered in its use, change over time. Our results show that newcomers should be guided carefully during the first weeks of the semester to help them overcome initial hurdles, and that challenges should be added as the semester progresses.

Subject Classification

ACM Subject Classification
  • Human-centered computing
Keywords
  • Programming tool
  • user experience
  • student evaluation
  • programming assignment

