Can a Content Management System Provide a Good User Experience to Teachers? (Short Paper)

Authors: Yannik Bauer, José Paulo Leal, Ricardo Queirós




File

OASIcs.ICPEC.2023.4.pdf
  • Filesize: 0.7 MB
  • 8 pages

Document Identifiers
  • DOI: 10.4230/OASIcs.ICPEC.2023.4

Author Details

Yannik Bauer
  • DCC - FCUP, Porto, Portugal
  • CRACS - INESC TEC, Porto, Portugal
José Paulo Leal
  • CRACS - INESC TEC, Porto, Portugal
  • DCC - FCUP, Porto, Portugal
Ricardo Queirós
  • CRACS - INESC TEC, Porto, Portugal
  • uniMAD - ESMAD, Polytechnic of Porto, Portugal

Cite As

Yannik Bauer, José Paulo Leal, and Ricardo Queirós. Can a Content Management System Provide a Good User Experience to Teachers? (Short Paper). In 4th International Computer Programming Education Conference (ICPEC 2023). Open Access Series in Informatics (OASIcs), Volume 112, pp. 4:1-4:8, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023). https://doi.org/10.4230/OASIcs.ICPEC.2023.4

Abstract

The paper discusses an ongoing project that aims to enhance the user experience (UX) of teachers using e-learning systems. Specifically, the project focuses on developing the teacher's user interface (UI) for Agni, a web-based code playground for learning JavaScript. The goal is to design an intuitive UI with valuable features that encourages more teachers to use the system. To achieve this goal, the paper explores the use of a headless Content Management System (CMS) called Strapi. The primary research question is whether a headless CMS, specifically Strapi, can provide a good UX to teachers. A usability evaluation of the built-in Strapi UI for content creation and management reveals that it is generally consistent and user-friendly, but that creating courses with programming exercises is challenging and unintuitive. As a result, the decision was made to develop a new teacher's UI as an editable version of the existing Agni student UI. Once development is complete, a new usability evaluation of the fully developed teacher's UI will be conducted, using the Strapi UI evaluation as a baseline for comparison.
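As context for the headless CMS approach described above, the sketch below illustrates how a client such as the Agni playground could read course content from a Strapi backend over its REST API. The "courses" collection, its "exercises" relation, and all field names are hypothetical examples introduced here for illustration; they are not taken from the paper.

// Minimal sketch of reading course content from a headless CMS such as Strapi.
// The "courses" collection, its "exercises" relation, and the field names are
// hypothetical; Strapi v4 exposes collection types under /api/<plural-name>
// and wraps responses in a { data, meta } envelope.
const STRAPI_URL = "http://localhost:1337"; // default Strapi development address

async function fetchCourseExercises(courseId) {
  // populate=exercises asks Strapi to include the related exercise entries.
  const response = await fetch(
    `${STRAPI_URL}/api/courses/${courseId}?populate=exercises`
  );
  if (!response.ok) {
    throw new Error(`CMS request failed with status ${response.status}`);
  }
  const { data } = await response.json(); // { data: { id, attributes }, meta }
  // In Strapi v4, related entries are nested under attributes.<relation>.data.
  return data.attributes.exercises.data.map((entry) => entry.attributes);
}

// Example usage: print the titles of the exercises in course 1.
fetchCourseExercises(1)
  .then((exercises) => exercises.forEach((e) => console.log(e.title)))
  .catch(console.error);

A teacher-facing UI built on top of the same content API would create and edit such entries, which is roughly the authoring workflow whose usability the paper evaluates.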

Subject Classification

ACM Subject Classification
  • Applied computing → Interactive learning environments
Keywords
  • learning environment
  • programming exercises
  • programming learning
  • automatic assessment
  • headless CMS
  • CMS
  • user experience

