Abstract
Purpose – Delivering high-quality, timely, and formative feedback on students’ code-based coursework submissions is a problem faced by computer science (CS) educators. Automated feedback systems (AFSs) can provide immediate feedback on students’ work without requiring them to be physically present in the classroom — an increasingly important consideration for education in the context of COVID-19 lockdowns. There are concerns, however, surrounding the quality of the feedback provided by existing AFSs, with many systems simply presenting a score, a binary classification (pass/fail), or a basic error identification (‘The program could not run.’). Such feedback, offering little guidance on how to rectify any problems, raises doubts about whether these systems can stimulate deep engagement with the related knowledge or learning activities. This paper presents our experience of developing and deploying a new AFS that attempts to address these identified deficiencies.
Design/methodology/approach – We developed an AFS to mark and provide feedback to 160 CS students studying an introductory Databases class. The experience of designing, deploying, and evolving the AFS is examined through reflective practice, and focus-group (involving peer teachers) analysis. The student experience of the AFS is explored through formal university-level feedback systems, and a follow-up survey and interviews.
Findings – In contrast to most introductory-level coursework feedback and marking, which typically generate significant student reaction and requests for change, our AFS deployment resulted in zero grade challenges. There were also no identified marking errors, nor any suggested inconsistencies or unfairness. Student feedback on the AFS was universally positive, with comments indicating an AFS-related increase in student motivation. An interesting perspective that emerged from our reflections and analysis was the question of how much our own software engineering training, and our approach to building and deploying the AFS, contributed to this success.
Originality/value/implications – Our successful experience of building and using an AFS will be of interest to the entire teaching community, not just CS/SE educators. The associated increases in marking and feedback reproducibility, accountability, and automation represent an important advance in AFS technology. In collaboration with our students, we are currently evolving the AFS into an autonomous learning object that they will be able to use independently of regular classes. Eventually, we hope to release the AFS to a wider audience as an open educational resource (OER).
Original language | English |
---|---|
Title of host publication | 2021 International Conference on Open and Innovative Education (ICOIE 2021) |
Editors | Eva Tsang, Kam Cheong Li, Philips Wang |
Place of Publication | Hong Kong |
Publisher | The Open University of Hong Kong |
Pages | 268-277 |
Number of pages | 10 |
ISBN (Electronic) | 9789888439683 |
Publication status | Published - 2021 |
Event | International Conference on Open and Innovative Education - Open University of Hong Kong, Hong Kong, China |
Duration | 5 Jul 2021 → … |
Internet address | http://icoie2021.ouhk.edu.hk/ |
Conference
Conference | International Conference on Open and Innovative Education |
---|---|
Abbreviated title | ICOIE |
Country/Territory | China |
City | Hong Kong |
Period | 5/07/21 → … |
Internet address | http://icoie2021.ouhk.edu.hk/ |
Keywords
- Automated feedback systems
- student engagement
- technology-enabled student advising
- Open Education Resource (OER)