An articulated approach to the development and evaluation of automated feedback for online MCQ quizzes in Human Biology
dc.contributor.author | Meyer, J. | |
dc.contributor.author | Fyfe, Susan | |
dc.contributor.author | Fyfe, Georgina | |
dc.contributor.author | Ziman, M. | |
dc.contributor.author | Plastow, K. | |
dc.contributor.author | Sanders, K. | |
dc.contributor.author | Hill, Julie | |
dc.contributor.editor | K Placing | |
dc.date.accessioned | 2017-01-30T14:48:08Z | |
dc.date.available | 2017-01-30T14:48:08Z | |
dc.date.created | 2009-03-05T00:55:47Z | |
dc.date.issued | 2007 | |
dc.identifier.citation | Meyer, Jan and Fyfe, Susan and Fyfe, Georgina and Ziman, Mel and Plastow, Kayty and Sanders, Kathy and Hill, Julie. 2007. An articulated approach to the development and evaluation of automated feedback for online MCQ quizzes in Human Biology, in K Placing (ed), UniServe Science: Science Teaching and Learning Research including Threshold Concepts, Sep 27 2007, pp. 52-57. University of Sydney: UniServe Science. | |
dc.identifier.uri | http://hdl.handle.net/20.500.11937/41092 | |
dc.description.abstract | This paper describes an articulated programme of development and evaluation of automatically-presented explanatory feedback comments for online, enriched multiple-choice style quizzes in Human Biology for first year university courses. The degree of articulation of the separate components of the programme arose almost unintentionally from the inclusion of common sets of demographic questions in several of the components of the work, and from continuity of logon identities, but proved to be a powerful means of reaching an understanding of the dynamics of student engagement with the online learning process and of the effectiveness of the product we were testing. In particular, links were established between expectations of academic performance and the amount of paid employment in which students were engaged, and between expected and achieved levels of performance. Students who expected lower levels of performance at the outset were also less convinced of the potential of feedback to help them with their studies. Analysis of the patterns of use of the online test revealed that current accessibility arrangements for online summative assessments seriously disadvantaged working students, and that the standard duration of the summative tests was approximately three times the preferred online work span of the younger students. ‘Dose’- and ‘decay’-graded selective improvements in end of semester assessments in the topics covered by the feedback comments could be demonstrated. | |
dc.publisher | UniServe Science | |
dc.title | An articulated approach to the development and evaluation of automated feedback for online MCQ quizzes in Human Biology | |
dc.type | Conference Paper | |
dcterms.source.startPage | 52 | |
dcterms.source.endPage | 57 | |
dcterms.source.title | Proceedings of the Assessment in Science Teaching and Learning Symposium | |
dcterms.source.series | Proceedings of the Assessment in Science Teaching and Learning Symposium | |
dcterms.source.isbn | 978-1-74210-005-0 | |
dcterms.source.conference | UniServe Science: Science Teaching and Learning Research including Threshold Concepts | |
dcterms.source.conference-start-date | Sep 27 2007 | |
dcterms.source.conferencelocation | University of Sydney | |
dcterms.source.place | The University of Sydney, New South Wales | |
curtin.accessStatus | Fulltext not available | |
curtin.faculty | Faculty of Health Sciences | |
curtin.faculty | School of Public Health |