Performance analysis of GAME: A generic automated marking environment
This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME is designed to automatically assess programming assignments written in a variety of languages, based on the "structure" of the source code and the correctness of the program's output. Currently, the system can mark programs written in Java, C++, and C. To use the system, instructors provide a simple "marking schema" for each assessment item, which includes pertinent information such as the location of files and the model solution. In this research, GAME has been tested on a number of student programming exercises and assignments, and its performance has been compared against that of a human marker. An in-depth statistical analysis of the comparison is presented, providing encouraging results and directions for employing GAME as a tool for teaching and learning.
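The abstract does not reproduce the schema format itself. Purely as an illustration of the kind of information described (file locations, a model solution, and marks split between code structure and output correctness), such a schema might be sketched as follows; all field names, weights, and paths here are hypothetical, not GAME's actual format:

```python
# Hypothetical sketch of a "marking schema" of the kind the abstract
# describes. Field names, paths, and weights are illustrative only,
# not GAME's real configuration format.
schema = {
    "language": "java",                     # Java, C++, or C
    "submission_dir": "submissions/asgn1",  # location of student files
    "model_solution": "models/Asgn1.java",  # instructor's model solution
    "weights": {"structure": 0.4, "output": 0.6},
}

def combined_mark(structure_score: float, output_score: float,
                  weights: dict) -> float:
    """Combine a structure score and an output-correctness score
    (each in [0, 1]) into a single mark using the schema's weights."""
    return (weights["structure"] * structure_score
            + weights["output"] * output_score)

# Example: strong structure, fully correct output.
mark = combined_mark(0.8, 1.0, schema["weights"])
```

A weighted combination of this sort is one plausible way to merge the two assessment dimensions the paper names; the actual aggregation GAME uses is detailed in the full text.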
Computers and Education
© 2008 Elsevier. This is the author-manuscript version of this paper. Reproduced in accordance with the copyright policy of the publisher. Please refer to the journal's website for access to the definitive, published version.
Science, Technology and Engineering Curriculum and Pedagogy