An Experimental Analysis of GAME: A Generic Automated Marking Environment
This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages, based on the "structure" of the source code and the correctness of the program's output. Currently, the system is able to mark programs written in Java, C++ and C. To use the system, instructors are required to provide a simple "marking schema" for any given assessment item, which includes pertinent information such as the location of files and the model solution. In this research, GAME has been tested on a number of student programming exercises and assignments. The results obtained have been analysed and compared against those of a human marker, with encouraging results.
Proceedings of the 9th Annual SIGCSE Conference on Innovation and Technology in Computer Science Education