Most significant change technique: a supplementary evaluation tool

Author(s)
Choy, Sarojni
Lidstone, John
Griffith University
Year published
2011
Abstract
A primary purpose of evaluating education and training courses is to assess how well their design and delivery aspects have met the predetermined learning objectives in order to make improvements. Thus traditional approaches to evaluation provide data on what is mainly of interest to the course designers and facilitators. Consequently, conventional data collection techniques do not necessarily seek in-depth self-reflection by the learners or what is of most significance to them. Hence the real impact of course completion is not fully understood. The Most Significant Change approach to evaluation is participatory and collects stories on the impact of the training experienced by the learners, supplementing data that provides a more holistic and richer picture of the learning outcomes from learners' perspectives. This paper reports on the use of the Most Significant Change technique to supplement data from conventional sources in order to evaluate a leadership capacity building course. Eighteen participants completed a Master of Education course over a period of two years. The Learning Experience Surveys provided mainly quantitative and some qualitative data on the students' experiences and satisfaction with teaching and learning. Stories about the most significant changes experienced by the students were recorded during interviews and confirmed at focus groups. The findings highlight the value of using the Most Significant Change as a tool for a more comprehensive evaluation of capacity building programs. Although the sample represents a group of university students, the tool offers potential for VET practitioners to extend their evaluation techniques and learn more about the impact of the education and training that they offer.
Conference Title
14th Annual Conference of the Australian Vocational Education and Training Researchers Association
Copyright Statement
© 2011 AVETRA. This is the author-manuscript version of this paper. Reproduced in accordance with the copyright policy of the publisher. Use hypertext link for access to the publisher's website.
Subject
Curriculum and Pedagogy not elsewhere classified