Using qualitative student evaluation data to illuminate quantitative scale item ratings: seeing beyond the numbers

File version

Version of Record (VoR)

Author(s)
Palmer, Stuart
Hall, Wayne
Griffith University Author(s)
Date
2019
Location

Brisbane, Australia

License
Abstract

CONTEXT
In 2014, a new course (subject) in design and manufacturing with composite materials was offered at the Griffith School of Engineering. A previous evaluation of the initial course offering found that the students generally perceived the course to be valuable, and it also offered insights into potential areas for improvement. Over the subsequent three offerings of the course (2016, 2017 and 2018), a range of deliberate changes to the course learning design were made, with the aim of improving student learning and engagement. An inspection of the data from the university student evaluation instrument for the first four course offerings showed essentially no change in mean ratings for the quantitative scale items, even though aspects of the course learning design had been deliberately changed.

PURPOSE OR GOAL
Limitations in the ability of quantitative scale item student evaluation data to reveal meaningful variation in response to changes in learning designs are described in the literature. The university student evaluation also included the option for students to provide open-ended text comments about the course. An investigation was undertaken to determine whether computer-based analysis of the student comments (text analytics) could identify differences in the students' perceptions of the course that related to the changes in the course learning design over the first four years of offer.

APPROACH OR METHODOLOGY/METHODS
Ethics approval was sought and obtained to conduct research on the student evaluation data. The quantitative scale item data were analysed to identify any significant differences in the mean student ratings over the four-year period. A text analytics method (multidimensional scaling) was applied to the student open-ended comments to identify and visualise any themes by year that might reveal differences in the students' perception of the first four offerings of the course.
ACTUAL OR ANTICIPATED OUTCOMES
The statistical analysis of the quantitative scale item data showed no significant differences in the mean student ratings over the four-year period. The text analytics analysis revealed distinct clusters of terms in the student open-ended comments by year. The term clusters observed did capture aspects of the intentional changes to the learning design over the first four years of offer of the course, providing some evidence that students did actually perceive these course changes.

CONCLUSIONS/RECOMMENDATIONS/SUMMARY
The text analytics method did reveal additional useful course evaluation information present in the open-ended comments provided by students. This qualitative data source, which many evaluation instruments routinely collect, offered a valuable complement to the quantitative scale item data, which are traditionally the focus in institutional student evaluation of teaching processes.

KEYWORDS
Student evaluation of teaching, open-ended comments, text analytics.
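The kind of text-analytics pipeline the abstract describes can be sketched as follows: vectorise open-ended comments, compute pairwise distances between them, and project the distances into two dimensions with multidimensional scaling (MDS) so that clusters of similar comments can be visualised by year. This is a minimal illustration using scikit-learn; the sample comments, the TF-IDF representation, and the library choice are assumptions for demonstration, not the authors' actual data or tooling.

```python
# Illustrative sketch only: hypothetical comments, scikit-learn tooling assumed.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.manifold import MDS
from sklearn.metrics.pairwise import cosine_distances

# Hypothetical open-ended student comments (not real evaluation data)
comments = [
    "The composite lab sessions were very hands-on and useful",
    "More lab time with composite materials would be great",
    "The new design project helped me apply the theory",
    "Group design project was the best part of the course",
]

# Represent each comment as a TF-IDF term vector
tfidf = TfidfVectorizer(stop_words="english").fit_transform(comments)

# Pairwise cosine distances between comments (0 = identical term profile)
dist = cosine_distances(tfidf)

# Metric MDS on the precomputed distance matrix -> one 2-D point per comment
coords = MDS(
    n_components=2, dissimilarity="precomputed", random_state=0
).fit_transform(dist)

print(coords.shape)  # one (x, y) coordinate per comment, ready to plot by year
```

In practice the points would be coloured by offering year, so that year-on-year clusters of terms or comments, like those reported in the abstract, become visible in the 2-D map.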

Conference Title

Australasian Association for Engineering Education Conference 2019 (AAEE)

Rights Statement

© The Author(s) 2020. The attached file is posted here with permission of the copyright owner(s) for your personal use only. No further distribution permitted. For information about this conference please refer to the conference’s website or contact the author(s).

Subject

Curriculum and pedagogy

Citation

Palmer, S; Hall, W, Using qualitative student evaluation data to illuminate quantitative scale item ratings: seeing beyond the numbers, Australasian Association for Engineering Education Conference 2019 (AAEE), 2019