
    Using qualitative student evaluation data to illuminate quantitative scale item ratings: seeing beyond the numbers

    File
    Hall441125-Published.pdf (666.9Kb)
    File version
    Version of Record (VoR)
    Author(s)
    Palmer, Stuart
    Hall, Wayne
    Griffith University Author(s)
    Hall, Wayne
    Year published
    2019
    Abstract
    CONTEXT: In 2014, a new course (subject) in design and manufacturing with composite materials was offered at the Griffith School of Engineering. A previous evaluation of the initial course offering found that students generally perceived the course to be valuable, and it also offered insights into potential areas for improvement. Over the subsequent three offerings of the course (2016, 2017 and 2018), a range of deliberate changes were made to the course learning design, with the aim of improving student learning and engagement. An inspection of the data from the university student evaluation instrument for the first four course offerings showed essentially no change in mean ratings for the quantitative scale items, even though aspects of the course learning design had been deliberately changed.

    PURPOSE OR GOAL: Limitations in the ability of quantitative scale item student evaluation data to reveal meaningful variation in response to changes in learning designs are described in the literature. The university student evaluation also included the option for students to provide open-ended text comments about the course. An investigation was undertaken to determine whether computer-based analysis of the student comments (text analytics) could identify differences in the students’ perceptions of the course that related to the changes in the course learning design over the first four years of offer.

    APPROACH OR METHODOLOGY/METHODS: Ethics approval was sought and obtained to conduct research on the student evaluation data. The quantitative scale item data were analysed to identify any significant differences in the mean student ratings over the four-year period. A text analytics method (multidimensional scaling) was applied to the students’ open-ended comments to identify and visualise any themes by year that might reveal differences in the students’ perceptions of the first four offerings of the course.

    ACTUAL OR ANTICIPATED OUTCOMES: The statistical analysis of the quantitative scale item data showed no significant differences in the mean student ratings over the four-year period. The text analytics revealed distinct clusters of terms in the student open-ended comments by year. The term clusters captured aspects of the intentional changes to the learning design over the first four years of offer, providing some evidence that students did perceive these course changes.

    CONCLUSIONS/RECOMMENDATIONS/SUMMARY: The text analytics method revealed additional useful course evaluation information present in the open-ended comments provided by students. This qualitative data source, which many evaluation instruments routinely collect, offered a valuable complement to the quantitative scale item data that are traditionally the focus of institutional student evaluation of teaching processes.

    KEYWORDS: Student evaluation of teaching, open-ended comments, text analytics.
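    The abstract names multidimensional scaling (MDS) as the text analytics method. As a rough illustration only — the paper's actual term extraction, distance measure, and software are not specified here — the sketch below applies classical (Torgerson) MDS to toy bag-of-words vectors built from invented one-line "comments" per year, projecting them into two dimensions for visual comparison. All data and parameter choices are assumptions for demonstration.

    ```python
    # Hedged sketch of MDS on comment text; toy data, not the study's corpus.
    import numpy as np
    from collections import Counter

    # Hypothetical one-line summaries of comments per year (illustrative only)
    comments = {
        2016: "lectures clear but lab equipment limited",
        2017: "new lab sessions helped understanding composites",
        2018: "group project on composite design was engaging",
        2019: "online quizzes and project feedback improved learning",
    }

    # Bag-of-words vectors over a shared vocabulary
    vocab = sorted({w for text in comments.values() for w in text.split()})
    X = np.array([[Counter(t.split())[w] for w in vocab]
                  for t in comments.values()], dtype=float)

    # Cosine distance between year-level comment vectors
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    D = 1.0 - Xn @ Xn.T  # pairwise cosine distances

    # Classical MDS: double-centre the squared distances, then eigendecompose
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1]           # largest eigenvalues first
    top = order[:2]
    coords = vecs[:, top] * np.sqrt(np.maximum(vals[top], 0))

    for year, (x, y) in zip(comments, coords):
        print(f"{year}: ({x:+.3f}, {y:+.3f})")
    ```

    In practice, years whose comments share more vocabulary land closer together in the 2D plot, which is the kind of by-year clustering the abstract reports observing.
    
    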
    Conference Title
    Australasian Association for Engineering Education Conference 2019 (AAEE)
    Publisher URI
    https://search.informit.com.au/documentSummary;dn=066631523845775;res=IELENG
    Copyright Statement
    © The Author(s) 2020. The attached file is posted here with permission of the copyright owner(s) for your personal use only. No further distribution permitted. For information about this conference please refer to the conference’s website or contact the author(s).
    Subject
    Curriculum and pedagogy
    Publication URI
    http://hdl.handle.net/10072/396883
    Collection
    • Conference outputs
