Show simple item record

dc.contributor.author: Woodcock, S
dc.contributor.author: Howard, SJ
dc.contributor.author: Ehrich, J
dc.date.accessioned: 2019-12-02T01:32:13Z
dc.date.available: 2019-12-02T01:32:13Z
dc.date.issued: 2019
dc.identifier.issn: 2578-4218
dc.identifier.doi: 10.1037/spq0000340
dc.identifier.uri: http://hdl.handle.net/10072/389413
dc.description.abstract: Standardized testing is ubiquitous in educational assessment, but questions have been raised about the extent to which these test scores accurately reflect students' genuine knowledge and skills. To more rigorously investigate this issue, the current study employed a within-subject experimental design to examine item format effects on primary school students' standardized assessment results in literacy, reading comprehension, and numeracy. Eighty-nine Grade 3 students (ages 8-9 years) completed tests that varied only in item format: multiple choice; open-ended; error detection and correction; explain; and, for numeracy questions, low literacy. Analyses contrasted students' performance across these conditions, as well as item response theory-derived item difficulty and ability discrimination estimates. Findings revealed that difficulty increased and accuracy decreased from multiple-choice to open-response to error-correction and explain questions. However, the most difficult item formats tended to yield the greatest discrimination across student ability levels. Despite previous findings to the contrary, low-literacy numeracy questions did not improve student performance or reduce item difficulty. Overall, findings indicated the impact of differing methods of assessment on standardized test performance and highlighted the need for careful consideration of not only the content of assessments but also their approaches to assessment.
dc.description.peerreviewed: Yes
dc.language: English
dc.language.iso: eng
dc.publisher: American Psychological Association
dc.publisher.place: United States
dc.relation.ispartofjournal: School Psychology
dc.subject.fieldofresearch: Specialist Studies in Education
dc.subject.fieldofresearchcode: 1303
dc.title: A Within-Subject Experiment of Item Format Effects on Early Primary Students' Language, Reading, and Numeracy Assessment Results
dc.type: Journal article
dc.type.description: C1 - Articles
dcterms.bibliographicCitation: Woodcock, S; Howard, SJ; Ehrich, J, A Within-Subject Experiment of Item Format Effects on Early Primary Students' Language, Reading, and Numeracy Assessment Results, School Psychology, 2019
dc.date.updated: 2019-11-27T03:51:05Z
gro.description.notepublic: This publication has been entered into Griffith Research Online as an Advanced Online Version
gro.hasfulltext: No Full Text
gro.griffith.author: Woodcock, Stuart


Files in this item

There are no files associated with this item.

This item appears in the following Collection(s)

  • Journal articles
    Contains articles published by Griffith authors in scholarly journals.
