Interrater reliability of injury coding in the Queensland Trauma Registry
Author(s)
Neale, Rachel
Rokkas, Philippa
McClure, Roderick
Griffith University Author(s)
Year published
2003
Metadata
Abstract
Background: The capacity to accurately code injury event details, and to use the Abbreviated Injury Scale and Injury Severity Score to group injuries according to severity, underpins the audit and review activities of trauma registries throughout the world. In the interests of transparency and benchmarking between registries, we aimed to assess the interrater reliability of coding in the Queensland Trauma Registry.

Methods: One hundred and twenty injury cases were randomly selected from the Queensland Trauma Registry database, stratified by hospital, severity and the coder who originally coded the chart. Cases were then recoded by six coders employed by the Queensland Trauma Registry. Coding was carried out by all raters simultaneously and independently.

Results: Interrater agreement between coders was high for external cause, intent and place of injury, with kappa scores for all interrater pairs greater than 0.80, 0.58 and 0.44 respectively. Agreement between the six raters for Injury Severity Score was very high (intraclass correlation coefficient of 0.9).

Conclusions: The accuracy of coding in the Queensland Trauma Registry is sufficiently high to ensure that quality data are available for research, audit and review.
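The pairwise kappa statistic reported in the results measures agreement between two raters after correcting for agreement expected by chance. As a minimal illustration of how such a score is computed, the sketch below implements Cohen's kappa for two raters over categorical codes; the rater labels and example codes are invented for illustration and are not taken from the registry data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of exact agreement.
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal code frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    pe = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical external-cause codes assigned by two raters (illustrative only).
r1 = ["fall", "fall", "transport", "assault", "fall", "transport"]
r2 = ["fall", "fall", "transport", "fall", "fall", "transport"]
print(round(cohens_kappa(r1, r2), 2))  # 0.7
```

A study like this one computes such a score for every pair of raters on each variable (external cause, intent, place of injury) and then reports the range of pairwise values; for the continuous Injury Severity Score, an intraclass correlation coefficient is used instead.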
Journal Title
Emergency Medicine Australasia
Volume
15
Issue
1
Subject
Clinical sciences