Inter-Rater and Intrarater Reliability of Radiographs in the Diagnosis of Pediatric Scaphoid Fractures.
BACKGROUND: Pediatric scaphoid fractures can be challenging to diagnose on plain radiographs. Rates of missed scaphoid fractures on initial imaging can be as high as 30% to 37%, with overall sensitivity ranging from 21% to 97%. Few studies, however, have examined the reliability of radiographs in the diagnosis of scaphoid fractures, and none are specific to the pediatric population. Reliability, both between different specialists and for individual raters, may elucidate some of the diagnostic challenges.
METHODS: We conducted a 2-iteration survey of pediatric orthopedic surgeons, plastic surgeons, radiologists, and emergency physicians at a tertiary children's hospital. Participants were asked to assess 10 series of pediatric wrist radiographs for evidence of scaphoid fracture. Inter-rater and intrarater reliability were calculated using the intraclass correlation coefficient, model ICC(2,1).
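ICC(2,1) is the two-way random-effects, absolute-agreement, single-rater intraclass correlation model. As an illustration only (not the authors' analysis code; the function name and example data are hypothetical, with the data taken from the classic Shrout and Fleiss worked example), it can be computed from a subjects × raters matrix via the two-way ANOVA decomposition:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_subjects, k_raters) array of scores.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)  # per-subject means
    col_means = ratings.mean(axis=0)  # per-rater means

    # Two-way ANOVA sums of squares.
    ss_rows = k * ((row_means - grand_mean) ** 2).sum()
    ss_cols = n * ((col_means - grand_mean) ** 2).sum()
    ss_total = ((ratings - grand_mean) ** 2).sum()
    ss_error = ss_total - ss_rows - ss_cols

    # Mean squares.
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    # ICC(2,1) per Shrout & Fleiss (1979).
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Illustrative data: 6 subjects rated by 4 raters (Shrout & Fleiss example).
data = np.array([
    [9, 2, 5, 8],
    [6, 1, 3, 2],
    [8, 4, 6, 8],
    [7, 1, 2, 6],
    [10, 5, 6, 9],
    [6, 2, 4, 7],
])
print(round(icc_2_1(data), 2))  # ≈ 0.29
```

In practice the binary fracture/no-fracture judgments from each rater would form the matrix, with confidence intervals obtained from the same ANOVA components.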
RESULTS: Forty-two respondents were included in the first iteration analysis. Inter-rater reliability between surgeons (0.66; 95% confidence interval, 0.43-0.87), radiologists (0.76; 0.55-0.92), and emergency physicians (0.65; 0.46-0.86) was "good" to "excellent." Twenty-six respondents participated in the second iteration for intrarater reliability (0.73; 0.67-0.78). Sensitivity (0.75; 0.69-0.81) and specificity (0.78; 0.71-0.83) of wrist radiographs for diagnosing scaphoid fractures were consistent with results in other studies.
CONCLUSIONS: Both inter-rater and intrarater reliability for diagnosing pediatric scaphoid fractures on radiographs was good to excellent. No significant difference was found between specialists. Plain radiographs, while useful for obvious scaphoid fractures, cannot reliably rule out subtle fractures. Our study demonstrates that poor sensitivity stems from the test itself, not from rater variability.