Discriminating Features of Narrative Evaluations of Communication Skills During an OSCE.

CONSTRUCT: The authors examined the use of narrative comments to evaluate student communication skills in a standardized, summative assessment (Objective Structured Clinical Examination [OSCE]).

BACKGROUND: The use of narrative evaluations in workplace settings is gaining credibility as an assessment tool, but it is unknown how assessors convey judgments through narratives in high-stakes standardized assessments. The aim of this study was to explore the constructs (i.e., performance dimensions) and linguistic strategies that assessors use to distinguish between poor and good students when writing narrative assessment comments on communication skills during an OSCE.

APPROACH: Eighteen assessors from Qatar University were recruited to write narrative assessment comments on the communication skills of 14 students completing a summative OSCE. Assessors also scored overall communication performance on a 5-point scale. The narrative evaluations of the 2 top- and 2 bottom-performing students at each station (based on communication scores) were analyzed for the linguistic strategies and constructs that informed assessment decisions.
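The sampling rule described above (the 2 highest- and 2 lowest-scoring students per station) can be made concrete with a minimal sketch. The data frame, station labels, column names, and scores below are purely illustrative assumptions, not the study's data or analysis code:

```python
# Illustrative sketch only: hypothetical scores and column names.
import pandas as pd

# Hypothetical overall communication scores (5-point scale) per student and station.
scores = pd.DataFrame({
    "station": ["A", "A", "A", "A", "B", "B", "B", "B"],
    "student": ["s1", "s2", "s3", "s4", "s1", "s2", "s3", "s4"],
    "comm_score": [5, 4, 2, 1, 3, 5, 2, 4],
})

# For each station, keep the 2 highest- and 2 lowest-scoring students;
# their narrative evaluations would then be analyzed qualitatively.
top = scores.sort_values("comm_score", ascending=False).groupby("station").head(2)
bottom = scores.sort_values("comm_score").groupby("station").head(2)
selected = pd.concat([top.assign(group="good"), bottom.assign(group="poor")])
print(selected.sort_values(["station", "group"]))
```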

RESULTS: Seventy-two narrative evaluations containing 662 comments were analyzed. Most comments (77%) were written without politeness strategies, and a further 22% were hedged. Hedging was more common in comments about poor performers than about good performers (30% vs. 15%). The overarching constructs of confidence, adaptability, patient safety, and professionalism were the key dimensions characterizing the narrative evaluations of students' performance.

CONCLUSIONS: These results contribute to our understanding of the utility of narrative comments for the summative assessment of communication skills. When distinguishing between levels of student performance, assessors' comments could be characterized by the constructs of confidence, adaptability, patient safety, and professionalism. The findings support the notion that judgments are reached by clustering sets of behaviors into overarching, meaningful constructs rather than by focusing solely on discrete behaviors. They call for the development of better-anchored evaluation tools for communication assessment during OSCEs, constructively aligned with assessors' map of the reality of professional practice.
