Expanded Access to Video-Based Laparoscopic Skills Assessments: Ease, Reliability, and Accuracy.

OBJECTIVE: Video-based performance assessments provide essential feedback to surgical residents, but in-person and remote video-based assessment by trained proctors incurs significant cost. We aimed to determine the reliability, accuracy, and difficulty of untrained attending staff surgeon raters completing video-based assessments of a basic laparoscopic skill. Secondarily, we aimed to compare reliability and accuracy between 2 different types of assessment tools.

DESIGN: An anonymous survey was distributed electronically to surgical attendings via a national organizational listserv. Survey items included demographics, rating of video-based assessment experience (1 = have never completed video-based assessments, 5 = often complete video-based assessments), and rating of favorability toward video-based and in-person assessments (0 = not favorable, 100 = favorable). Participants watched 2 laparoscopic peg transfer performances, then rated each performance using an Objective Structured Assessment of Technical Skill (OSATS) form and the McGill Inanimate System for Training and Evaluation of Laparoscopic Skills (MISTELS). Participants then rated assessment completion ease (1 = Very Easy, 5 = Very Difficult).

SETTING: National survey of practicing surgeons.

PARTICIPANTS: Sixty-one surgery attendings with experience in laparoscopic surgery from 10 institutions participated as untrained raters. Six experienced laparoscopic skills proctors participated as expert raters.

RESULTS: Inter-rater reliability was substantial for both OSATS (κ = 0.75) and MISTELS (κ = 0.85). MISTELS accuracy was significantly higher than that of OSATS (κ: MISTELS = 0.18, 95% CI = [0.06, 0.29]; OSATS = 0.02, 95% CI = [-0.01, 0.04]). While participants were inexperienced with completing video-based assessments (median = 1/5), they perceived video-based assessments favorably (mean = 73.4/100) and felt assessment completion was "Easy" on average.

CONCLUSIONS: We demonstrate that faculty raters untrained in simulation-based assessments can successfully complete video-based assessments of basic laparoscopic skills with substantial inter-rater reliability without marked difficulty. These findings suggest an opportunity to increase access to feedback for trainees using video-based assessment of fundamental skills in laparoscopic surgery.
