COMPARATIVE STUDY
JOURNAL ARTICLE
RESEARCH SUPPORT, NON-U.S. GOV'T

Does feedback on examination performance help directors of internal medicine residencies evaluate the medical knowledge of their residents against national norms?

Academic Medicine, December 1994
BACKGROUND: As part of the admission process to the American Board of Internal Medicine's certifying examination in internal medicine, training program directors evaluate residents in several components of clinical competence, including medical knowledge. Research suggested that these ratings had different meanings across programs. A report comparing certifying examination performance and ratings of medical knowledge at the program and national levels was developed and sent to program directors after the 1988 through 1992 examinations. The present study investigated whether feedback helped program directors identify where their residents ranked nationally.

METHOD: Subjects were first-time takers of the 1986 through 1992 certifying examinations in internal medicine who took the examination in the year they completed training and who received ratings of 4 through 9 in medical knowledge. All subjects were from programs that contributed examinees in all seven study years and that received feedback from 1988 through 1991. Year-by-year distributions of program mean percentages of examinees receiving each rating of medical knowledge (4 through 9) were generated. Program means for equated examination scores and for ratings of medical knowledge were computed for each year, as were correlations between program mean scores and ratings.

RESULTS: The distributions of the ratings were stable across the study years. Mean scores declined while mean ratings were unchanged. At the same time, correlations between scores and ratings increased; the largest one-year change was from 1989 to 1990 (.49 to .57).

CONCLUSION: Since equated scores are directly comparable, declining mean scores but unchanged mean ratings suggest that the standards applied by program directors drifted downward. The increasing correlations suggest that program directors improved in their abilities to evaluate residents relative to a common standard. It is not clear what effect the feedback had on program directors' evaluations. It is encouraging, however, to see a higher level of agreement among program directors on the meaning of the ratings.
