Publication types: Journal Article; Meta-Analysis; Research Support, Non-U.S. Gov't; Research Support, U.S. Gov't, P.H.S.

Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes?

JAMA 1999 September 2
CONTEXT: Although physicians report spending a considerable amount of time in continuing medical education (CME) activities, studies have shown a sizable difference between real and ideal performance, suggesting a lack of effect of formal CME.

OBJECTIVE: To review, collate, and interpret the effect of formal CME interventions on physician performance and health care outcomes.

DATA SOURCES: Sources included searches of the complete Research and Development Resource Base in Continuing Medical Education and the Specialised Register of the Cochrane Effective Practice and Organisation of Care Group, supplemented by searches of MEDLINE from 1993 to January 1999.

STUDY SELECTION: Studies were included in the analyses if they were randomized controlled trials of formal didactic and/or interactive CME interventions (conferences, courses, rounds, meetings, symposia, lectures, and other formats) in which at least 50% of the participants were practicing physicians. Fourteen of 64 studies identified met these criteria and were included in the analyses. Articles were reviewed independently by 3 of the authors.

DATA EXTRACTION: Determinations were made about the nature of the CME intervention (didactic, interactive, or mixed), its occurrence as a 1-time or sequenced event, and other information about its educational content and format. Two of 3 reviewers independently applied all inclusion/exclusion criteria. Data were then subjected to meta-analytic techniques.

DATA SYNTHESIS: The 14 studies generated 17 interventions fitting our criteria. Nine generated positive changes in professional practice, and 3 of 4 interventions altered health care outcomes in 1 or more measures. In 7 studies, sufficient data were available for effect sizes to be calculated; overall, no significant effect of these educational methods was detected (standardized effect size, 0.34; 95% confidence interval [CI], -0.22 to 0.97). However, interactive and mixed educational sessions were associated with a significant effect on practice (standardized effect size, 0.67; 95% CI, 0.01-1.45).
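
As an illustrative aside, pooled estimates of the form reported above (a standardized effect size with a 95% confidence interval) are typically obtained by inverse-variance random-effects pooling. The short Python sketch below uses the DerSimonian-Laird method with hypothetical effect sizes and variances; it is not the authors' data or necessarily their exact procedure, only a sketch of how such a pooled estimate and interval are computed.

    # Illustrative sketch: DerSimonian-Laird random-effects pooling of
    # standardized effect sizes. The inputs below are hypothetical and
    # are NOT the data from the trials analyzed in this article.
    import math

    effects   = [0.10, 0.45, 0.80, 0.25, 0.60, -0.05, 0.30]  # hypothetical standardized effect sizes
    variances = [0.04, 0.09, 0.12, 0.05, 0.10,  0.06, 0.08]  # hypothetical sampling variances

    # Fixed-effect (inverse-variance) weights and heterogeneity statistic Q
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))

    # Between-study variance tau^2 (method-of-moments estimator)
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)

    # Random-effects weights, pooled effect, and 95% confidence interval
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    print(f"pooled effect = {pooled:.2f}, "
          f"95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f}")

Significance is read off the interval: a 95% CI that crosses zero (as in the overall estimate of 0.34, CI -0.22 to 0.97) indicates no significant effect, whereas one that excludes zero (as in the 0.67, CI 0.01-1.45 estimate for interactive and mixed sessions) does.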

CONCLUSIONS: Our data show some evidence that interactive CME sessions that enhance participant activity and provide the opportunity to practice skills can effect change in professional practice and, on occasion, health care outcomes. Based on a small number of well-conducted trials, didactic sessions do not appear to be effective in changing physician performance.
