COMPARATIVE STUDY
EVALUATION STUDIES
JOURNAL ARTICLE
Assessing and documenting general competencies in otolaryngology resident training programs.
Laryngoscope 2006 May
OBJECTIVES: The objectives of this study were to: 1) implement web-based instruments for assessing and documenting the general competencies of otolaryngology resident education, as outlined by the Accreditation Council for Graduate Medical Education (ACGME); and 2) examine the benefit and validity of this online system for measuring educational outcomes and for identifying insufficiencies in the training program as they occur.
METHODS: We developed an online assessment system for a surgical postgraduate education program and examined its feasibility, usability, and validity. Evaluations of behaviors, skills, and attitudes of 26 residents were completed online by faculty, peers, and nonphysician professionals during a 3-year period. Analyses included calculation and evaluation of total average performance scores of each resident by different evaluators. Evaluations were also compared with American Board of Otolaryngology-administered in-service examination (ISE) scores for each resident. Convergent validity was examined statistically by comparing ratings among the different evaluator types.
RESULTS: Questionnaires and software were found to be simple to use and efficient in collecting essential information. From July 2002 to June 2005, 1,336 evaluation forms were available for analysis. The average score assigned by faculty was 4.31, significantly lower than that by nonphysician professionals (4.66) and residents evaluating peers (4.63) (P < .001), whereas scores were similar between nonphysician professionals and resident peers. Average scores between faculty and nonphysician groups showed correlation in constructs of communication and relationship with patients, but not in those of professionalism and documentation. Correlation was observed in respect for patients but not in medical knowledge between faculty and resident peer groups. Resident ISE scores improved in the third year of the study and demonstrated high correlation with faculty perceptions of medical knowledge (r = 0.65, P = .007).
CONCLUSIONS: Compliance for completion of forms was 97%. The system facilitated the educational management of our training program along multiple dimensions. The small perceptual differences among a highly selected group of residents have made the unambiguous validation of the system challenging. The instruments and approach warrant further study. Improvements are likely best achieved in broad consultation among other otolaryngology programs.