Evaluation of clinical competence: the gap between expectation and performance.
To evaluate a 3-year experience with the Objective Structured Clinical Examinations (OSCEs) and to compare faculty expectations with resident performance.
Descriptive analysis of measures of resident performance.
Community-based pediatric residency program in Michigan.
One hundred twenty-six pediatric residents at all levels of training.
The three examinations each consisted of 36 to 42 five-minute stations, testing skills in physical examination, history taking, counseling, telephone management, and test interpretation.
A committee of faculty and chief residents predetermined minimum pass levels for each resident level.
Results were compared with other indices of resident performance.
There was evidence for content, construct, and concurrent validity, as well as a high degree of reliability.
However, 40% to 96% of residents scored below the minimum pass level set for their training level.
In each examination, third-year residents had the highest failure rates, yet they scored well on the American Board of Pediatrics in-training examination and on their monthly clinical evaluations.
Furthermore, for residents at all levels, the scores reflecting application of data were significantly lower than those assessing data gathering.
Pascal keywords (English): Occupational training, Examination, Higher education, Resident, Resident (student), Medicine, Evaluation, Performance evaluation, Methodology, Human
Record produced by:
Inist-CNRS - Institut de l'Information Scientifique et Technique
Shelf mark: 97-0378570
Inist code: 002B30A09. Created: 12/09/1997.