Medical education
-
Review / Meta-Analysis
Debriefing for technology-enhanced simulation: a systematic review and meta-analysis.
Debriefing is a common feature of technology-enhanced simulation (TES) education. However, evidence for its effectiveness remains unclear. We sought to characterise how debriefing is reported in the TES literature, identify debriefing features associated with improved outcomes, and evaluate the effectiveness of debriefing when combined with TES. ⋯ Limited evidence suggests that video-assisted debriefing yields outcomes similar to those of non-video-assisted debriefing, and other debriefing design features show mixed or non-significant results. Because debriefing characteristics are usually incompletely reported, future research should describe all key debriefing characteristics along with their associated descriptors.
-
Working effectively in interprofessional teams is a core competency for all health care professionals, yet there is a paucity of instruments with which to assess the associated skills. Published medical teamwork skills assessment tools focus primarily on high-acuity situations, such as cardiopulmonary arrests and crisis events in operating rooms, and may not generalise to lower-acuity environments, such as in-patient wards and out-patient clinics. ⋯ Our study delineates essential elements of teamwork in low-acuity settings, including desirable attributes of team members, thus laying the foundation for the development of an individual teamwork skills assessment tool.
-
The shift from a time-based to a competency-based framework in medical education has created a need for frequent formative assessments. Many educational programmes use some form of written progress test to identify areas of strength and weakness and to promote continuous improvement in their learners. However, the role of performance-based assessments, such as objective structured clinical examinations (OSCEs), in progress testing remains unclear. ⋯ Scores showed high reliability and differed significantly by year of training, which supports the validity of OSCE scores as markers of progress for learners at different levels of training. Future studies will focus on assessing individual progress on the OSCE over time.