TY - JOUR
T1 - Technology-enhanced simulation for health professions education
T2 - A systematic review and meta-analysis
AU - Cook, David A.
AU - Hatala, Rose
AU - Brydges, Ryan
AU - Zendejas, Benjamin
AU - Szostek, Jason H.
AU - Wang, Amy T.
AU - Erwin, Patricia J.
AU - Hamstra, Stanley J.
PY - 2011/9/7
Y1 - 2011/9/7
AB - Context: Although technology-enhanced simulation has widespread appeal, its effectiveness remains uncertain. A comprehensive synthesis of evidence may inform the use of simulation in health professions education. Objective: To summarize the outcomes of technology-enhanced simulation training for health professions learners in comparison with no intervention. Data Sources: Systematic search of MEDLINE, EMBASE, CINAHL, ERIC, PsycINFO, Scopus, key journals, and previous review bibliographies through May 2011. Study Selection: Original research in any language evaluating simulation compared with no intervention for training practicing and student physicians, nurses, dentists, and other health care professionals. Data Extraction: Reviewers working in duplicate evaluated quality and abstracted information on learners, instructional design (curricular integration, distributing training over multiple days, feedback, mastery learning, and repetitive practice), and outcomes. We coded skills (performance in a test setting) separately for time, process, and product measures, and similarly classified patient care behaviors. Data Synthesis: From a pool of 10 903 articles, we identified 609 eligible studies enrolling 35 226 trainees. Of these, 137 were randomized studies, 67 were nonrandomized studies with 2 or more groups, and 405 used a single-group pretest-posttest design. We pooled effect sizes using random effects. Heterogeneity was large (I² > 50%) in all main analyses. In comparison with no intervention, pooled effect sizes were 1.20 (95% CI, 1.04-1.35) for knowledge outcomes (n=118 studies), 1.14 (95% CI, 1.03-1.25) for time skills (n=210), 1.09 (95% CI, 1.03-1.16) for process skills (n=426), 1.18 (95% CI, 0.98-1.37) for product skills (n=54), 0.79 (95% CI, 0.47-1.10) for time behaviors (n=20), 0.81 (95% CI, 0.66-0.96) for other behaviors (n=50), and 0.50 (95% CI, 0.34-0.66) for direct effects on patients (n=32). Subgroup analyses revealed no consistent statistically significant interactions between simulation training and instructional design features or study quality. Conclusions: In comparison with no intervention, technology-enhanced simulation training in health professions education is consistently associated with large effects for outcomes of knowledge, skills, and behaviors and moderate effects for patient-related outcomes.
UR - http://www.scopus.com/inward/record.url?scp=80052444806&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=80052444806&partnerID=8YFLogxK
U2 - 10.1001/jama.2011.1234
DO - 10.1001/jama.2011.1234
M3 - Review article
C2 - 21900138
AN - SCOPUS:80052444806
SN - 0098-7484
VL - 306
SP - 978
EP - 988
JO - JAMA
JF - JAMA
IS - 9
ER -