Analysis of 9 Years of Simulation Center Best Practices from Accreditation Reviews Using Epistemic Network Analysis. (1090-004277) (Research Abstract Oral: QI)
Start time: Monday, January 25, 2021, 2:00 PM. End time: Monday, January 25, 2021, 3:00 PM. Session Type: Research Abstracts (Completed Studies)
Since 2005, the American College of Surgeons Accredited Educational Institutes has provided accreditation of surgically focused simulation centers, with the added benefit of identifying best practices, defined as “areas far exceeding the accreditation standards or novel methods of advancing high quality, impactful education.” In 2011, the organization began compiling all best practices from accreditation reviews for dissemination to members through journal articles, online videos, newsletters, and workshops. Although this is a rich source of data for sharing innovations, the authors wanted to explore the content and associations of best practices to understand the evolution of the field, the accreditation process, and organizational perspectives over the last decade.
The compiled list of 337 best practices identified from 247 site visits over nine years was analyzed and visualized using epistemic network analysis (ENA). Raw text for each best practice, along with the center name, date of accreditation review, and coded themes, was analyzed according to the research questions, and overall networks were compared for additional hypothesis generation and analysis. Data from all best practices over the nine-year period were compiled into a single network analysis for an overall comparison of associations. To evaluate changes in best-practice feedback from accreditation surveys over time, the data were also divided into three 3-year periods (2011-2013, 2014-2016, and 2017-2019), with means and standard deviations compared using graphical two-dimensional network visualizations.
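ENA builds its networks from co-occurrences of coded themes within each unit of analysis. As a minimal sketch of that first accumulation step, the tally below uses hypothetical theme codes and illustrative data only; the actual study used the ENA toolkit, which additionally normalizes these co-occurrence vectors and projects them into the two-dimensional space used for the network visualizations.

```python
from itertools import combinations
from collections import Counter

# Hypothetical coded best practices: each set lists the themes
# tagged for one best-practice statement (illustrative data only,
# not drawn from the actual accreditation dataset).
coded_items = [
    {"assessment", "curriculum_development"},
    {"faculty_development", "teaching_methods"},
    {"assessment", "research", "curriculum_development"},
    {"governance", "resources"},
]

def cooccurrence_counts(items):
    """Tally how often each pair of themes is coded together in one item.

    This is only the raw co-occurrence step of an ENA-style analysis;
    normalization and dimensional reduction are not shown here.
    """
    counts = Counter()
    for themes in items:
        # Sort so each unordered pair has one canonical key.
        for pair in combinations(sorted(themes), 2):
            counts[pair] += 1
    return counts

counts = cooccurrence_counts(coded_items)
for pair, n in counts.most_common(3):
    print(pair, n)
```

Splitting the coded items into per-period subsets (e.g. 2011-2013 vs. 2017-2019) and running the same tally on each is the analogue of the period-by-period comparison described above.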
The overall association network demonstrated the strongest associations among assessment, curriculum development, faculty development, research, and teaching methods. Detectable associations existed between all content areas, demonstrating the highly interrelated nature of simulation-based education. Modest associations most commonly involved curriculum evaluation, a pattern that persisted throughout the nine-year period. Across the three time periods, mean overall associations changed significantly, with both a shift in mean content and a broader standard deviation, despite an increase in the number of best practices in each period. Early associations were seen mainly among faculty development, curriculum development, collaboration, and teaching methods, but migrated to include all areas, with increases in assessment, research, resources, and governance demonstrating much broader associations.
Best practices evolved from an early focus on teaching methods, faculty development, and curriculum development to higher-level educational topics including assessment, research, resources, and overall center governance. The broader distribution of associations also demonstrates increasing complexity of the feedback, with more nuanced and interconnected statements reflecting higher-level feedback: explanations, contributing factors, impact on other areas and, in some cases, recommendations to share best practices outside the organization. Much like individual learner assessment, compiled longitudinal assessments can sometimes say as much about the assessors as they do about the learner in terms of both written and unwritten goals, competing perspectives, and organizational priorities. This nine-year database of simulation center feedback provides a novel perspective on an organization and the evolving field of simulation in healthcare professions education.