(Research Abstract Professor Rounds: Group 1) Use of a Diagnostic Feedback Approach for OSCE Assessment (1090-003892)
Start time: Tuesday, January 26, 2021, 2:00 PM End time: Tuesday, January 26, 2021, 3:00 PM Session Type: Research Abstracts (Completed Studies)
After a college or school of pharmacy adopts entrustable professional activities (EPAs), milestones can be created that span the program continuum to provide a learning roadmap for students.1 Because determining whether an EPA standard has been met at the appropriate level requires direct observation, objective structured clinical examinations (OSCEs) offer a way to observe a student’s progress toward meeting an EPA. Formative assessment has received sustained attention in educational research since the start of the 21st century. Rather than giving learners a total OSCE score (usually with a pass/fail decision) and little feedback, assessment with specific feedback can identify students’ strengths and weaknesses and advance their learning.2 Therefore, the research questions for our study were: (1) What is the validity evidence for the medication history and patient counseling OSCE rubrics? (2) How effectively does each rubric measure the tasks students should perform as part of an OSCE?
First-year pharmacy students from two cohort years completed a skills-based course designed to teach the patient care process through simulation. Formative assessment of medication history and patient counseling skills was provided throughout the semester. At the end of the semester, students participated in a four-station OSCE. Two stations used standardized patients (SPs): gathering a medication history (station 1) and counseling on a medication (station 2). Student performance at each station was assessed using a checklist developed to evaluate core pharmacy competencies. Descriptive statistics and item analysis were conducted for each checklist. An exploratory factor analysis was then conducted to extract significant factors representing specific domains of pharmacy students’ OSCE skills. Content validation of the extracted factors was conducted by one pharmacy faculty member and one psychometrician. This project was a UTHSC IRB-approved study.
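As an illustration of the analysis pipeline described above (not the study’s actual code), the sketch below shows how an exploratory factor analysis of a dichotomously scored checklist might be run in Python with the factor_analyzer package; the file name, DataFrame layout, number of factors, and choice of varimax rotation are assumptions for the example.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

# Hypothetical data: one row per student, one column per checklist item (0/1 scores).
items = pd.read_csv("medication_history_checklist.csv", index_col="student_id")

# Sampling adequacy and sphericity checks commonly examined before an EFA.
_, kmo_overall = calculate_kmo(items)
chi_square, p_value = calculate_bartlett_sphericity(items)

# Fit the exploratory factor analysis; factor count and rotation are illustrative choices.
fa = FactorAnalyzer(n_factors=4, rotation="varimax")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns)
ss_loadings, proportion_var, cumulative_var = fa.get_factor_variance()
print(loadings.round(2))
print(f"Cumulative variance explained: {cumulative_var[-1]:.0%}")
```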
We applied Messick’s unified validity framework to examine validity evidence. Internal consistency (Cronbach’s α) was .76 for medication history and .73 for patient counseling. Exploratory factor analysis was conducted for each checklist to seek internal structure evidence. A 4-factor model (21 items) was obtained for the medication history checklist, with 75% of variance explained. After discussion of the content and statistical results for each factor loading, factor 1 (F1) was named “Medication Review”; factor 2 (F2) was named “Medication Adherence”; factor 3 (F3) was named “Allergies and Adverse Drug Reactions”; and factor 4 (F4) was named “Medication Access”. A 3-factor model (22 items) was obtained for the patient counseling checklist, with 54% of variance explained. After the same discussion of content and statistical results, F1 was named “Medication Administration Technique”; F2 was named “3 Prime Questions”; and F3 was named “Medication Dosing”.
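For readers unfamiliar with the internal consistency statistic reported above, a minimal sketch of Cronbach’s alpha computed directly from item-level checklist scores follows; the function name and data layout are illustrative and not taken from the study.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a checklist: rows = students, columns = items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Example usage (hypothetical DataFrame of medication history item scores):
# alpha = cronbach_alpha(medication_history_items)  # reported as .76 in this study
```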
The use of exploratory factor analysis offered meaningful results for assessing pharmacy students’ subskills on their OSCEs. Pharmacy faculty can generate factor scores for each student and give specific, diagnostic feedback based on those scores. By identifying specific factors mapped to the current curriculum and EPAs, faculty can track pharmacy students’ progression toward meeting specific milestones in the curriculum. Evidence-based OSCE assessment with score-based, subskill-level feedback may help pharmacy faculty provide targeted instruction for student learning and track progress through the curriculum.
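As one illustration of how such score-based, subskill-level feedback might be operationalized (a sketch under stated assumptions, not the authors’ implementation), factor scores can be generated from a fitted model and flagged against a cohort-referenced cutoff; the file name, factor order, and cutoff below are all assumptions.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical checklist data: rows = students, columns = the medication history items.
items = pd.read_csv("medication_history_checklist.csv", index_col="student_id")

fa = FactorAnalyzer(n_factors=4, rotation="varimax")
fa.fit(items)

factor_names = ["Medication Review", "Medication Adherence",
                "Allergies and Adverse Drug Reactions", "Medication Access"]
# Regression-based factor scores, standardized around the cohort mean.
scores = pd.DataFrame(fa.transform(items), columns=factor_names, index=items.index)

def diagnostic_feedback(row: pd.Series, cutoff: float = -0.5) -> str:
    # Flag subskills scoring well below the cohort mean; the cutoff is illustrative.
    weak = [name for name, value in row.items() if value < cutoff]
    return "Review: " + ", ".join(weak) if weak else "Meets expectations on all subskills"

scores["feedback"] = scores.apply(diagnostic_feedback, axis=1)
print(scores.head())
```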