IMSH Delivers Sessions
“Huh… Well… You’ve Got Cancer”: Paralinguistic Cues as an Index of Poor Communication Skills While Breaking Bad News in Simulation Training for Medical Students. (1090-003809) (To be presented during the session “Research Abstract Oral: Debriefing, Communication & Teamwork”)
Start time: Friday, January 22, 2021, 8:00 AM
End time: Friday, January 22, 2021, 9:00 AM
Session Type: Research Abstracts (Completed Studies)
Cost: $0.00
Content Category: Researcher
Hypothesis:
Establishing a positive relationship with patients and mastering communication skills are of paramount importance for clinicians, for both clinical and ethical reasons (Kaplan et al., 1989). The quality of the doctor-patient relationship rests not only on the cognitive and verbal content of the interaction, but above all on the affective and non-verbal communication between the two (Robinson, 2006; Hart et al., 2016). Simulation with standardized patients can be a powerful method to address these issues, especially when the communication involves breaking bad news, e.g., the diagnosis of a chronic or life-threatening disease (Schildmann et al., 2012). We hypothesize that non-verbal cues of anxiety (namely, paralinguistic indexes of hesitation and emotional strain) correlate with the quality of medical students’ bad-news communication with standardized patients in simulation-based training.
Methods:
Three independent judges analyzed the paralinguistic cues (PLC) of stress and emotional tension in video recordings of simulation sessions in which 29 medical students of the University of Genoa had to break bad news to standardized patients (15 diagnoses of diabetes, 14 diagnoses of cancer). The PLC were tone, rhythm, silences, and hesitation markers (e.g., “erm”, “huh”) (Crane and Crane, 2010). Each video was also rated for communication quality with the modified Breaking Bad News Assessment Scale (mBAS) (Schildmann et al., 2012). This scale evaluates communication performance from “very good” (1) to “very poor” (5) across five facets: (mBAS_A) the quality of the introduction and greetings, (mBAS_B) the doctor’s communication while delivering the bad news, (mBAS_C) the doctor’s capacity to elicit concerns, (mBAS_D) the quality of the information provided to the patient, and (mBAS_E) the doctor’s capacity to explore the patient’s concerns.
Results:
The correlations among the three raters, for both the number of PLC and the mBAS ratings, were strong (r > .577, p < .001). The correlations between the mBAS ratings and the PLC of emotional tension were as follows: PLC-mBAS_A (r = -.19, p = .920); PLC-mBAS_B (r = .398, p = .033); PLC-mBAS_C (r = .580, p = .001); PLC-mBAS_D (r = .567, p = .001); PLC-mBAS_E (r = .074, p = .703); PLC-mBAS_Global (r = .430, p = .020). There was no significant difference in the mBAS ratings between the two types of diagnoses (cancer vs. diabetes), except for mBAS_D, i.e., whether the doctor gave information in a logical and ordered manner, checked that the patient understood it, and summarized it in a structured manner (t = -2.243, p = .033). No significant difference was found in the average number of PLC between the two types of diagnoses.
Conclusions:
As expected, we observed moderate to strong positive correlations between the occurrence of PLC of stress and emotional strain and the medical students’ difficulty in delivering bad news, especially concerning the doctor’s communication skills while disclosing the diagnosis, the doctor’s capacity to elicit the patient’s concerns, and the quality of the information provided to the patient. These results highlight the importance of non-verbal cues in evaluating the relational aspects of doctor-patient communication. Simulation-based training could strengthen the debriefing phase by treating these cues as indexes of stress in coping with difficult topics with patients. The analysis can be performed qualitatively during the debriefing while watching the recording, but greater benefit could come from automated analysis with specialized software for speech recognition and body-movement capture (Hart et al., 2016).
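As a minimal illustration of the automated analysis suggested above, the sketch below tallies hesitation markers in a session transcript. The marker list and the transcript are hypothetical examples, not drawn from the study, which coded cues such as "erm" and "huh" from video:

```python
import re

# Hypothetical hesitation markers of the kind coded in the study ("erm", "huh", ...).
MARKERS = ("erm", "huh", "uh", "um")

def count_hesitations(transcript: str) -> int:
    """Count whole-word, case-insensitive occurrences of hesitation markers."""
    pattern = r"\b(?:" + "|".join(MARKERS) + r")\b"
    return len(re.findall(pattern, transcript.lower()))

demo = "Huh... well... erm, the results, um, show... huh, it is cancer."
print(count_hesitations(demo))  # prints 4
```

A count like this, produced per session by a speech-recognition pipeline, could feed directly into the kind of PLC-mBAS correlation analysis reported in the Results.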