
Improved clinical communication OSCE scores after simulation-based training: Results of a comparative study

  • Alexandre Nuzzo ,

    Contributed equally to this work with: Alexandre Nuzzo, Alexy Tran-Dinh

    Roles Conceptualization, Data curation, Methodology, Writing – original draft

    al.nuzzo@gmail.com

    Affiliation Université de Paris, Faculté de Médecine Paris-Diderot, Paris, France

  • Alexy Tran-Dinh ,

    Contributed equally to this work with: Alexandre Nuzzo, Alexy Tran-Dinh

    Roles Conceptualization, Data curation, Investigation, Writing – original draft

    Affiliation Université de Paris, Faculté de Médecine Paris-Diderot, Paris, France

  • Marie Courbebaisse,

    Roles Conceptualization, Investigation, Methodology, Writing – review & editing

    Affiliation Université de Paris, Faculté de Médecine Paris-Descartes, Paris, France

  • Hugo Peyre,

    Roles Methodology, Supervision, Writing – review & editing

    Affiliation Université de Paris, Faculté de Médecine Paris-Diderot, Paris, France

  • Patrick Plaisance,

    Roles Methodology, Supervision, Validation, Writing – review & editing

    Affiliation Université de Paris, Faculté de Médecine Paris-Diderot, Paris, France

  • Alexandre Matet,

    Roles Formal analysis, Methodology, Validation, Writing – review & editing

    Affiliation Université de Paris, Faculté de Médecine Paris-Descartes, Paris, France

  • Brigitte Ranque,

    Roles Supervision, Validation, Writing – review & editing

    Affiliation Université de Paris, Faculté de Médecine Paris-Descartes, Paris, France

  • Albert Faye,

    Roles Methodology, Supervision, Validation, Writing – review & editing

    Affiliation Université de Paris, Faculté de Médecine Paris-Diderot, Paris, France

  • Victoire de Lastours,

    Roles Supervision, Validation, Visualization, Writing – original draft

    Affiliation Université de Paris, Faculté de Médecine Paris-Diderot, Paris, France

  • on behalf of the University of Paris OSCE and SBT groups

    Membership of the University of Paris OSCE and SBT groups is provided in the Acknowledgments.

Abstract

Objectives

Simulation-based training (SBT) is increasingly used to teach clinical patient-doctor communication skills (CS) to medical students. However, the long-lasting impact of this training has been poorly studied.

Methods

In this observational study, we included all fourth-year undergraduate medical students from a French medical school who undertook a CS objective structured clinical examination (OSCE) and answered a post-examination survey. OSCE scores and students’ feedback were compared according to whether or not students had received specific CS-SBT 12 months prior to the OSCE.

Results

A total of 173 students were included in the study. Of them, 97 (56%) had followed the CS-SBT before the OSCE. In the multivariate analysis, students who had undergone CS-SBT had significantly higher CS-OSCE scores than untrained students (mean 7.5/10 ±1.1 vs. 7.0/10 ±1.6, Cohen’s d = 0.4, p<0.01). They also tended to experience less nervousness during the OSCE (p = 0.09) and increased motivation to further train in “real-life” internships (p = 0.08). However, they reported an overall lack of CS in therapeutic patient education, delivering bad news, and disclosing medical errors.

Conclusions

Fourth-year medical students who had benefited from CS-SBT 12 months before the examination displayed higher CS-OSCE scores than their untrained counterparts.

Practice implications

These results support the early introduction of practical training to improve communication skills in undergraduate medical curricula. Studies are required to assess the sustainability of this improvement over time and its effect on further real doctor-patient communication.

Introduction

Effective doctor-patient communication is an essential physician’s skill. In recent years, the medical literature has addressed the importance of improving physician‐patient interaction [1]. Studies that have analyzed the role of physicians’ communication and their ability to adapt to the patient’s personality have shown improved patient satisfaction and treatment outcomes among physicians trained in communication skills (CS) [2].

Simulation-based training (SBT) could be a useful tool to improve CS [3–5]. Several studies have reported improvements in CS with SBT, particularly when taught early in undergraduate medical curricula [6–10]. However, evidence remains scarce about the potential lasting effect of SBT [4].

Developed in 1975 by Harden and colleagues, the objective structured clinical examination (OSCE) is now widely used to assess clinical competence in medical education and provides a unique tool for evaluating the impact of different training programs [11]. In particular, the OSCE is preferred over paper-and-pencil knowledge tests for assessing clinical CS through interactions with standardized patients.

In France, the implementation of CS simulation-based training (CS-SBT) is novel, and many students still do not have access to this training. In our university, CS-SBT has been offered since 2017 as an optional course for third-year students. The aim of this observational study was to compare undergraduate medical students’ OSCE scores according to whether or not they had received CS-SBT 12 months earlier.

Methods

Study design

We conducted an observational cohort study at the University of Paris medical school, which has two main campuses (Paris Nord and Paris Centre). All fourth-year undergraduate medical students who undertook an OSCE with standardized patients as part of their mandatory training in May 2019 and who answered the anonymous post-examination survey were included (flowchart, Fig 1). The study was approved by the education council and review board of the University of Paris, and informed consent was waived (data were analyzed anonymously).

Fig 1. Flowchart of the communication skills OSCE participants.

Abbreviations: OSCE: objective structured clinical examination.

https://doi.org/10.1371/journal.pone.0238542.g001

Clinical communication skills educational programs

During the 3rd year of their training, medical students had been offered optional courses on CS:

  • One of two SBT programs: the Paris Centre campus program, comprising 1.5 hours of role-playing and a 1-hour consultation with standardized simulated patients; or the Paris Nord campus program, delivered in a dedicated health simulation center as a 3-hour session comprising two 15-minute clinical scenarios with simulated patients, two observing students and one supervising teacher, with personalized and global feedback. Briefly, students played their own role of medical students meeting a patient for the first time and had to take the patient’s medical history. A senior medical doctor role-played the patient. Both tutors and the other students could evaluate the relational dynamics of the role-play by observing the interaction through a one-way screen. After the role-play, the content and manner of communicating were debriefed in a structured way.
  • Theoretical CS training through conventional lectures (2 hours).

No other CS educational programs took place between the CS-SBT followed during the 3rd year and the OSCE undertaken during the 4th year of medical studies, apart from the continuous bedside training received during internships at teaching hospitals.

OSCE setting and case scenario

The OSCE included four 7-minute stations. Two faculty examiners were assigned to each station: one role-played the standardized patient, and the other was the evaluator. All faculty examiners had undergone role-playing and evaluation training, which included 1) an on-site course about the OSCE process and their roles as examiner and simulated patient, and 2) an online video showing how the clinical scenario should be played, to standardize the patient’s part.

Briefly, the CS-OSCE station case scenario had students manage a patient’s stress on the day before a scheduled cholecystectomy (S1 and S2 Data). Eighty-eight faculty examiners (44 standardized patients and 44 evaluators) scored the medical students’ CS in this specific station on the doctor-patient relationship, following a standardized rating scale derived from the modified Calgary-Cambridge guide (Table 1) [12]. This rating scale had been pre-tested by medical teachers and residents of both campuses to ensure the feasibility, reliability and reproducibility of the station’s scoring system.

Table 1. Rating scale of the communication skills objective structured clinical examination station.

https://doi.org/10.1371/journal.pone.0238542.t001

Outcomes

The primary outcome was the CS-OSCE station score, measured with a standardized rating scale (total score = 10) derived from the modified Calgary-Cambridge guide (Table 1). Secondary outcomes were students’ feedback, collected through an anonymous online post-examination survey covering OSCE training, organization, expectations, and needs related to CS teaching. Feedback items were rated on a 5-point Likert scale. For each item, a mean score out of 10 points was calculated as follows: Strong Disagreement = 0, Disagreement = 2.5, Neutral = 5, Agreement = 7.5, Strong Agreement = 10 points. Students, CS-SBT teachers and OSCE examiners were unaware of the study and blinded to the outcomes measured.
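The Likert-to-points conversion above is a simple linear mapping; a minimal sketch in Python (the labels and function name are illustrative, not taken from the study's analysis code):

```python
# Hypothetical sketch of the Likert-to-score conversion described above;
# the point values mirror the paper's scheme, the labels are illustrative.
LIKERT_POINTS = {
    "Strong Disagreement": 0.0,
    "Disagreement": 2.5,
    "Neutral": 5.0,
    "Agreement": 7.5,
    "Strong Agreement": 10.0,
}

def mean_likert_score(responses):
    """Map Likert labels to points and return the mean score out of 10."""
    points = [LIKERT_POINTS[r] for r in responses]
    return sum(points) / len(points)

# e.g. two students answering "Agreement" and "Strong Agreement"
print(mean_likert_score(["Agreement", "Strong Agreement"]))  # 8.75
```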

Statistical analysis

Qualitative data were reported as numbers (percentages) of students and compared using either the Pearson χ2 test or Fisher’s exact test, depending on sample size. Continuous and ordinal data such as OSCE scores and Likert-scale ratings were reported as means (standard deviation) and compared with the Mann-Whitney U test. Missing data were neither analyzed nor imputed. Students’ CS-OSCE station scores and feedback (Likert-scale ratings) were compared according to whether or not they had received prior SBT. Adjustment was performed with an analysis of covariance (ANCOVA) model, with the CS-OSCE station score as the dependent variable, prior CS-SBT as the independent variable, and the following covariates: gender, medical school campus, and attendance at prior conventional lectures. All tests were two-sided, and a p-value < 0.05 was considered significant. All analyses were performed using the Statistical Package for the Social Sciences (SPSS) for Mac OS X (version 23.0, Chicago, Illinois, USA).
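The ANCOVA adjustment described above amounts to an ordinary least-squares fit of the OSCE score on the SBT indicator plus the covariates, where the SBT coefficient is the adjusted group effect. A minimal sketch on synthetic data (the study used SPSS; this NumPy version is illustrative only, and all variables below are simulated, not the study's data):

```python
import numpy as np

# Illustration on synthetic data of the covariate adjustment described above:
# the ANCOVA reduces to OLS of the OSCE score on the SBT indicator plus the
# covariates; the SBT coefficient (beta[1]) is the adjusted group effect.
rng = np.random.default_rng(0)
n = 173
sbt = rng.integers(0, 2, n)          # prior CS-SBT (1 = yes)
gender = rng.integers(0, 2, n)       # covariate: gender
campus = rng.integers(0, 2, n)       # covariate: medical school campus
lectures = rng.integers(0, 2, n)     # covariate: prior conventional lectures
score = 7.0 + 0.5 * sbt + rng.normal(0, 1.4, n)  # simulated SBT effect = +0.5

# design matrix: intercept, group indicator, covariates
X = np.column_stack([np.ones(n), sbt, gender, campus, lectures])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print(f"adjusted SBT effect: {beta[1]:.2f}")  # close to the simulated +0.5
```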

Results

Participants

In May 2019, a total of 775 fourth-year medical students (379 from the Paris Centre campus and 396 from the Paris Nord campus) took the same 7-minute OSCE station focused on doctor-patient communication skills. Of these, 173 (22.3%) answered the post-examination online survey (107 [61.8%] from the Paris Centre campus and 66 [38.2%] from the Paris Nord campus) and constituted the final cohort analyzed in this work (flowchart, Fig 1). The survey respondents’ average CS-OSCE station score was 7.3/10 ±1.4 (vs. 7.1/10 ±1.5 for the overall 775 students who took the OSCE, one-sample t-test p = 0.04). The baseline characteristics of the 173 participants are summarized in Table 2.

Table 2. Baseline characteristics of the 173 fourth-year medical students.

https://doi.org/10.1371/journal.pone.0238542.t002

Communication skills simulation-based training (CS-SBT)

Of the 173 students included, 97 (56%) had received specific CS-SBT 12 months before the OSCE (Table 2). The 76 students (44%) who had not followed any CS-SBT comprised 63 (36%) who had attended the conventional lectures only and 13 (8%) with no prior training at all (Table 2). Students who had undergone CS-SBT had significantly higher CS-OSCE station scores, independently of gender, medical school campus, and previous attendance at conventional lectures (mean 7.5/10 ±1.1 vs. 7.0/10 ±1.6 in untrained students, Cohen’s d = 0.4, adjusted p<0.01) (Fig 2). No significant difference was found between students with no prior training and those with conventional lectures only (p = 0.11), or between students with lectures plus CS-SBT and those with CS-SBT only (p = 0.54) (S3 Data). Furthermore, CS-SBT students tended to be less nervous during the OSCE (p = 0.09), requested more feedback from the examiners, and showed increased motivation to further train their CS in “real-life” internships (p = 0.08) (Table 3).

Fig 2. Communication skills (CS) OSCE station scores according to the CS-simulation-based training (SBT) status.

Abbreviations: OSCE: objective structured clinical examination; CS-SBT: communication skills simulation-based training. The horizontal black dotted line in each box represents the median, and the bottom and top of the boxes the 25th and 75th percentiles, respectively. I bars represent the upper adjacent value (75th percentile plus 1.5 times the interquartile range) and the lower adjacent value (25th percentile minus 1.5 times the interquartile range), and dots represent outliers. *Adjusted for gender, medical school campus and prior attendance at conventional lectures (ANCOVA model).

https://doi.org/10.1371/journal.pone.0238542.g002

Table 3. Students’ communication skills OSCE station scores and feedback according to their simulation-based training status.

https://doi.org/10.1371/journal.pone.0238542.t003

Students’ feedback on CS-OSCE and SBT

Students’ survey answers are summarized in Fig 3 and Table 3. The overall mean rating of the clinical scenario’s difficulty was 4.3/10 ±2.3, and of the nervousness experienced during the OSCE 5.4/10 ±2.8. The statements that SBT may be more efficient than conventional lectures for teaching CS, and that SBT motivates students to train more during their internships, reached mean scores of 8.7/10 ±1.9 and 8.2/10 ±2.1, respectively. The feeling that SBT could improve their global clinical skills as future residents or senior doctors achieved a mean rating of 8.4/10 ±2.2. The CS in which students felt confident included listening and avoiding jargon (rated 8.5/10 ±1.6 and 7.4/10 ±2.1, respectively), whereas therapeutic education, bad news delivery, and medical error disclosure were rated 5.9/10 ±2.3, 3.5/10 ±2.8 and 2.7/10 ±2.5, respectively. Correlation analyses showed a moderate inverse association between students’ nervousness and OSCE scores (Spearman r = -0.35, p<0.001). A positive correlation was found between the quality of the standardized patient’s acting and OSCE scores (r = 0.28, p<0.001) (correlation table, S4 Data).
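The inverse nervousness-score association above uses Spearman's rank correlation, i.e. Pearson's correlation computed on the ranks of each variable. A small illustrative sketch on synthetic data (ties are ignored for simplicity; the study's actual data are not reproduced here):

```python
import numpy as np

# Sketch of the rank correlation used above: Spearman's r is Pearson's
# correlation computed on ranks (ties ignored for simplicity).
def spearman_r(x, y):
    rank = lambda v: np.argsort(np.argsort(v))
    rx, ry = rank(np.asarray(x)), rank(np.asarray(y))
    return np.corrcoef(rx, ry)[0, 1]

# synthetic data with an inverse relation, mimicking nervousness vs. score
rng = np.random.default_rng(1)
nervousness = rng.uniform(0, 10, 100)
scores = 8.0 - 0.3 * nervousness + rng.normal(0, 1, 100)
print(f"Spearman r = {spearman_r(nervousness, scores):.2f}")  # negative
```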

Fig 3. Overview of the students’ answers (Likert-scale ratings) to the post-OSCE survey (staggered bars).

https://doi.org/10.1371/journal.pone.0238542.g003

Discussion

In this comparative study of 173 fourth-year medical students from a French medical school, CS-OSCE station scores were significantly higher in students who had participated in a simulation-based training (SBT) program one year before the OSCE. This study is one of the first to evaluate CS-SBT programs delivered early in the undergraduate medical curriculum and to assess their lasting impact 12 months later against a comparison group [4]. This training evaluation corresponds to Kirkpatrick level 2b, as the OSCE examined how well students could practically apply what they had learned from the training received 12 months earlier [13]. Mean OSCE station scores were 0.5 points higher (corresponding to a 1-point higher median, on a total of 10 points) in students who had undergone prior CS-SBT, independently of gender, medical school campus, and attendance at conventional lectures (adjusted p<0.01). CS-SBT students tended to experience less nervousness and showed increased motivation to receive feedback from examiners and to further train in “real-life” internships. Additionally, as the CS-SBT was offered one year before the OSCE, our results suggest that this teaching approach has a lasting benefit. On the other hand, we found no significant difference in OSCE scores between students who had received conventional lectures only and those who had not had any specific training, nor between students who had traditional lectures plus CS-SBT and those with CS-SBT only. The latter result would suggest a lower efficacy of conventional lectures on CS, although this comparison was underpowered. Nevertheless, students’ positive feedback called for more training, as they believed it could enhance the whole array of their clinical expertise, including their future skills as residents and senior doctors.
Notably, irrespective of whether they had received prior training, students rated poorly their competence in delivering therapeutic education, breaking bad news, and disclosing medical errors, consistent with previous studies [14]. Therefore, further efforts should be made to improve the teaching and assessment of these particularly challenging clinical skills, which represent paramount needs of patients and their families. Students often report that educational programs could be improved to enable them to better develop and maintain their communication skills [15, 16]. Several studies have supported the early introduction of practical training to improve communication skills in undergraduate medical students, which allows them to apply their theoretical knowledge continually and facilitates their learning [4, 15–17]. However, further studies are required to determine the long-term effects of such programs on later professional skills.

Based on our results, the positive effect of CS-SBT may be multimodal, impacting not only the quality of students’ CS but also their nervousness, motivation, commitment, and desire for self-improvement and self-training with real patients during their internships. Indeed, we found an upward trend in nervousness ratings among untrained students (no CS-SBT), and nervousness was inversely correlated with OSCE scores. Nervousness was also inversely correlated with the quality of the faculty examiner’s acting as the simulated patient, confirming that the authenticity and standardization of the role-play are critical to ensure equity among students and avoid OSCE score biases [18].

Some limitations should be discussed. First, although the OSCE was mandatory, CS-SBT was an optional course. Second, only 22% of OSCE participants answered the anonymous post-OSCE survey and were included in the analysis. This may have selected students more motivated by SBT, owing to specific training needs or a higher interest in CS, as the average OSCE station score of our sample of 173 students was slightly higher than that of the entire cohort of 775 students who took the OSCE. However, the groups with and without prior SBT were reasonably balanced, and the multivariate analysis controlled for most confounders. Moreover, students and examiners were unaware of the study at the time of the CS-SBT and OSCE and blinded to the outcomes measured, which may have limited evaluation biases. Besides, randomized controlled trials of CS training may be difficult to implement while ensuring equity across all students. Third, the difference in OSCE score between students with and without SBT was a mean of 0.5 out of 10 points (median 1 point). Although statistically significant, one may wonder whether this improvement would be noticeable from a patient’s perspective. Such a level of evidence is, however, rarely achieved in medical education studies, and the small but lasting improvement observed in our study remains promising. Finally, although our rating grid included some medical knowledge items, communication skills elements were predominant, and there is no consensus on which grid to use to assess the clinical communication of medical students in OSCEs [19]. Indeed, the validity of a student’s performance scores fundamentally depends on the quality of the rating scales in use [20]. Different kinds of rating scales have been developed to assess students’ communication skills performance during an OSCE [21–25]. For this OSCE station, our rating scale was derived from a modified Calgary-Cambridge guide [22].

In conclusion, our results support the implementation of a practical SBT program to teach CS to undergraduate students early in the medical curriculum and suggest a lasting benefit, as measured by a specific OSCE undertaken 12 months later. Further studies are required to investigate the effect of SBT on other challenging communication skills, such as breaking bad news, and the sustainability of this improvement in future medical practice.

Supporting information

S2 Data. Instruction for the standardized patient case scenario.

https://doi.org/10.1371/journal.pone.0238542.s002

(DOCX)

S3 Data. Communication skills OSCE station scores according to the training status.

https://doi.org/10.1371/journal.pone.0238542.s003

(DOCX)

S4 Data. Spearman correlations matrix for OSCE scores and covariates.

https://doi.org/10.1371/journal.pone.0238542.s004

(DOCX)

Acknowledgments

The University of Paris OSCE and SBT groups: Pierre-Alexis Geoffroy, Patrick Plaisance, Pierre-François Ceccaldi, Damien Roux, Isabelle Etienne, Julie Deylon, Henri Duboc, Marion Strullu, Lara Zafrani, Stefania Cuzzubbo, Nicolas Javaud, Emmanuel Weiss, Ines Zara, Laurence Amar, Alexandre Matet, Ludovic Fournel, Tristan Mirault, Mehdi Oualha, Marie-Aude Piot, François Gaillard, Jean-Benoit Arlet, Stéphanie Baron, Anne-Sophie Bats, Celine Buffel de Vaure, Caroline Charlier, Eve Jablon, Natacha Kadlub, Julien Leguen, David Lebeaux, Alexandre Malmartel, Tristan Mirault, Benjamin Planquette, Alexis Régent, Jean Laurent Thebault, Guillaume Turc, Gérard Friedlander, Cécile Badoual, Philippe Ruzsniewski.

References

  1. Francis V, Korsch BM, Morris MJ. Gaps in doctor-patient communication. Patients’ response to medical advice. N Engl J Med. 1969;280(10):535–40. Epub 1969/03/06. pmid:5764453.
  2. Clever SL, Jin L, Levinson W, Meltzer DO. Does doctor-patient communication affect patient satisfaction with hospital care? Results of an analysis with a novel instrumental variable. Health Serv Res. 2008;43(5 Pt 1):1505–19. Epub 2008/05/08. pmid:18459954.
  3. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: a best evidence practical guide. AMEE Guide No. 82. Med Teach. 2013;35(10):e1511–30. Epub 2013/08/15. pmid:23941678.
  4. Taylor S, Haywood M, Shulruf B. Comparison of effect between simulated patient clinical skill training and student role play on objective structured clinical examination performance outcomes for medical students in Australia. J Educ Eval Health Prof. 2019;16:3. Epub 2019/01/22. pmid:30665274.
  5. Blackmore A, Kasfiki E, Purva M. Simulation-based education to improve communication skills: a systematic review and identification of current best practice. BMJ Simulation and Technology Enhanced Learning. 2018;4(4):159–64.
  6. Colletti L, Gruppen L, Barclay M, Stern D. Teaching students to break bad news. Am J Surg. 2001;182(1):20–3. Epub 2001/09/05. pmid:11532409.
  7. Gartmeier M, Bauer J, Fischer M, Hoppe-Seyler T, Karsten G, Kiessling C, et al. Fostering professional communication skills of future physicians and teachers: Effects of e-learning with video cases and role-play. Instructional Science. 2015;43:443–62.
  8. Shaddeau A, Nimz A, Sheeder J, Tocce K. Effect of novel patient interaction on students’ performance of pregnancy options counseling. Med Educ Online. 2015;20:29401. Epub 2015/12/15. pmid:26654215.
  9. Solomon DJ, Laird-Fick HS, Keefe CW, Thompson ME, Noel MM. Using a formative simulated patient exercise for curriculum evaluation. BMC Med Educ. 2004;4:8. Epub 2004/05/14. pmid:15140263.
  10. Shapiro SM, Lancee WJ, Richards-Bentley CM. Evaluation of a communication skills program for first-year medical students at the University of Toronto. BMC Med Educ. 2009;9:11. Epub 2009/02/24. pmid:19232138.
  11. Sloan DA, Donnelly MB, Schwartz RW, Strodel WE. The Objective Structured Clinical Examination. The new gold standard for evaluating postgraduate clinical performance. Ann Surg. 1995;222(6):735–42. Epub 1995/12/01. pmid:8526580.
  12. Kurtz S, Silverman J, Benson J, Draper J. Marrying content and process in clinical method teaching: enhancing the Calgary-Cambridge guides. Acad Med. 2003;78(8):802–9. Epub 2003/08/14. pmid:12915371.
  13. Kirkpatrick DL, Kirkpatrick JD. Evaluating training programs: The four levels. San Francisco, CA: Berrett-Koehler Publishers. 1998.
  14. Eggly S, Afonso N, Rojas G, Baker M, Cardozo L, Robertson RS. An assessment of residents’ competence in the delivery of bad news to patients. Acad Med. 1997;72(5):397–9. Epub 1997/05/01. pmid:9159590.
  15. Kaplan-Liss E, Lantz-Gefroh V, Bass E, Killebrew D, Ponzio NM, Savi C, et al. Teaching Medical Students to Communicate With Empathy and Clarity Using Improvisation. Acad Med. 2018;93(3):440–3. Epub 2017/10/24. pmid:29059072.
  16. Kaltman S, Tankersley A. Teaching Motivational Interviewing to Medical Students: A Systematic Review. Acad Med. 2019. Epub 2019/10/03. pmid:31577585.
  17. Marcus E, White R, Rubin RH. Early clinical skills training. Acad Med. 1994;69(5):415. Epub 1994/05/01. pmid:8086058.
  18. Perera J, Perera J, Abdullah J, Lee N. Training simulated patients: evaluation of a training approach using self-assessment and peer/tutor feedback to improve performance. BMC Med Educ. 2009;9:37. Epub 2009/07/01. pmid:19563621.
  19. Comert M, Zill JM, Christalle E, Dirmaier J, Harter M, Scholl I. Assessing Communication Skills of Medical Students in Objective Structured Clinical Examinations (OSCE)—A Systematic Review of Rating Scales. PLoS One. 2016;11(3):e0152717. Epub 2016/04/01. pmid:27031506.
  20. Muller E, Zill JM, Dirmaier J, Harter M, Scholl I. Assessment of trust in physician: a systematic review of measures. PLoS One. 2014;9(9):e106844. Epub 2014/09/11. pmid:25208074.
  21. Newble D. Techniques for measuring clinical competence: objective structured clinical examinations. Med Educ. 2004;38(2):199–203. Epub 2004/02/12. pmid:14871390.
  22. Regehr G, MacRae H, Reznick RK, Szalay D. Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Acad Med. 1998;73(9):993–7. Epub 1998/10/06. pmid:9759104.
  23. Schirmer JM, Mauksch L, Lang F, Marvel MK, Zoppi K, Epstein RM, et al. Assessing communication competence: a review of current tools. Fam Med. 2005;37(3):184–92. Epub 2005/03/02. pmid:15739134.
  24. Makoul G. Essential elements of communication in medical encounters: the Kalamazoo consensus statement. Acad Med. 2001;76(4):390–3. Epub 2001/04/12. pmid:11299158.
  25. Simpson M, Buckman R, Stewart M, Maguire P, Lipkin M, Novack D, et al. Doctor-patient communication: the Toronto consensus statement. BMJ. 1991;303(6814):1385–7. Epub 1991/11/30. pmid:1760608.