
Relationships between objective structured clinical examination, computer-based testing, and clinical clerkship performance in Japanese medical students

Abstract

Background

It is unclear how comprehensive evaluations conducted prior to clinical clerkships (CC), such as the objective structured clinical examination (OSCE) and computer-based testing (CBT), reflect the performance of medical students in CC. Here we retrospectively analyzed correlations between OSCE and CBT scores and CC performance.

Methods

Ethical approval was obtained from our institutional review board. We analyzed correlations between OSCE and CBT scores and CC performance in 94 medical students who took the OSCE and CBT in 2017 when they were 4th year students, and who participated in the basic CC in 2018 when they were 5th year students.

Results

Total scores for the OSCE and CBT were significantly correlated with CC performance (P < 0.001 for each). More specifically, the medical interview and chest examination components of the OSCE were significantly correlated with CC performance (P = 0.001 for each), while the remaining five components of the OSCE were not.

Conclusion

Our findings suggest that the OSCE and CBT play important roles in predicting CC performance in the Japanese medical education context. Among the OSCE components, the medical interview and chest examination appear to be important for predicting CC performance.

Introduction

Seamless medical education, in which students gradually acquire professional abilities from their undergraduate years through postgraduate training, is important from the perspective of outcome-based education [1]. To achieve this goal, effective clinical training methods are needed that allow for a smooth transition from undergraduate medical education to basic skill acquisition as a postgraduate [2].

In Japan, clinical clerkships (CCs) form the basis of clinical training. In contrast to conventional clinical training, which involves only observation and no practice, CCs have students participate as members of a medical team and perform actual medical procedures and care. The range of medical procedures students are allowed to perform is defined, and procedures are carried out under the supervision of an instructing doctor [3]. This enables students to acquire practical clinical skills. In this regard, students are required to have a sense of identity and personal responsibility [4]. Clinical training throughout the various departments of a hospital is carried out in the form of CCs, which are driven by curricula for diagnosis and treatment [5].

Assuring basic clinical competency in medical students prior to CCs is essential from a medical safety perspective. To validate the basic clinical competency of medical students, the objective structured clinical examination (OSCE) and computer-based testing (CBT) were introduced in 2005 as standardized tests, organized by the Common Achievement Tests Organization (CATO) [http://www.cato.umin.jp/], to be taken by medical students. The OSCE evaluates clinical and communication skills using simulated patients and simulators [6][7], while the CBT evaluates basic clinical knowledge. The OSCE and CBT are mandatory for 4th year students in Japanese medical schools. Medical students are recognized by the Association of Japanese Medical Colleges as “student doctors” once they pass both examinations. After this certification, medical students can participate in CCs.

Although previous studies have examined CC performance using the mini-clinical evaluation exercise (mini-CEX) [8][9], the relationship between OSCE or CBT scores and CC performance has not been fully validated. Furthermore, no study has evaluated which skill components measured in the OSCE reflect student performance in CCs in the Japanese medical education context.

Thus, the present study aimed to retrospectively analyze correlations of the various components of the OSCE, and of the CBT, with CC performance in the Japanese medical education context.

Materials and methods

Ethical considerations

This study was approved by the Research Ethics Committee of Osaka Medical College (No. 2806). All data were fully anonymized before we accessed them, and our research ethics committee waived the requirement for informed consent.

Study population

As at other medical schools in Japan, medical students of Osaka Medical College take the OSCE and CBT in their 4th year and participate in CCs in their 5th and 6th years. Before taking the OSCE and CBT, students complete all basic and clinical medicine lectures as well as simulation-based skill training. Once they complete their CCs, medical students take the graduation examination. From 2020, a post-CC OSCE will be formally introduced by CATO to evaluate the clinical skills cultivated during CCs (Fig 1).

Fig 1. Schematic summarizing relationships between objective structured clinical examination (OSCE), computer-based testing (CBT), and clinical clerkships (CCs) at Osaka Medical College.

https://doi.org/10.1371/journal.pone.0230792.g001

Subjects of the present study were medical students of Osaka Medical College who were 4th year students in 2017 and 5th year students in 2018. We excluded students who did not advance to 5th year status in 2018.

Study measures

OSCE content and evaluation.

The OSCE evaluates various aspects of clinical competency. It includes the following seven themes: medical interview, head and neck examination, chest examination, abdominal examination, neurological examination, emergency response, and basic clinical technique. The OSCE is carried out in seven stations: one station is dedicated to a 10-min medical interview, and the remaining six stations to physical examinations and basic skills, at 5 min each. Within the allotted 5 or 10 minutes, students perform core clinical skills such as the medical interview and physical examination [10].

In the present study, student performance was evaluated by two examiners using a checklist. The score for each component of the OSCE was the average of the scores assigned by the two examiners. Examiners evaluate communication, medical safety, and consultation skills according to the checklist. The examiners underwent about three hours of evaluation training for standardization, based on a common text and video provided by CATO. Each student takes the examination at all seven skill stations, and the total score is calculated as the average of the seven component scores. The examination also strictly verifies the identity of students by checking their names and ID numbers. Examiners from other universities are routinely invited to validate internal evaluations during the OSCE.
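
As a concrete illustration of this scoring scheme, the following minimal sketch (in Python; all numbers are hypothetical, and this is not CATO's actual scoring software) averages the two examiners' checklist scores at each station and then averages the seven component scores into a total:

```python
from statistics import mean

# Hypothetical checklist scores (as percentages) from the two examiners at each station.
examiner_scores = {
    "medical_interview":        (86, 90),
    "head_and_neck":            (82, 84),
    "chest_examination":        (88, 86),
    "abdominal_examination":    (80, 78),
    "neurological_examination": (84, 88),
    "emergency_response":       (90, 92),
    "basic_clinical_technique": (85, 87),
}

# Each component score is the average of the two examiners' scores.
component_scores = {station: mean(pair) for station, pair in examiner_scores.items()}

# The total OSCE score is the average of the seven component scores.
total_score = mean(component_scores.values())
print(f"Total OSCE score: {total_score:.1f}")
```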

CBT content and evaluation.

The CBT consists of multiple-choice questions and extended matching items; students are required to answer 320 questions about basic clinical knowledge over the course of six hours. Evaluation was based on 240 questions whose difficulty and discriminating power had been validated using pooled data from past examinations. The remaining 80 questions were trial questions not used for evaluation. The questions are standardized by CATO. The CBT covers clinical disciplines and related basic medical knowledge. Scores for the CBT are machine-calculated, and the percentage of correct answers was used for evaluation.
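
This scoring rule can be illustrated with the short sketch below, which computes a percent-correct score over the 240 validated items while ignoring the 80 trial items. The item-level responses, and the assumption that the scored items come first, are placeholders for illustration only:

```python
import random

random.seed(0)

# Hypothetical per-item correctness for all 320 items; for illustration we assume
# the first 240 are the validated, scored items and the last 80 are unscored trials.
responses = [random.random() < 0.85 for _ in range(320)]
scored_flags = [True] * 240 + [False] * 80

def cbt_score(responses, scored_flags):
    """Percent correct over the 240 validated items only; trial items are ignored."""
    scored = [correct for correct, counts in zip(responses, scored_flags) if counts]
    return 100.0 * sum(scored) / len(scored)

print(f"CBT score: {cbt_score(responses, scored_flags):.1f}%")
```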

Clinical clerkship (CC) content and evaluation.

Medical students participate in a basic CC during their 5th year. The basic CC involves participation in the CCs of all clinical departments of the hospital over the course of 32 weeks, with each CC spanning about one to two weeks. Once students complete the basic CC, they select a discipline they wish to study for 14 weeks (Fig 1).

During the CCs, supervising (teaching) doctors of each department evaluate the clinical skills of students using an evaluation sheet based on the mini-CEX and Direct Observation of Procedural Skills (DOPS) [11][12]. The overall CC score consists of a 5-point evaluation sheet covering 16 items (80%), a subjective evaluation by the organizer of each department (10%), and a written report (10%) (Fig 2).

Fig 2. Contents of clinical clerkships (CC) evaluation in our college.

The overall CC score consists of a 5-point evaluation sheet covering 16 items (80%), a subjective evaluation by the organizer (10%), and a written report (10%).

https://doi.org/10.1371/journal.pone.0230792.g002

Scores for each CC are collected by the medical education center and are used to calculate an average score. In this study, we used the basic CC (32 weeks) score, since all medical students are required to participate in the basic CC.
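
The aggregation described above can be sketched as follows; the 80/10/10 weighting is taken from Fig 2, while the data structure and values are hypothetical rather than the medical education center's actual records:

```python
def department_cc_score(checklist_mean: float, organizer: float, report: float) -> float:
    """Weighted CC score for one department: 80% checklist, 10% organizer, 10% report."""
    return 0.8 * checklist_mean + 0.1 * organizer + 0.1 * report

# One entry per departmental rotation in the 32-week basic CC (values hypothetical).
rotations = [
    {"checklist_mean": 84.0, "organizer": 90.0, "report": 88.0},
    {"checklist_mean": 78.5, "organizer": 85.0, "report": 80.0},
    {"checklist_mean": 88.0, "organizer": 92.0, "report": 90.0},
]

# The departmental scores are averaged into a single basic CC score.
basic_cc_score = sum(department_cc_score(**r) for r in rotations) / len(rotations)
print(f"Basic CC score: {basic_cc_score:.1f}")
```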

Statistical analysis

Statistical analyses were performed using JMP® 11 (SAS Institute Inc., Cary, NC, USA). Correlations were assessed using Pearson’s correlation test. Data are presented as mean ± SD. P < 0.05 was considered statistically significant.
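
For readers who wish to reproduce this type of analysis outside JMP, the sketch below runs the same Pearson correlation test in Python using SciPy; the arrays are randomly generated placeholders, not the study data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Placeholder scores for n = 94 students; not the study's actual data.
osce_total = rng.normal(loc=85, scale=3, size=94)
cc_score = 0.5 * osce_total + rng.normal(loc=42, scale=2, size=94)

# Pearson's correlation coefficient and two-sided P value.
r, p = stats.pearsonr(osce_total, cc_score)
print(f"Pearson r = {r:.2f}, P = {p:.3g}")  # P < 0.05 considered significant
```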

Results

We analyzed the scores of 94 medical students who took the OSCE and CBT in 2017 and the basic CC in 2018. As shown in Table 1, medical students generally achieved scores ranging from 80% to 90% on the OSCE, CBT, and CC.

Table 1. Scores for objective structured clinical examination (OSCE), computer-based testing (CBT), and clinical clerkship (CC).

https://doi.org/10.1371/journal.pone.0230792.t001

Correlations of OSCE and CBT scores with CC scores

Correlations of OSCE and CBT scores with CC scores are shown in Table 2. Total scores for the OSCE and CBT were significantly correlated with CC scores (P < 0.001 for each). When analyzed by OSCE component, medical interview and chest examination scores were significantly correlated with CC scores (P = 0.001 for each), while the remaining five component scores were not.

Table 2. Correlations of objective structured clinical examination (OSCE) and computer-based testing scores with clinical clerkship (CC) scores.

https://doi.org/10.1371/journal.pone.0230792.t002

Correlations between OSCE and CBT scores

We evaluated correlations between OSCE and CBT scores and found no significant correlations between them (Table 3). There were also no significant correlations between individual OSCE component scores and the CBT score (Table 3).

Table 3. Correlations between scores of objective structured clinical examination (OSCE) components and computer-based testing (CBT).

https://doi.org/10.1371/journal.pone.0230792.t003

Discussion

Our study showed that total scores for the OSCE and CBT were significantly correlated with CC scores. This suggests that the OSCE and CBT can be effective indicators of CC performance in Japanese medical education. In component-level analyses, medical interview and chest examination scores were significantly correlated with CC scores.

Medical interviews and physical examinations are essential skills, and correctly evaluating the information they yield is important for diagnosis and treatment during CCs [13][14]. In clinical settings, it is not rare to overlook physical findings or perform evaluations incorrectly. Incorrect assessment of physical findings can lead to errors in diagnosis, which may result in adverse outcomes for patients [15][16]. Accordingly, from the perspectives of clinical competency and outcome-based education, assuring the quality of both the technical and non-technical skills of medical students before CCs is essential.

In the present study, total scores for the OSCE and CBT showed strong and significant correlations with CC performance, as reflected in CC scores. These data validate the OSCE and CBT as measures to assure competency prior to participation in CCs. Interestingly, no component of the OSCE was significantly correlated with the total CBT score. This suggests that the competencies evaluated by the OSCE and CBT differ, and that a combination of both could provide a better sense of the competency of medical students prior to CCs.

When OSCE components were considered individually, medical interview and chest examination components were significantly correlated with CC performance, while the remaining five components were not. One potential explanation for this is that, of the seven components of the OSCE, medical interviews and chest examinations are performed the most often during CCs. Thus, focusing training on these skills may contribute to better CC performance [17][18].

In contrast, components other than the medical interview and chest examination were not significantly correlated with CC performance. One possible reason for this is the lack of opportunities to use such skills. For example, medical students are not permitted to perform emergency responses such as advanced life support on their own [19][20]. Because medical students must still acquire these basic clinical competencies after certification as medical doctors, educational methods that compensate for this gap are warranted [21].

To overcome this problem, we believe that simulation-based education (SBE) can be a powerful solution to compensate for the lack of opportunities to exercise these skills [22]. As SBE methods have been developed and are widely used for acquiring both technical and non-technical skills in medical education [23][24], a combination of SBE and CC could maximize the competency of medical students. For example, medical students can re-enact on a simulator a resuscitation they observed in the emergency ward. Such a combination could enhance the CC performance of medical students.

Medical educators are expected to improve CC programs by incorporating SBE methods to compensate for infrequently practiced clinical skills [25]. They can also utilize SBE for formative assessment to improve teaching and learning in clinical settings.

This study has a number of limitations worth noting. First, as data were obtained from a single institution, our findings may not be generalizable to other medical schools [26][27]. However, our results likely apply to medical schools in Japan, given the core medical curriculum adopted throughout the country. Second, CBT scores were generally high with little variation, which might have biased the correlation analysis. Third, we excluded students who did not advance to the 5th year in order to perform the correlation analysis more accurately, as we considered that the content of the CC may differ from year to year; however, this exclusion may have introduced bias. Fourth, we evaluated the total OSCE score as the average of the seven stations, which may lack statistical justification. Lastly, we evaluated only the overall score for the CC. Correlations between the OSCE and CBT and the various aspects of the CC may provide further insight into how these test instruments relate to actual medical practice by medical students. In this regard, it will be interesting to evaluate the relationship between CC performance and post-CC OSCE scores once the post-CC OSCE is implemented in the 2020 curriculum year.

In conclusion, our findings suggest that the OSCE and CBT play important roles in predicting CC performance. Among the OSCE components, the medical interview and chest examination were particularly relevant for predicting CC performance.

References

  1. Ellaway RH, Graves L, Cummings BA (2016) Dimensions of integration, continuity and longitudinality in clinical clerkships. Med Educ 50:912–921. pmid:27562891
  2. Hudson JN, Poncelet AN, Weston KM, Bushnell JA, A Farmer E (2017) Longitudinal integrated clerkships. Med Teach 39:7–13. pmid:27832713
  3. Takahashi N, Aomatsu M, Saiki T, Otani T, Ban N (2018) Listen to the outpatient: qualitative explanatory study on medical students' recognition of outpatients' narratives in combined ambulatory clerkship and peer role-play. BMC Med Educ 18:229. pmid:30285712
  4. Murata K, Sakuma M, Seki S, Morimoto T (2014) Public attitudes toward practice by medical students: a nationwide survey in Japan. Teach Learn Med 26:335–343. pmid:25318027
  5. Okayama M, Kajii E (2011) Does community-based education increase students' motivation to practice community health care?—a cross sectional study. BMC Med Educ 11:19. pmid:21569332
  6. Tagawa M, Imanaka H (2010) Reflection and self-directed and group learning improve OSCE scores. Clin Teach 7:266–270. pmid:21134204
  7. Ishikawa H, Hashimoto H, Kinoshita M, Fujimori S, Shimizu T, Yano E (2006) Evaluating medical students' non-verbal communication during the objective structured clinical examination. Med Educ 40:1180–1187. pmid:17118111
  8. Humphrey-Murto S, Côté M, Pugh D, Wood TJ (2018) Assessing the Validity of a Multidisciplinary Mini-Clinical Evaluation Exercise. Teach Learn Med 30:152–161. pmid:29240463
  9. Rogausch A, Beyeler C, Montagne S, Jucker-Kupper P, Berendonk C, Huwendiek S, et al. (2015) The influence of students' prior clinical skills and context characteristics on mini-CEX scores in clerkships–a multilevel analysis. BMC Med Educ 15:208.
  10. Tagawa M, Imanaka H (2010) Reflection and self-directed and group learning improve OSCE scores. Clin Teach 7:266–270. pmid:21134204
  11. Lörwald AC, Lahner FM, Mooser B, Perrig M, Widmer MK, Greif R, et al. (2019) Influences on the implementation of Mini-CEX and DOPS for postgraduate medical trainees' learning: A grounded theory study. Med Teach 41:448–456. pmid:30369283
  12. Lörwald AC, Lahner FM, Nouns ZM, Berendonk C, Norcini J, Greif R, et al. (2018) The educational impact of Mini-Clinical Evaluation Exercise (Mini-CEX) and Direct Observation of Procedural Skills (DOPS) and its association with implementation: A systematic review and meta-analysis. PLoS One 13:e0198009. pmid:29864130
  13. Bowen JL (2006) Educational strategies to promote clinical diagnostic reasoning. N Engl J Med 355:2217–2225. pmid:17124019
  14. Nomura S, Tanigawa N, Kinoshita Y, Tomoda K (2015) Trialing a new clinical clerkship record in Japanese clinical training. Adv Med Educ Pract 6:563–565. pmid:26451128
  15. Shikino K, Ikusaka M, Ohira Y, Miyahara M, Suzuki S, Hirukawa M, et al. (2015) Influence of predicting the diagnosis from history on the accuracy of physical examination. Adv Med Educ Pract 6:143–148. pmid:25759604
  16. Kirkman MA, Sevdalis N, Arora S, Baker P, Vincent C, Ahmed M (2015) The outcomes of recent patient safety education interventions for trainee physicians and medical students: a systematic review. BMJ Open 5:e007705. pmid:25995240
  17. Kassam A, Cowan M, Donnon T (2016) An objective structured clinical exam to measure intrinsic CanMEDS roles. Med Educ Online 21:31085. pmid:27637267
  18. Eftekhar H, Labaf A, Anvari P, Jamali A, Sheybaee-Moghaddam F (2012) Association of the pre-internship objective structured clinical examination in final year medical students with comprehensive written examinations. Med Educ Online 17:15958
  19. Couto LB, Durand MT, Wolff ACD, Restini CBA, Faria M Jr, Romão GS, et al. (2019) Formative assessment scores in tutorial sessions correlates with OSCE and progress testing scores in a PBL medical curriculum. Med Educ Online 24:1560862. pmid:31023185
  20. Dong T, Saguil A, Artino AR Jr, Gilliland WR, Waechter DM, Lopreaito J, et al. (2012) Relationship between OSCE scores and other typical medical school performance indicators: a 5-year cohort study. Mil Med 177:44–46. pmid:23029860
  21. Mukohara K, Kitamura K, Wakabayashi H, Abe K, Sato J, Ban N (2004) Evaluation of a communication skills seminar for students in a Japanese medical school: a non-randomized controlled study. BMC Med Educ 4:24. pmid:15550166
  22. Hough J, Levan D, Steele M, Kelly K, Dalton M (2019) Simulation-based education improves student self-efficacy in physiotherapy assessment and management of paediatric patients. BMC Med Educ 16:463.
  23. Offiah G, Ekpotu LP, Murphy S, Kane D, Gordon A, O'Sullivan M, et al. (2019) Evaluation of medical student retention of clinical skills following simulation training. BMC Med Educ 19:263. pmid:31311546
  24. Dong T, Zahn C, Saguil A, Swygert KA, Yoon M, Servey J, et al. (2017) The Associations Between Clerkship Objective Structured Clinical Examination (OSCE) Grades and Subsequent Performance. Teach Learn Med 29:280–285. pmid:28632015
  25. Schiekirka S, Reinhardt D, Heim S, Fabry G, Pukrop T, Anders S, et al. (2012) Student perceptions of evaluation in undergraduate medical education: a qualitative study from one medical school. BMC Med Educ 12:45. pmid:22726271
  26. Näpänkangas R, Karaharju-Suvanto T, Pyörälä E, Harila V, Ollila P, Lähdesmäki R, et al. (2016) Can the results of the OSCE predict the results of clinical assessment in dental education? Eur J Dent Educ 20:3–8. pmid:25470560
  27. Sahu PK, Chattu VK, Rewatkar A, Sakhamuri S (2019) Best practices to impart clinical skills during preclinical years of medical curriculum. J Educ Health Promot 8:57. pmid:31008124