
Tablet computer enhanced training improves internal medicine exam performance

Abstract

Background

Traditional teaching concepts in medical education do not take full advantage of current information technology. We aimed to objectively determine the impact of Tablet PC enhanced training on learning experience and MKSAP® (Medical Knowledge Self-Assessment Program) exam performance.

Methods

In this single center, prospective, controlled study, final year medical students and medical residents doing an inpatient service rotation were alternatingly assigned to either the active test group (Tablet PC with a custom multimedia education software package) or the traditional education (control) group. All completed an extensive questionnaire to collect their socio-demographic data and evaluate their educational status, computer affinity and skills, problem solving, eLearning knowledge and self-rated medical knowledge. Both groups were MKSAP® tested at the beginning and the end of their rotation. The MKSAP® score at the final exam was the primary endpoint.

Results

Data of 55 participants (tablet n = 24, controls n = 31; 36.4% male; median age 28 years; 65.5% students) were evaluable. The mean MKSAP® score improved in the Tablet PC group (score Δ +8, SD: 11) but not in the control group (score Δ −7, SD: 11). After adjustment for baseline score and confounders, the Tablet PC group showed on average 11% better MKSAP® test results than the control group (p<0.001). The most commonly used resources for medical problem solving were journal articles looked up on PubMed or Google®, and books.

Conclusions

Our study provides evidence that tablet computer based integrated training and clinical practice enhances medical education and exam performance. Larger, multicenter trials are required to independently validate our data. Residency and fellowship directors are encouraged to consider adding portable computer devices and multimedia content and introducing blended learning to their respective training programs.

Introduction

Traditional teaching concepts in medical education do not take full advantage of information technology, despite the fact that modern clinical medicine and biomedical science are packed with digital media resources, ranging from multidimensional virtual imaging data of the human body to complex video animations of human physiology.

Medical education ideally happens at the bedside[1], not in lecture halls. Although the use of wireless enabled mobile communication devices[2], including (tablet) computers, personal digital assistants[2] and smartphones[3], which can help incorporate, process and deliver the ever increasing rich media and information content at the point of care in real time, is substantially increasing[4–6], scientific data on their efficacy in medical education and clinical training are limited.

Here we present prospective data demonstrating that Tablet PC enabled eLearning significantly improves exam performance and holds promise for future medical trainees.

Methods

This single center, prospective, controlled study was conducted on an internal medicine ward at Charité Medical Center’s Virchow Hospital, Medical School of the Humboldt-University of Berlin. For the purpose of the study the ward was equipped with three wireless access points (Enterasys, Salem, NH, USA) linking it to the hospital intranet and the internet, as well as a Net Education Center (Hewlett Packard, Palo Alto, CA, USA), a cart housing and charging the Tablet PCs.

Active participants and controls

Participation was voluntary and in accordance with both institutional policies and all applicable laws including data privacy legislation. The institutional review board of Charité University Hospital in Berlin confirmed the information provided to participants was in line with the local ethical requirements (No.: EA1/386/16). Eligible participants included consenting medical students in their final year of medical school (acting interns) and postgraduate year 1 to 3 residents doing a rotation on the selected internal medicine ward as a mandatory part of their training curriculum. All participants signed a contract consenting to and detailing the conditions of the study. Timing and duration of their rotation were predetermined by medical school, hospital and physician board rules and regulations. Final year medical students (acting interns) did four month rotations, while residents did six month rotations. The consecutive cohort of all participants was alternatingly assigned to either the active test (tablet) or the traditional education (control) group.

The active test group was profiled and examined (see below) and received a Tablet PC to keep for the entire duration of their rotation and to use with the multimedia training and education package (see below) both on and off the medical center campus (i.e. at home and while commuting to work).

The control group did not receive a Tablet PC; they were profiled and examined (see below) and had access to all conventional education and training resources (i.e. library, books, journals) on campus.

Objectives and outcomes

The primary objective was to test the hypothesis that Tablet PC enhanced education significantly impacts participants’ performance in medical board exams. The final MKSAP® exam score was the primary endpoint. Moreover, we aimed to identify participants’ characteristics impacting the final exam score.

Participant profiling

Both control and tablet group participants had to complete an extensive questionnaire to collect their demographic data and evaluate their educational status, computer affinity and skills, problem solving strategy, eLearning knowledge and self-estimated medical knowledge, in order to assess potential confounding factors on the overall outcome.

Tablet computers

The HP Compaq tc4200[7] (Hewlett Packard, Palo Alto, CA, USA) and IBM ThinkPad X41[8] (IBM, Armonk, NY, USA) are ultraportable notebooks that also convert into tablets. They incorporate technology to provide wireless connectivity and improved battery performance. Tablet PCs are fully functional personal computers delivering performance and compatibility in an innovative form factor. They offer wide viewing-angle displays on protective glass and feature a digital pen with eraser function that writes like an actual pen.

Custom software package and programming

We developed a custom, mostly open source (Open Source Initiative, East Palo Alto, CA, USA) software package and named it Mobile Medical Educator (MME). A local Apache (Apache Software Foundation, Delaware, USA) server connected a MySQL database (Oracle, San Francisco, CA, USA) with local media content, applications and a graphical user interface (GUI). The GUI was programmed using Java (Oracle, San Francisco, CA, USA), CSS and HTML to provide kiosk-mode web browser (Firefox, Mozilla Foundation, Mountain View, CA, USA) access for participants to interact with the Tablet PCs.

Through this central interface all participants could register, complete their profile questionnaire, take the initial and final knowledge assessment exams, access a variety of multimedia training and education resources as well as the medical center’s electronic patient care systems.
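
For illustration, the sketch below shows the kind of server-side bookkeeping such a central interface implies. It is not the actual MME code (which was Java-based against a MySQL database); the schema, table names and functions are hypothetical, and Python’s standard-library sqlite3 module stands in for the MySQL backend.

```python
import sqlite3

# Hypothetical stand-in for the MME schema; names are illustrative only,
# not taken from the actual Mobile Medical Educator implementation.
conn = sqlite3.connect("mme_demo.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS participant (
    id      INTEGER PRIMARY KEY,
    grp     TEXT CHECK (grp IN ('tablet', 'control')),
    profile TEXT            -- questionnaire answers, e.g. serialized JSON
);
CREATE TABLE IF NOT EXISTS exam_answer (
    participant_id INTEGER REFERENCES participant(id),
    exam           TEXT CHECK (exam IN ('initial', 'final')),
    question_id    INTEGER,
    correct        INTEGER  -- 1 = correct, 0 = incorrect
);
""")

def register(conn, grp, profile):
    """Register a participant, as done through the kiosk-mode GUI."""
    cur = conn.execute("INSERT INTO participant (grp, profile) VALUES (?, ?)",
                       (grp, profile))
    return cur.lastrowid

def score(conn, participant_id, exam):
    """Percent of correct answers for one exam sitting."""
    row = conn.execute(
        "SELECT 100.0 * AVG(correct) FROM exam_answer "
        "WHERE participant_id = ? AND exam = ?",
        (participant_id, exam)).fetchone()
    return row[0]
```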

The American College of Physicians℠ (Philadelphia, PA, USA) kindly provided us with a special electronic version (in XML format) of their MKSAP® 14 software that allowed integration into our database system and parsing with a random generator.
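
The exact schema of that XML export is not public, so the following Python sketch only illustrates how a question pool of this kind could be parsed into database-ready records; the element and attribute names are invented for the example.

```python
import xml.etree.ElementTree as ET

# Hypothetical MKSAP-like question pool; the real MKSAP® 14 XML schema
# provided by the ACP is not public, so these element names are invented.
SAMPLE = """
<pool>
  <question id="1" category="Cardiovascular Medicine">
    <stem>A 54-year-old man presents with ...</stem>
    <choice key="A">Aspirin</choice>
    <choice key="B" correct="true">Heparin</choice>
  </question>
</pool>
"""

def parse_pool(xml_text):
    """Yield (id, category, stem, choices) tuples ready for database insertion."""
    root = ET.fromstring(xml_text)
    for q in root.iter("question"):
        choices = {c.get("key"): (c.text, c.get("correct") == "true")
                   for c in q.iter("choice")}
        yield int(q.get("id")), q.get("category"), q.findtext("stem"), choices

for qid, category, stem, choices in parse_pool(SAMPLE):
    print(qid, category, stem[:30], sorted(choices))
```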

The multimedia package included access to the institutional collaborative online course management systems (Moodle and Blackboard), eBooks (Springer Nature Science and Business Media, New York, NY, USA), eJournals, educational slide kits, podcasts, videos, animations, images from major biomedical and scientific publishers or professional societies, as well as Twitter feeds and selected hyperlinks to biomedical and scientific web resources.

Initial and final knowledge assessment

To determine the impact of Tablet PC based education we decided to objectively assess and compare all participants’ knowledge in internal medicine at two time points. Importantly, none of the participants had access to the exam questions used in this study, were able to practice them, or underwent any kind of special knowledge exam preparation.

New medical knowledge recognition[9] and concept identification[10] can be computationally evaluated with the American College of Physicians℠ (ACP) Medical Knowledge Self-Assessment Program (MKSAP®), first introduced in the 1970s[11, 12]. MKSAP®[11] closely resembles the official American Board of Internal Medicine (ABIM) multiple choice question format and style and has been successfully used to evaluate knowledge and analyze currency of ABIM® diplomates[13]. Predictive validity for the ABIM® exam has been demonstrated in the past[14]. The internal medicine exam performance of both the control and the tablet group was tested by administering 215 questions, selected by a random generator from a pool of 1400 and equally distributed across all eleven MKSAP® categories (Foundations of Internal Medicine, Cardiovascular Medicine, Gastroenterology and Hepatology, Rheumatology, Neurology, Hematology and Oncology, Infectious Diseases, Pulmonary and Critical Care Medicine, General Internal Medicine, Endocrinology and Metabolism) parsed from the current ACP’s MKSAP® digital edition pool.
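
The selection step amounts to proportional stratified sampling (the category distribution of the pool was maintained; see Discussion). Below is a possible reconstruction in Python; it is our illustration of the described procedure, not the original random generator.

```python
import random
from collections import defaultdict

def stratified_sample(questions, n_total=215, seed=None):
    """Draw n_total questions while preserving each category's share of the
    pool (proportional stratified sampling). This is a reconstruction of the
    selection described in the paper, not the original random generator.
    Each question is a dict with at least a 'category' key."""
    rng = random.Random(seed)
    by_category = defaultdict(list)
    for q in questions:
        by_category[q["category"]].append(q)

    pool_size = len(questions)
    sample = []
    for items in by_category.values():
        k = round(n_total * len(items) / pool_size)  # proportional share
        sample.extend(rng.sample(items, min(k, len(items))))

    # Rounding can leave the draw a few questions short or long; trim or
    # top up uniformly at random so exactly n_total are administered.
    if len(sample) > n_total:
        sample = rng.sample(sample, n_total)
    else:
        leftover = [q for q in questions if q not in sample]
        sample += rng.sample(leftover, n_total - len(sample))
    return sample
```

With the study parameters (1400 questions, 215 drawn), each category contributes roughly its pool share times 215 questions; the random trim or top-up only corrects rounding leftovers.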

Our rationale for using MKSAP® was its proven track record in evaluating internal medicine knowledge. Although primarily designed for residents, we felt that final year medical students, i.e. acting interns, could be reliably tested with it as well. In our opinion, the benefit of using a vetted, validated questionnaire such as MKSAP® outweighed its potential limitations and was preferable to designing a brand new knowledge assessment tool.

Data processing and statistical analysis

All statistical analyses were performed with SPSS 22 (IBM, Armonk, NY, USA) software. For descriptive statistics, means and standard deviations, medians and interquartile ranges (IQR), or absolute and relative frequencies are reported where applicable. Data are expressed in box plots. Both the Mann-Whitney U test[15] and Fisher’s exact test[16] were used to compare participant profile data. The t-test for independent samples or one-way ANOVA was used to test associations of participants’ characteristics with their final score. All variables with a p-value < 0.1 were also tested in a multiple regression model for the final score values. For the multiple regression model, self-rated knowledge was dichotomized into excellent/good vs. passable/adequate. Additionally, t-tests for related samples were employed to check for significant differences between the mean initial and final exam scores. A two-sided significance level of 0.05 was used. The main hypothesis was the existence of group differences in final scores after accounting for baseline scores and possible confounders. All other tests were secondary. No adjustment for multiple testing was applied.
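
The analyses were run in SPSS; purely as an illustration, the core tests could be reproduced along the following lines in Python with scipy and statsmodels, assuming a hypothetical per-participant data frame with columns such as group, age, male, initial, final and knowledge_excellent (all names are ours, not from the study database).

```python
import pandas as pd
import scipy.stats as st
import statsmodels.formula.api as smf

def analyze(df):
    """df: one row per participant; column names are illustrative only."""
    tablet = df[df.group == "tablet"]
    control = df[df.group == "control"]

    # Profile comparison: Mann-Whitney U test for ordinal/continuous variables.
    u_stat, p_age = st.mannwhitneyu(tablet.age, control.age)

    # Profile comparison: Fisher's exact test on a 2x2 frequency table.
    odds, p_gender = st.fisher_exact(pd.crosstab(df.group, df.male).values)

    # Association of group membership with the final score
    # (independent-samples t-test).
    t_stat, p_final = st.ttest_ind(tablet.final, control.final)

    # Within-group change between initial and final exam (paired t-test).
    t_rel, p_change = st.ttest_rel(tablet.final, tablet.initial)

    # Multiple regression of the final score, adjusted for the baseline
    # score, with dichotomized self-rated knowledge as a covariate.
    model = smf.ols("final ~ initial + C(group) + knowledge_excellent",
                    data=df).fit()
    return model
```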

Results

Participant flow and recruitment

We recruited 80 participants for this study between 2008 and 2012. Data of 55 participants (tablet n = 24, 50% male; controls n = 31, 25.8% male; median age 28 years) were evaluable and analyzed. The remaining participants’ data were incomplete and were excluded from the analysis (Fig 1).

Participant profiles

Socio-demographics.

Most participants were German nationals. There were no statistically significant differences in age, gender or educational background (Table 1).

Table 1. Participants’ demographics.

Parameters were evaluated for statistically significant differences by either Fisher’s exact test or the Mann-Whitney U test (#), respectively.

https://doi.org/10.1371/journal.pone.0172827.t001

Exposure to US or other foreign medical education.

A fifth of participants had received medical education in foreign countries, namely Argentina, Chile, France, Iceland, Italy, Malawi, Russia, Spain, Sweden, Switzerland, the United Kingdom and the United States. However, while many participants were familiar with the term United States Medical Licensing Examination (USMLE®), only three participants had actually received medical training in the US, and none had ever taken the exam (Table 1).

Computer affinity and skills.

Most participants owned at least one computer, which was a notebook or laptop in half of the cases. However, they mostly used it at home or at work and only in less than a quarter of cases in other campus locations (Table 1).

Current exposure to eLearning and preferred medical problem solving resources.

Participants’ exposure to eLearning prior to this study was very limited, with one year of experience on average. Their favorite sources for medical problem solving were still journal articles, preferably looked up on PubMed or Google, and books (Table 1).

Self-rated internal medicine knowledge.

The majority of participants in both the control and Tablet PC groups rated their internal medicine knowledge as “passable” or “good” at entry into the study. Only one participant (control group) rated their knowledge as excellent (Table 1).

Outcomes and estimation

Improved exam performance in the tablet group.

The final mean MKSAP® score was higher in the tablet group (mean (SD): 59 (19)) than in the control group (mean (SD): 48 (10)) (p<0.001; Table 2, Fig 2).

Table 2. Bivariate analysis to identify characteristics associated with final MKSAP® score.

https://doi.org/10.1371/journal.pone.0172827.t002

Fig 2. Statistically significant improvement of MKSAP® scores in the tablet but not the control group.

Control group (n = 31): mean MKSAP® score Δ −7 (SD: 11). Tablet group (n = 24): mean MKSAP® score Δ +8 (SD: 11). The overall result is also reflected in the distribution of median initial-to-final MKSAP® score changes grouped by medical subject categories (Fig 3).

https://doi.org/10.1371/journal.pone.0172827.g002

Fig 3. MKSAP® score distribution.

Initial and final score change distribution by subject categories in the control (n = 31) and tablet (n = 24) groups. Error bars denote 95% CI.

https://doi.org/10.1371/journal.pone.0172827.g003

Characteristics associated with improved exam performance.

At the bivariate level, of all variables tested, only Tablet PC use and self-rated excellent internal medicine knowledge at baseline had a significant impact on the final exam score (Table 2). After adjustment for baseline score, Tablet PC knowledge and self-rated excellent internal medicine knowledge, the tablet group showed on average 11% higher MKSAP® test results compared to the control group (p<0.001, main hypothesis; Table 3).

Table 3. Multiple regression for final MKSAP® score (stepwise variable selection procedure using only significant variables in final model, adjusted for baseline score value), n = 55, R2 = 0.58.

https://doi.org/10.1371/journal.pone.0172827.t003

Discussion

We demonstrate for the first time that, in a prospective cohort of final year medical students and residents doing an internal medicine inpatient service rotation at an academic medical center, the use of a wireless tablet computer based integrated education and portable hospital workstation significantly improves board style exam (MKSAP®) performance. This held true even after adjustment for baseline score, Tablet PC knowledge and self-rated excellent internal medicine knowledge.

Unsurprisingly, the overall absolute MKSAP® scores at both time points and in both the control and the tablet group were lower than the US national average[17]. This is likely owed to the fact that none of the participants in our study were native English speakers and their exposure to US medical education was very limited. Furthermore, unlike US medical students, residents and foreign (international) medical graduates in the US, none had ever taken MKSAP® or ABIM® exams before or participated in the regular in-house exams with comparable questions that are very commonly administered in the US. Moreover, none of the participants practiced MKSAP®, USMLE® or ABIM® style exams before or during this study either.

Being naïve to this exam type and to prior US medical education can also be considered a strength of our study. Achieving a maximum score was not the goal here; rather, we sought to investigate whether the educational tablet system would improve exam performance, i.e. whether it has a significant impact on internal medicine knowledge, which our results confirmed.

Interestingly, the scores significantly worsened in the control group. Perhaps their motivation was lower due to the lack of the incentive of an otherwise desired technical device, which may have been an additional stimulus beyond the actual education software in the tablet computer group. While commonly employed and well proven according to some[18, 19], measurements and metrics may actually also deteriorate individual physician performance[20, 21]. Our study design was unable to detect any such effect. The difference between the Tablet PC and control groups could, however, not be attributed to other socio-demographic factors or computer affinity surrogates either.

Our study furthermore demonstrates that the improved exam performance was significantly associated with participants’ self-rated internal medicine baseline knowledge. This appears plausible, as a technical education tool obviously cannot replace prior factual medical knowledge acquisition or supersede basic pedagogic principles.

Our work has limitations. The number of evaluable cases was small and thus the data of this pilot study need to be validated in larger series before conclusions can be generalized. We also experienced the problem of participant attrition, well known from major educational research studies[22]. The high drop-out rate may relate to the voluntary nature of study participation and perhaps to perceiving the extra exams or the device as a burden. At the same time, the Hawthorne effect[23], also known as the observer effect[24], i.e. the reactivity in which study participants modify or improve aspects of their behavior (here, exam performance) in response to their awareness of being observed, may have played a role. We did not control our analysis for this effect, and the sample size was likely too small to address this issue. Moreover, only a computer defined random selection of 215 out of all 1400 MKSAP® questions was administered per exam. To avoid skewing of the selection, the category distribution was maintained. Still, the MKSAP® edition we used was designed for residents and could have potentially overwhelmed some of the participating final year medical students.

The impact of computer enhanced education on board style exams has been studied before. One group compared scores on preceptor evaluations, the National Board of Medical Examiners (NBME) subject exam, and a standardized patient (SP)-based exam between students who completed assigned web cases and students who did not. The authors controlled for prior academic performance and clerkship timing using US Medical Licensing Exam (USMLE) Step 1 scores and rotation order. They reported that students completing the web case assignment scored higher on the NBME subject exam and the SP-based exam[25]. Another study examined the impact of a computer-based program in which residents receive a score on a Likert-type scale from an attending for each precept, based on their knowledge base. The authors found a significant correlation between the residents’ Likert scale scores and their American Board of Family Medicine In-Training Exam scores[26]. Judging from a study in emergency medicine, it appears that the positive impact of computers is probably independent of the exam style (computerized vs. oral). The authors observed no differences between virtual and traditional groups on critical action scores or scores on eight competency categories[27].

The use of multimedia materials has also been studied in dental students, who often have difficulty understanding the importance of basic science classes, such as physiology, for their future careers. The authors reported a significant improvement in unit exam scores[28].

Exam performance without a practical clinical skill level assessment does not automatically translate into superior performance in residency or fellowship programs[29]. The utility of educational games (although our system was not programmed as a game) as a teaching strategy for healthcare professionals remains undetermined according to a recent Cochrane analysis[30]. Moreover, different types of physicians have different needs and preferences for evidence-based resources and handheld devices[31]. Another aspect we could not address in our study was demonstrating a link to improved patient outcomes[32–34].

In summary, our study provides evidence that tablet computer based integrated training and clinical practice enhances medical education and exam performance. Larger, multicenter trials are required to independently validate our data. Residency and fellowship directors are encouraged to consider adding computer devices and multimedia content and introducing blended learning to their respective training programs.

Supporting information

S1 File. Raw data.

This file contains the study raw data, except for any potentially personally identifying information, which was removed to meet German federal privacy legislation requirements.

https://doi.org/10.1371/journal.pone.0172827.s001

(SAV)

Acknowledgments

D.C.B. is a fellow of the Berlin Institute of Health (BIH), supported by Stiftung Charité.

Author Contributions

  1. Conceived and designed the experiments: DCB IW.
  2. Performed the experiments: DCB IW.
  3. Analyzed the data: IW UG DCB.
  4. Contributed reagents/materials/analysis tools: IW UG.
  5. Wrote the paper: DCB IW UG.

References

  1. McMahon GT, Katz JT, Thorndike ME, Levy BD, Loscalzo J. Evaluation of a redesign initiative in an internal-medicine residency. N Engl J Med. 2010;362(14):1304–11. pmid:20375407
  2. Baumgart DC. Personal digital assistants in health care: experienced clinicians in the palm of your hand? Lancet. 2005;366(9492):1210–22. pmid:16198770
  3. Baumgart DC. Smartphones in clinical practice, medical education, and research. Arch Intern Med. 2011;171(14):1294–6. pmid:21788549
  4. Payne KB, Wharrad H, Watts K. Smartphone and medical related App use among medical students and junior doctors in the United Kingdom (UK): a regional survey. BMC Med Inform Decis Mak. 2012;12:121. pmid:23110712
  5. Smart NJ. A survey of smartphone and tablet computer use by colorectal surgeons in the UK and Continental Europe. Colorectal Dis. 2012;14(9):e535–8. pmid:22747977
  6. Franko OI, Tirrell TF. Smartphone app use among medical providers in ACGME training programs. J Med Syst. 2012;36(5):3135–9. pmid:22052129
  7. Hewlett Packard. Overview HP Compaq tc4200 Tablet PC. Accessed 22.02.2017. http://h20565.www2.hp.com/hpsc/swd/public/readIndex?sp4ts.oid=457860&lang=en&cc=us
  8. IBM (Lenovo). Overview ThinkPad X41. Accessed 22.02.2017. http://pcsupport.lenovo.com/de/de/products/Laptops-and-netbooks/ThinkPad-X-Series-Tablet-laptops/ThinkPad-X41-Tablet
  9. Nelson SJ, Cole WG, Tuttle MS, Olson NE, Sherertz DD. Recognizing new medical knowledge computationally. Proc Annu Symp Comput Appl Med Care. 1993:409–13. pmid:8130505
  10. Nelson SJ, Olson NE, Fuller L, Tuttle MS, Cole WG, Sherertz DD. Identifying concepts in medical knowledge. Medinfo. 1995;8 Pt 1:33–6.
  11. Huth EJ. MKSAP IV: a professional commitment. Ann Intern Med. 1976;85(5):675–6. pmid:984625
  12. Burg FD, Grosse ME, Kay CF. A national self-assessment program in internal medicine. Ann Intern Med. 1979;90(1):100–7. pmid:420438
  13. Meskauskas JA, Webster GD. The American Board of Internal Medicine recertification examination: process and results. Ann Intern Med. 1975;82(4):577–81. pmid:1119774
  14. Ramsey PG, Carline JD, Inui TS, Larson EB, LoGerfo JP, Wenrich MD. Predictive validity of certification by the American Board of Internal Medicine. Ann Intern Med. 1989;110(9):719–26. pmid:2930109
  15. Mann HB, Whitney DR. On a Test of Whether one of Two Random Variables is Stochastically Larger than the Other. Ann Math Statist. 1947;18(1):50–60.
  16. Fisher RA. On the Interpretation of χ2 from Contingency Tables, and the Calculation of P. Journal of the Royal Statistical Society. 1922;85(1):87–94.
  17. American College of Physicians. Performance Interpretation Guidelines with Norm Tables MKSAP® 14. Accessed 22.02.2017. https://mksap.acponline.org/14/norm_tables_14.pdf
  18. Brateanu A, Yu C, Kattan MW, Olender J, Nielsen C. A nomogram to predict the probability of passing the American Board of Internal Medicine examination. Med Educ Online. 2012;17:18810. pmid:23078794
  19. Rose SH, Long TR. Accreditation Council for Graduate Medical Education (ACGME) annual anesthesiology residency and fellowship program review: a "report card" model for continuous improvement. BMC Med Educ. 2010;10:13. pmid:20141641
  20. Cassel CK, Jain SH. Assessing individual physician performance: does measurement suppress motivation? JAMA. 2012;307(24):2595–6. pmid:22735427
  21. Carraccio C, Englander R, Adams D, Giudice E, Olsen AG. Is there a gap between program directors' expectations and residents' performance? Comparing predictions with outcomes for competence in patient care. Acad Med. 2010;85(7):1152–6. pmid:20592511
  22. Kalet A, Ellaway RH, Song HS, Nick M, Sarpel U, Hopkins MA, et al. Factors influencing medical student attrition and their implications in a large multi-center randomized education trial. Adv Health Sci Educ Theory Pract. 2013;18(3):439–50. pmid:22869047
  23. McCarney R, Warner J, Iliffe S, van Haselen R, Griffin M, Fisher P. The Hawthorne Effect: a randomised, controlled trial. BMC Med Res Methodol. 2007;7:30. pmid:17608932
  24. Monahan T, Fisher JA. Benefits of "Observer Effects": Lessons from the Field. Qual Res. 2010;10(3):357–76. pmid:21297880
  25. Shokar GS, Burdine RL, Callaway M, Bulik RJ. Relating student performance on a family medicine clerkship with completion of Web cases. Fam Med. 2005;37(9):620–2. pmid:16193415
  26. Post RE, Jamena GP, Gamble JD. Using Precept-Assist(R) to predict performance on the American Board of Family Medicine In-Training Examination. Fam Med. 2014;46(8):603–7. pmid:25163038
  27. McGrath J, Kman N, Danforth D, Bahner DP, Khandelwal S, Martin DR, et al. Virtual alternative to the oral examination for emergency medicine residents. West J Emerg Med. 2015;16(2):336–43. pmid:25834684
  28. Miller CJ, Metz MJ. Can Clinical Scenario Videos Improve Dental Students' Perceptions of the Basic Sciences and Ability to Apply Content Knowledge? J Dent Educ. 2015;79(12):1452–60. pmid:26632300
  29. McKinley DW, Hess BJ, Boulet JR, Lipner RS. Examining changes in certification/licensure requirements and the international medical graduate examinee pool. Adv Health Sci Educ Theory Pract. 2013.
  30. Akl EA, Sackett KM, Erdley WS, Mustafa RA, Fiander M, Gabriel C, et al. Educational games for health professionals. Cochrane Database Syst Rev. 2013;1:CD006411.
  31. Lottridge DM, Chignell M, Danicic-Mizdrak R, Pavlovic NJ, Kushniruk A, Straus SE. Group differences in physician responses to handheld presentation of clinical evidence: a verbal protocol analysis. BMC Med Inform Decis Mak. 2007;7:22. pmid:17655759
  32. Holmboe ES, Hess BJ, Conforti LN, Lynn LA. Comparative trial of a web-based tool to improve the quality of care provided to older adults in residency clinics: modest success and a tough road ahead. Acad Med. 2012;87(5):627–34. pmid:22450173
  33. Hess BJ, Weng W, Lynn LA, Holmboe ES, Lipner RS. Setting a fair performance standard for physicians' quality of patient care. J Gen Intern Med. 2011;26(5):467–73. pmid:21104453
  34. Weifeng W, Hess BJ, Lynn LA, Holmboe ES, Lipner RS. Measuring physicians' performance in clinical practice: reliability, classification accuracy, and validity. Eval Health Prof. 2010;33(3):302–20. pmid:20801974