
The effect of electronic health record software design on resident documentation and compliance with evidence-based medicine

  • Yasaira Rodriguez Torres,

    Roles Conceptualization, Data curation, Investigation, Methodology, Project administration, Supervision, Writing – original draft, Writing – review & editing

    yrodrigu@med.wayne.edu

    Affiliation Department of Ophthalmology, Kresge Eye Institute, Detroit, Michigan, United States of America

  • Jordan Huang,

    Roles Conceptualization, Data curation, Formal analysis, Methodology, Software, Validation, Writing – original draft, Writing – review & editing

    Affiliation School of Medicine, Wayne State University, Detroit, Michigan, United States of America

  • Melanie Mihlstin,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Validation, Writing – review & editing

    Affiliation Department of Ophthalmology, Kresge Eye Institute, Detroit, Michigan, United States of America

  • Mark S. Juzych,

    Roles Conceptualization, Data curation, Investigation, Methodology, Supervision, Writing – review & editing

    Affiliation Department of Ophthalmology, Kresge Eye Institute, Detroit, Michigan, United States of America

  • Heidi Kromrei,

    Roles Conceptualization, Data curation, Formal analysis, Methodology, Supervision, Writing – review & editing

    Affiliation Department of Ophthalmology, Kresge Eye Institute, Detroit, Michigan, United States of America

  • Frank S. Hwang

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Supervision, Writing – review & editing

    Affiliation Department of Ophthalmology, Kresge Eye Institute, Detroit, Michigan, United States of America

Abstract

This study aimed to determine the role of electronic health record software in resident education by evaluating documentation of 30 elements extracted from the American Academy of Ophthalmology Dry Eye Syndrome Preferred Practice Pattern. The Kresge Eye Institute transitioned to a new version of its electronic health record software in June 2013. We evaluated the charts of 331 patients examined in the resident ophthalmology clinic between September 1, 2011, and March 31, 2014, for an initial evaluation for dry eye syndrome. We compared ophthalmology residents’ documentation rates for the 30 evidence-based elements between the two electronic health record chart note template versions. Overall, significant changes in documentation occurred when transitioning to the new version of the electronic health record software, with average compliance ranging from 67.4% to 73.6% (p < 0.0005). Electronic Health Record A had high compliance (>90%) in 13 elements while Electronic Health Record B had high compliance (>90%) in 11 elements. The presence of dialog boxes was responsible for significant changes in documentation of adnexa, puncta, proptosis, skin examination, contact lens wear, and smoking exposure. Significant differences in documentation were correlated with electronic health record template design rather than with individual residents or their year in training. Our results show that electronic health record template design influences documentation across all resident years. Decreased documentation likely results from “mouse click fatigue,” as residents had to access multiple dialog boxes to complete documentation. These findings highlight the importance of EHR template design in improving resident documentation and the integration of evidence-based medicine into clinical notes.

Introduction

The introduction of electronic health records (EHR) to graduate medical education has the potential to aid residency programs in complying with the competencies required by the Accreditation Council for Graduate Medical Education. The ability of an EHR system to provide access to comprehensive clinical data, to improve daily workflow by decreasing clerical tasks, and to support monitoring of residents’ learning experience was key to gaining its acceptance [1–5].

Despite decades of EHR implementation, our knowledge about their impact on residents’ learning outcomes remains limited. Most published articles consist of mixed positive and negative perceptions, anecdotes, and clinical experiences. The impact of EHR on residents’ clinical notes is mixed [6–10]. Previous studies have shown that transitioning to EHR can improve the quality of clinical notes, yet their pre-formatted templates, copy–paste functions, and auto-filled data have an unknown impact on residents’ clinical reasoning [11]. As the focus on EHR and physician quality reporting continues to expand, it is therefore imperative to evaluate the influence of EHR software design and its upgrades on resident education.

In this study, we aimed to measure the impact of EHR transition and software design on resident education. We used evidence-based elements provided by the American Academy of Ophthalmology (AAO) Preferred Practice Patterns (PPPs) for Dry Eye Syndrome (DES) to measure the quality of the clinical notes. Previous studies examining compliance with PPPs have shown that residents may be at risk for decreased compliance with evidence-based guidelines [12–16]. Monitoring adherence to PPPs provides a guideline for quality patient care and an educational tool for resident education. The goal of our study was to determine if EHR software design played a role in residents’ documentation and compliance with the AAO PPPs for DES.

Methods

The Institutional Review Board at Wayne State University approved the study protocol. The Kresge Eye Institute (KEI) provided the setting for this study as we underwent a major EHR software upgrade in June of 2013. The EHR database was reviewed for adult patients 18 years old and older with an International Classification of Diseases, Ninth Revision, diagnosis of DES (375.15) examined in the KEI resident clinic between September 1, 2011, and March 31, 2014. Patient charts were excluded if they had a prior diagnosis of DES before September of 2011. One thousand three hundred and seven (1,307) patient charts were identified, and randomization was performed using Excel software (Microsoft, Redmond, WA). A total of 331 patient charts, completed by 17 resident physicians, were selected for this study. The KEI ophthalmology residency had seven residents in each year of its three-year program. Residents were categorized based on postgraduate years (PGY); therefore, first-year residents are PGY2, second-year residents are PGY3, and third-year residents are PGY4. The patient charts selected for the study were evaluated for demographic data and documentation of 30 elements extracted from the AAO PPPs on DES (2011 edition) [12]. These elements were chosen based on their level of evidence and relevance to the diagnosis, treatment, and patient education of DES [12]. The elements were grouped into four sections that comprise the ophthalmological clinical note: past medical history, physical exam, management, and patient education. The EHR software utilized in this study was NextGen; there was no auto-population or pre-population of elements in the EHR software. We labeled Electronic Health Record A (EHR-A) as the software used before June of 2013 and Electronic Health Record B (EHR-B) as the upgraded software thereafter.
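For readers who want the selection logic in concrete form, the short sketch below mirrors the inclusion and exclusion criteria and the random down-sampling from 1,307 eligible charts to 331. The study performed randomization in Excel; this Python version and its column names (age, icd9, prior_des) are hypothetical stand-ins, not the authors' actual procedure.

```python
# Illustrative sketch of the chart-selection step. Column names are
# assumptions chosen for readability, not fields from the actual EHR export.
import pandas as pd

def select_study_charts(charts: pd.DataFrame, n: int = 331, seed: int = 42) -> pd.DataFrame:
    """Apply the stated inclusion/exclusion criteria, then randomly sample n charts."""
    eligible = charts[
        (charts["age"] >= 18)            # adults 18 years old and older
        & (charts["icd9"] == "375.15")   # ICD-9 diagnosis of dry eye syndrome
        & (~charts["prior_des"])         # exclude prior DES diagnosis before Sept 2011
    ]
    # 1,307 eligible charts were identified; 331 were randomly selected.
    return eligible.sample(n=n, random_state=seed)
```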

Documentation of elements was collected from all the charts and then categorized as compliant, non-compliant, or not applicable. Information was determined to be not applicable in some cases; for example, “menopausal” only applied to women, and “referral given” only applied if systemic symptoms were present. Data were analyzed using SPSS statistical analysis software version 21 (IBM, Armonk, NY), and Excel version 2013 was used for data tabulation. Average compliance and standard deviation were calculated in Excel and verified with SPSS to ensure accuracy. Generalized Multivariate Analysis of Variance (GMANOVA) in SPSS was used to examine each element as compliant or non-compliant between the major units of analysis, EHR-A and EHR-B. Statistical analysis was also performed for potential confounding variables such as individual resident compliance and resident level of training (i.e., PGY year) between EHR-A and EHR-B. A subset analysis using an unpaired Student t-test was used to evaluate independent factors between these groups, including patient demographics (age, race, and gender).
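As a concrete illustration of this pipeline, the sketch below computes per-chart compliance (excluding not-applicable elements from the denominator) and runs an unpaired Student t-test on a demographic variable. The study used SPSS and Excel; this Python translation, with synthetic data matching the reported age summaries, is an assumption for illustration only.

```python
# Minimal sketch of the compliance calculation and the unpaired Student
# t-test used for demographic comparisons; not the authors' SPSS/Excel code.
import numpy as np
import pandas as pd
from scipy import stats

def chart_compliance(elements: pd.Series) -> float:
    """Percent of applicable elements documented in one chart.

    Values: 1 = compliant, 0 = non-compliant, NaN = not applicable
    (e.g., 'menopausal' for male patients)."""
    return 100.0 * elements.dropna().mean()

# Unpaired t-test on patient age between EHR groups, mirroring the reported
# comparison (EHR-A: 49.38 +/- 17.09, n = 213; EHR-B: 49.69 +/- 17.13, n = 118).
rng = np.random.default_rng(0)
age_a = rng.normal(49.38, 17.09, 213)  # synthetic stand-in for EHR-A ages
age_b = rng.normal(49.69, 17.13, 118)  # synthetic stand-in for EHR-B ages
t, p = stats.ttest_ind(age_a, age_b)   # two-sided, unpaired
print(f"t = {t:.3f}, p = {p:.3f}")
```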

Results

Descriptive results

A total of 331 patients with a diagnosis of DES in their charts were evaluated by 17 ophthalmology residents, each of whom evaluated patients in both EHR-A and EHR-B. Each resident evaluated an average of 19.47 ± 7.48 study visits. There were 213 charts in the EHR-A group and 118 in the EHR-B group. The mean age of the patients was 49.38 ± 17.09 years in EHR-A and 49.69 ± 17.13 years in EHR-B (p = 0.863). There was no significant difference in gender in this study (p = 0.676), with 228 male (68.88%; 146 in EHR-A; 82 in EHR-B) and 103 female participants (31.12%; 67 in EHR-A; 36 in EHR-B). Table 1 contains a summary of the differences in template design between EHR-A and EHR-B.

Table 1. Differences in documentation methods of different chart note template versions.

https://doi.org/10.1371/journal.pone.0185052.t001

Resident compliance results

The mean compliance for the 30 elements was 69.62 ± 7.59% (n = 331) for all charts, 68.44 ± 6.42% (n = 146) for PGY2, 70.28 ± 7.48% (n = 63) for PGY3, and 70.69 ± 8.72% (n = 122) for PGY4. The differences in total mean compliance between residency years were not statistically significant (p = 0.122). A summary of the results showing the documentation percentage for the 30 elements in each residency year is included in Table 2. Documentation rates were high (>90%) for 12 elements, which included 6 elements of past medical history, 5 of physical exam, and 1 of patient education. There was no statistically significant difference for any of the elements analyzed based on resident year or individual resident. There were no data recorded for menopause or for cautioning patients that LASIK may worsen DES; however, none of the patients in the study were noted to be considering LASIK at the time of diagnosis with DES.

EHR-based compliance results

Charts were divided according to EHR-A or EHR-B, showing a significant change in mean compliance from EHR-A (n = 213) to EHR-B (n = 118), with an increase from 67.39 ± 6.20% in EHR-A to 75.63 ± 10.63% in EHR-B (p < 0.0005). A summary of the results showing the documentation percentage for the 30 elements per EHR is presented in Table 3. Overall, EHR-A had high compliance (>90%) in 13 elements while EHR-B had high compliance (>90%) in 11 elements. The transition from EHR-A to EHR-B led to significant differences in documentation, with an increase in documenting contact lens wear from 10.33 ± 30.50% to 56.41 ± 49.80% (p < 0.0005) as well as in documenting adnexa, proptosis, and puncta, each increasing from 2.82 ± 16.58% to 82.20 ± 38.41% (p < 0.0005). Conversely, a significant decrease occurred in documentation of smoking exposure from 98.59 ± 11.81% to 78.81 ± 41.04% (p < 0.0005), presence of allergies from 95.31 ± 21.20% to 88.98 ± 31.44% (p = 0.033), and skin examination from 94.37 ± 23.11% to 55.93 ± 49.86% (p < 0.0005). The remaining elements did not show statistically significant changes between EHR versions.
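To make the element-level comparison concrete, the sketch below tests a single element (contact lens wear) between EHR-A and EHR-B. The paper's analysis used GMANOVA; the chi-square test on a 2x2 compliance table shown here is a deliberately simpler stand-in, and the counts are approximate reconstructions from the reported percentages, not the study's raw data.

```python
# Simplified per-element comparison in the spirit of Table 3. Counts are
# approximated from the reported rates for "contact lens wear"
# (EHR-A: ~10.33% of 213 charts; EHR-B: ~56.41% of 118 charts, rounded).
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([
    [22, 191],  # EHR-A: compliant, non-compliant
    [67, 51],   # EHR-B: compliant, non-compliant
])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.1f}, p = {p:.2g}")  # large chi2, p far below 0.0005
```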

Table 3. Documentation percentage of different chart note template versions.

https://doi.org/10.1371/journal.pone.0185052.t003

Additionally, charts were divided according to the residents’ years in training and compared between EHR chart note templates. A summary of the results showing differences in documentation is presented in S1 Table. When these changes were evaluated based on EHR template design, most remained statistically significant. The results show that significant changes in documentation were related not to the resident year in training but, rather, to EHR template design.

Discussion

Our study is unique in that we measured the effects of EHR software design on resident documentation and compliance with evidence-based medicine. The results showed an increase in overall documentation from 67.42% in EHR-A to 73.57% in EHR-B (p < 0.001). One of the most noticeable changes with the upgrade from EHR-A to EHR-B was an increase in point-and-click EHR interfaces and dialog boxes. The addition of dialog boxes decreased the number of elements with high compliance (>90%): EHR-A had high compliance in 13 elements while EHR-B had high compliance in 11 elements. We postulate these changes were related to template design and the addition of dialog boxes. For example, both the EHR-A and EHR-B chart note templates included fields for documentation of smoking exposure. Yet, after the software upgrade, EHR-B required the residents to open a separate dialog box under the History section to document smoking exposure. The presence of dialog boxes was responsible for significant changes in documentation of adnexa, puncta, proptosis, skin findings, contact lens use, and smoking exposure.

These findings highlight the importance of EHR software design and its influence on physician documentation. In our study, decreased documentation likely resulted from “mouse click fatigue,” as residents had to access multiple dialog boxes. This phenomenon affected all residents regardless of year in training. Conversely, documentation remained stable for elements whose EHR template fields were readily available in both EHR-A and EHR-B. These findings are comparable to a study by Sanders et al., which reported increased documentation time in their ophthalmology ambulatory clinic and operating room after EHR implementation due to point-and-click EHR interfaces [17]. This phenomenon is not isolated to ophthalmology, as a recent survey of physicians found generalized frustration with EHR on-screen boxes, leading to modifications in EHR template design [18]. It is important to note that the mouse click fatigue we observed is very similar to alert fatigue, a phenomenon in which clinically insignificant reminders lead to a paradoxical increase in patient safety hazards. Therefore, EHR software design must be carefully evaluated for these phenomena as they can decrease physician compliance with standards of care [19].

Within residency training programs, monitoring compliance with guidelines ensures that residents provide quality patient care early in their careers and integrate evidence-based medicine into their practice [15, 20, 21]. Increasing compliance and minimizing EHR barriers are important as they can lead to fewer patient complications and possible reduction in overall cost of medical care [20, 22]. Academic institutions and program directors should be aware of barriers to appropriately plan for these major technological transitions and minimize adverse effects [23, 24]. Our study provided a measure of residents’ knowledge of DES and use of patient education and identified EHR barriers to delivering quality documentation. These results provide objective evidence that can aid in improving the quality of graduate medical education, which can subsequently result in direct improvement of patient care.

This study had several limitations. It was based on a single ophthalmology resident clinic at one academic institution; thus, the results may not be generalizable to other clinic settings. The EHR software versions compared in the study are among many commercially available EHR programs, and our compliance results may not be generalizable to other EHR software. This was a retrospective chart review; hence, data were limited to what was documented in the EHR. Lastly, the number of charts documented by individual residents was low, limiting the ability to detect significant differences in individual resident documentation. As in other published ophthalmology resident compliance studies, we were unable to correlate our findings with clinical outcomes. Therefore, future trials are needed to study the correlation between residents’ adherence to evidence-based guidelines and improved patient care.

Our study has several strengths. It included a sample of 331 charts spanning four calendar years (2011–2014) and 17 residents at different years of training. All the elements measured in our study are supported by a body of evidence and were extracted directly from the AAO PPPs on DES (2011 edition). This is the first study to demonstrate that EHR template design can significantly affect the quality of clinical notes documented by residents.

Conclusions

The content and quality of the EHR chart note template play important roles in guiding documentation. EHR design factors can be responsible for the success or failure of adherence to evidence-based guidelines [25, 26]. Therefore, as the focus on EHR and physician quality reporting continues to expand, it becomes imperative to evaluate the influence of EHR software design and its upgrades on resident education. Our study shows that EHR software design has a significant impact on the quality of residents’ clinical notes.

Before this study and other PPP-related studies performed in our clinic, there was limited emphasis on the practice and implementation of PPP guidelines in our clinic. Awareness of the impact of EHR design and continued emphasis on PPPs have led to EHR modifications, the first being a link within the EHR template to the AAO PPP guidelines and references. Additionally, an educational workshop on PPPs was implemented last year, allowing residents to self-evaluate their compliance with PPPs and providing the opportunity for practice-based learning. Currently, our clinical transformation team is working on modifying documentation templates to include key elements from the PPPs. Future studies will determine the impact of these interventions on compliance and how compliance relates to improvement of patient care. Additional studies will include coordination with EHR companies to create templates for graduate medical education that aid in developing resident competencies and achievement of milestones.

Supporting information

Acknowledgments

This project was funded in part by an unrestricted/challenge department grant to Kresge Eye Institute by Research to Prevent Blindness.

References

  1. Batalden P, Leach D, Swing S, Dreyfus H, Dreyfus S. General competencies and accreditation in graduate medical education. Health Aff. 2002;21(5): 103–111.
  2. Garg AX, Adhikari NK, McDonald H, Rosas-Arellano MP, Devereaux P, Beyene J, et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA. 2005;293(10): 1223–1238. pmid:15755945
  3. Hier DB, Rothschild A, LeMaistre A, Keeler J. Differing faculty and housestaff acceptance of an electronic health record. Int J Med Inform. 2005;74(7): 657–662.
  4. Van Eaton EG, Horvath KD, Lober WB, Rossini AJ, Pellegrini CA. A randomized, controlled trial evaluating the impact of a computerized rounding and sign-out system on continuity of care and resident work hours. J Am Coll Surg. 2005;200(4): 538–545. pmid:15804467
  5. Sequist TD, Singh S, Pereira AG, Rusinak D, Pearson SD. Use of an electronic medical record to profile the continuity clinic experiences of primary care residents. Acad Med. 2005;80(4): 390–394. pmid:15793025
  6. Schenarts PJ, Schenarts KD. Educational impact of the electronic medical record. J Surg Educ. 2012;69(1): 105–112. pmid:22208841
  7. Keenan CR, Nguyen HH, Srinivasan M. Electronic medical records and their impact on resident and medical student education. Acad Psychiatry. 2006;30(6): 522–527. pmid:17139024
  8. Tierney MJ, Pageler NM, Kahana M, Pantaleoni JL, Longhurst CA. Medical education in the electronic medical record (EMR) era: benefits, challenges, and future directions. Acad Med. 2013;88(6): 748–752. pmid:23619078
  9. Peled JU, Sagher O, Morrow JB, Dobbie AE. Do electronic health records help or hinder medical education? PLoS Med. 2009;6(5): e1000069. pmid:19434294
  10. Barnett JS. Incorporating electronic medical records into the physician assistant educational curriculum. J Physician Assist Educ. 2013;24(2): 48–54. pmid:23875499
  11. Burke HB, Sessums LL, Hoang A, Becher DA, Fontelo P, Liu F, et al. Electronic health records improve clinical note quality. J Am Med Inform Assoc. 2014: amiajnl-2014-002726.
  12. American Academy of Ophthalmology Cornea/External Disease Panel. Preferred practice pattern guidelines. Dry eye syndrome. San Francisco: American Academy of Ophthalmology; 2011.
  13. Tseng VL, Greenberg PB, Scott IU, Anderson KL. Compliance with the American Academy of Ophthalmology Preferred Practice Pattern for Diabetic Retinopathy in a resident ophthalmology clinic. Retina. 2010;30(5): 787–794. pmid:20168268
  14. Lin I-C, Gupta PK, Boehlke CS, Lee PP. Documentation of conformance to preferred practice patterns in caring for patients with dry eye. Arch Ophthalmol. 2010;128(5): 619–623. pmid:20457985
  15. Niemiec ES, Anderson KL, Scott IU, Greenberg PB. Evidence-based management of resident-performed cataract surgery: an investigation of compliance with a preferred practice pattern. Ophthalmology. 2009;116(4): 678–684. pmid:19268367
  16. Ong SS, Sanka K, Mettu PS, Brosnan TM, Stinnett SS, Lee PP, et al. Resident compliance with the American Academy of Ophthalmology preferred practice pattern guidelines for primary open-angle glaucoma. Ophthalmology. 2013;120(12): 2462–2469. pmid:23916487
  17. Sanders DS, Read-Brown S, Tu DC, Lambert WE, Choi D, Almario BM, et al. Impact of an electronic health record operating room management system in ophthalmology on documentation time, surgical volume, and staffing. JAMA Ophthalmol. 2014;132(5): 586–592. pmid:24676217
  18. Lowes R. EHR rankings hint at physician revolt. Medscape. 2014. http://www.medscape.com/viewarticle/820101?pa=WmItYeirt2YxQpAnU94VDrkE%2BB9NfL%2FKoQz0RsuaK1h%2FVZkZ25jMcO56VaPDQ1Oe43mU9jD%2B1DtnxY47OmyybA%3D%3D
  19. Demakis JG, Beauchamp C, Cull WL, Denwood R, Eisen SA, Lofgren R, et al. Improving residents' compliance with standards of ambulatory care: results from the VA Cooperative Study on Computerized Reminders. JAMA. 2000;284(11): 1411–1416. pmid:10989404
  20. Berg AO, Atkins D, Tierney W. Clinical practice guidelines in practice and education. J Gen Intern Med. 1997;12(s2): 25–33.
  21. Epling J, Smucny J, Patil A, Tudiver F. Teaching evidence-based medicine skills through a residency-developed guideline. Fam Med. 2002;34(9): 646–648.
  22. Field MJ. Setting priorities for clinical practice guidelines. Washington, D.C.: National Academies Press; 1995.
  23. Boonstra A, Broekhuis M. Barriers to the acceptance of electronic medical records by physicians from systematic review to taxonomy and interventions. BMC Health Serv Res. 2010;10(1): 1.
  24. Singh RP, Bedi R, Li A, Kulkarni S, Rodstrom T, Altus G, et al. The practice impact of electronic health record system implementation within a large multispecialty ophthalmic practice. JAMA Ophthalmol. 2015;133(6): 668–674. pmid:25880083
  25. Ahmed J, Mehmood S, Rehman S, Ilyas C, Khan L. Impact of a structured template and staff training on compliance and quality of clinical handover. Int J Surg. 2012;10(9): 571–574. pmid:22983018
  26. Shiffman RN, Liaw Y, Brandt CA, Corb GJ. Computer-based guideline implementation systems. J Am Med Inform Assoc. 1999;6(2): 104–114. pmid:10094063