
Physician communication coaching effects on patient experience

  • Adrianne Seiler,

    Adrianne.seiler@bhs.org

    Affiliations Department of Medicine, Baystate Medical Center, Springfield, Massachusetts, United States of America, Department of Medicine, Tufts University School of Medicine, Boston, Massachusetts, United States of America, Baycare Health Partners/Pioneer Valley ACO, Springfield, Massachusetts, United States of America

  • Alexander Knee,

    Affiliations Department of Medicine, Tufts University School of Medicine, Boston, Massachusetts, United States of America, Office of Research, Baystate Medical Center, Springfield, Massachusetts, United States of America

  • Reham Shaaban,

    Affiliations Department of Medicine, Baystate Medical Center, Springfield, Massachusetts, United States of America, Department of Medicine, Tufts University School of Medicine, Boston, Massachusetts, United States of America

  • Christine Bryson,

    Affiliations Department of Medicine, Baystate Medical Center, Springfield, Massachusetts, United States of America, Department of Medicine, Tufts University School of Medicine, Boston, Massachusetts, United States of America

  • Jasmine Paadam,

    Affiliations Department of Medicine, Baystate Medical Center, Springfield, Massachusetts, United States of America, Department of Medicine, Tufts University School of Medicine, Boston, Massachusetts, United States of America

  • Rohini Harvey,

    Affiliations Department of Medicine, Baystate Medical Center, Springfield, Massachusetts, United States of America, Department of Medicine, Tufts University School of Medicine, Boston, Massachusetts, United States of America

  • Satoko Igarashi,

    Affiliations Department of Medicine, Baystate Medical Center, Springfield, Massachusetts, United States of America, Department of Medicine, Tufts University School of Medicine, Boston, Massachusetts, United States of America

  • Christopher LaChance,

    Affiliations Department of Medicine, Baystate Medical Center, Springfield, Massachusetts, United States of America, Department of Medicine, Tufts University School of Medicine, Boston, Massachusetts, United States of America

  • Evan Benjamin,

    Affiliations Department of Medicine, Baystate Medical Center, Springfield, Massachusetts, United States of America, Department of Medicine, Tufts University School of Medicine, Boston, Massachusetts, United States of America, Center for Quality of Care Research, Baystate Medical Center, Springfield, Massachusetts, United States of America, Department of Healthcare Quality, Baystate Medical Center, Springfield, Massachusetts, United States of America

  • Tara Lagu

    Affiliations Department of Medicine, Baystate Medical Center, Springfield, Massachusetts, United States of America, Department of Medicine, Tufts University School of Medicine, Boston, Massachusetts, United States of America, Center for Quality of Care Research, Baystate Medical Center, Springfield, Massachusetts, United States of America, Baystate Health-University of Massachusetts Medical School, Springfield, Massachusetts, United States of America

Abstract

Background

Excellent communication is a necessary component of high-quality health care. We aimed to determine whether a training module could improve patients’ perceptions of physician communication behaviors, as measured by change over time in domains of patient experience scores related to physician communication.

Study design

We designed a comprehensive physician-training module focused on improving specific “etiquette-based” physician communication skills through standardized simulations and physician coaching with structured feedback. We employed a quasi-experimental pre-post design, with an intervention group consisting of internal medicine hospitalists and residents and a control group consisting of surgeons. The outcome was percent “always” scores for questions related to patients’ perceptions of physician communication using the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey and a Non-HCAHPS Physician-Specific Patient Experience Survey (NHPPES) administered to patients cared for by hospitalists.

Results

A total of 128 physicians participated in the simulation. We analyzed responses from 5020 patients using HCAHPS survey data and from 1990 patients using NHPPES survey data. The intercept shift, or the degree of change from pre-intervention percent “always” responses, for the HCAHPS questions about doctors “treating patients with courtesy,” “explaining things in a way patients could understand,” and “overall teamwork” showed no significant differences between surgical control and hospitalist intervention patients. Adjusted NHPPES percent excellent survey results increased significantly post-intervention for the questions about specified individual doctors “keeping patient informed” (adjusted intercept shift 9.9%, p = 0.019), “overall teamwork” (adjusted intercept shift 11%, p = 0.037), and “using words the patient could understand” (adjusted intercept shift 14.8%, p = 0.001).

Conclusion

A simulation-based physician communication coaching method focused on specific “etiquette-based” communication behaviors through a deliberate practice framework was not associated with significantly improved HCAHPS physician communication patient experience scores. Further research could reveal ways that this model affects patients’ perceptions of physician communication relating to specific physicians or behaviors.

Introduction

Patient experience is an important metric for measuring hospital performance. Since 2012, the Centers for Medicare and Medicaid Services (CMS) have tied patient experience to reimbursement through the Value Based Purchasing (VBP) program. In 2016, 1.75% of a hospital’s Diagnosis Related Group (DRG) base operating payment was at stake, and 30% of this payment was linked to a hospital’s patient experience scores.[1] Commercial payers are also linking patient experience outcomes to value-based payments, and many physician groups include patient experience as a metric for physicians’ variable compensation.[2] This increased scrutiny is appropriate, as multiple studies have shown a strong correlation between the quality of physician communication and the quality of clinical care.[3–8] Moreover, many problems with the effective delivery of health care can be attributed to ineffective communication between patient and provider.[9] Practically speaking, good physician communication will be a requirement for hospitals to sustain reimbursement at current levels. Yet there remains a deficit of evidence-based interventions that improve both patient perceptions of these communication skills and overall patient experience.[10–16]

Medical simulation is a validated method of teaching and improving clinical communication abilities.[16–25] A systematic review of clinician communication courses found that an effective approach combines didactic components with simulation and skilled feedback.[24] Programs based on models of experiential learning and deliberate practice (learning focused on repetitive performance of specified skills with specific feedback)[26] succeed in teaching communication skills and changing clinician behavior.[17,18,25,27,28]

Improving clinician communication skills is a prerequisite to improving hospital care. Previous studies show that only 10–32% of inpatients can correctly name their physicians, fewer (11%) can explain their physicians’ role in the care they are receiving, and few understand the key elements of their hospital plan of care.[12,29,30] Given these findings, a critical area for improving clinician communication is basic “etiquette-based medicine” behavior, such as knocking on a patient’s door, asking to enter the patient’s room, introducing oneself, sitting down in the patient’s room, and explaining one’s role in care. One cross-sectional observational study of such behaviors found that nearly one-third of physicians (30%) performed zero of the six “etiquette-based” behaviors and a majority (56%) did not explain their roles to patients, despite a positive association between performance of the behaviors and patient experience scores.[12]

Yet evidence is mixed as to whether clinician communication skills training has an effect on patient satisfaction,[13,14,31] with most interventions showing no effect.[10,13,15,16,32] We hypothesized that an evidence-based framework of clinical simulation in a deliberate practice model, targeted at teaching simple, discrete physician communication behaviors with structured feedback, would improve patients’ perceptions of physician communication and reported experience of care. We implemented such a model in a hospital medicine division at a single academic center and examined its impact on outcomes using two systematically administered patient experience surveys, focusing on domains relevant to physician communication. We compared trends in patient experience scores between medical patients and, where possible, control surgical patients (whose physicians did not undergo the intervention) during the same time period. The control group was included to account for hospital-wide patient experience initiatives occurring concurrently with our intervention period.

Methods

Setting

Our study was conducted at Baystate Medical Center (BMC), a 716-bed tertiary academic care center in Springfield, MA, between January 1, 2011 and December 31, 2013. The hospital has an employed hospitalist group with 50 attending physicians and midlevel hospitalists, 64 internal medicine residents, and 30 medicine-pediatrics residents. The group is separated into two distinct services: an academic service and a service without resident coverage. Both services care for all adult medicine inpatients and see similar medical patients regardless of insurer, primary care physician, admission diagnosis, or medical comorbidities. Our hospital’s surgical services are supplied both by attending surgical physicians alone and by attending-supervised surgical house staff, with approximately 75 attending surgeons and 34 surgical residents. Baystate Health’s Institutional Review Board approved this study with a waiver of written consent.

Study design and intervention

We analyzed two different samples. CMS measures patient experience using a validated tool called the HCAHPS survey.[33] Our primary analysis examined the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) physician communication domain, which assesses patients’ perceptions of the collective communication abilities of all of their hospital physicians (e.g., hospitalists, consultants, and trainees) during the admission. For this analysis, we employed a quasi-experimental design with an intervention group consisting of internists, including hospitalists and residents, and a control group consisting of surgeons. Since HCAHPS data assess patient perceptions of physician communication not specific to any one clinician, we hypothesized they might not be sensitive enough to detect the effect of a communication intervention focused only on a patient’s hospitalist physician. Therefore, we completed a secondary analysis with a pre-post design utilizing Non-HCAHPS Physician-Specific Patient Experience Survey (NHPPES) data that assessed patients’ perceptions of their specific hospitalist physician’s communication (this survey identifies the patient’s individual physician by name; see Survey Questions in S1 Appendix). We were unable to formulate a control group for this secondary sample, as only patients cared for by hospitalists complete the survey.

The intervention was a 45-minute comprehensive physician-training module based on deliberate practice and methods proven effective for teaching clinician communication skills. The module focused on improving specific “etiquette-based” physician communication skills through simulation and physician coaching with structured feedback. It consisted of a 10-minute didactic lecture highlighting current hospital HCAHPS results, the rationale behind and importance of performing specific “etiquette-based”[12] basic communication techniques based on the Studer Group’s AIDET® mnemonic,[34] and a structured critique of the performance of these skills using a pre-recorded simulated patient encounter. The structured AIDET-based communication skills specifically targeted were: Acknowledgement (greeting the patient by name, making eye contact), Introductions (introducing oneself by name and clinical role), Duration (giving accurate time expectations for tests and care), Explanations (deliberately explaining what to expect next and the patient’s plan of care, and answering questions), and Thank you (appropriately closing the clinical encounter),[34] as well as other non-verbal communication behaviors (body language, facial expressions). Next, individual physicians participated in case-based simulated encounters (approximately 35 minutes in length) focused on the structured communication skills and non-verbal communication behaviors. Prior to participation, each physician received a standardized case scenario (see S2 Appendix) prospectively highlighting the specific skills being assessed, varied in complexity according to level of training (resident versus attending). The case scenarios broke the simulated patient encounter into three distinct parts: The Welcome, The Care Plan, and The Goodbye. Specific skill-based feedback was provided after each part. We undertook a “train-the-trainer” approach in which the physician communication champions and the simulated patients underwent a four-hour pre-training of didactic education combined with case-based standardized simulation and feedback, covering the specific skills of etiquette-based clinician communication and how to provide effective feedback. Personalized feedback on each patient encounter was given by both the physician communication champion and the simulated patient after each section of the clinical encounter. A standardized assessment tool (S3 Appendix) was used to formulate the focused, structured, personalized, and constructive evaluation and feedback that specifically targeted physician communication etiquette and non-verbal physician behaviors. Of note, other ongoing quality improvement strategies targeted at physicians, including sharing of HCAHPS scores and patient experience email newsletters, continued both before and after the study period.

Outcome, survey instruments and data collection

The outcome was patient experience scores, collected from adult inpatients admitted before, during, and after the intervention period. We collected HCAHPS survey data on both medical and surgical inpatients.

During the study period, Baystate Health used a third-party vendor, Professional Research Consultants Inc. (PRC), to administer both the HCAHPS and the NHPPES patient satisfaction telephone surveys to a random sample of discharged adult inpatients. Approximately 50 HCAHPS surveys per quarter per hospital floor were conducted, and 20 NHPPES surveys per hospitalist per year were conducted among patients not selected for the HCAHPS survey. The NHPPES question design included the name of the specific discharging physician who cared for the patient being surveyed and addressed specifics of that physician’s care. We limited our analysis to domains reflecting satisfaction with physician communication or those potentially influenced by improved physician communication behaviors.

Survey data were collected for hospital quality and reporting purposes independent of our study. Because the NHPPES survey was administered only to adult medicine inpatients, we collected only HCAHPS data for the control group. Depending on question type, survey responses were scored as never, sometimes, usually, or always (HCAHPS), or as excellent, very good, good, fair, or poor (NHPPES). (S1 Appendix)
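Each item was later reduced to a “top box” indicator before modeling (see Data analysis below). As a minimal sketch in Stata, the package used for the analysis, with all variable names assumed for illustration:

    * Dichotomize each item into its "top box" (hypothetical variable names).
    * HCAHPS items coded 1 = never ... 4 = always; top box is "always".
    gen byte always_courtesy = (courtesy == 4) if !missing(courtesy)
    * NHPPES items coded 1 = poor ... 5 = excellent; top box is "excellent".
    gen byte excellent_informed = (informed == 5) if !missing(informed)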

Additional information on respondents was extracted from the hospital’s billing database using medical account numbers and included age, gender, admission year, education level, language, illness severity (the Diagnosis-Related Group severity score), emergency room (ER) admission status, and attending physician type (hospitalist or surgeon). It was not possible to distinguish whether patients were cared for by house staff under an attending physician (medical orders primarily written by the house staff, with the patient rounded on by the entire medical team of house staff and attending) or solely by an attending physician who writes all medical orders and is introduced as the “primary physician.”
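A minimal sketch of this linkage step in Stata, with hypothetical file and variable names (the actual billing extract and keys are not described beyond the medical account number):

    * Join survey responses to billing covariates on the account number.
    use survey_responses, clear
    merge 1:1 account_number using billing_covariates, keep(master match) nogenerate
    * keep(master match) retains all survey respondents, matched or not.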

Data analysis

Respondent characteristics were summarized across study groups and time periods. We summarized continuous variables using means and standard deviations, and categorical variables using frequencies and percentages. We evaluated differences across groups based on observed differences and a theoretical basis for confounding (association with both the exposure and the outcome). Survey responses were dichotomized into percent excellent (or percent always) and analyzed using a piecewise logistic regression model. All models used a clustered sandwich estimator, clustering on billing physician to relax the assumption of independence of observations.[35] We used piecewise models to estimate a slope and intercept before and after the intervention period, with results presented graphically along with estimates of the average marginal effects (estimated percentages). For the HCAHPS analysis, we used significance testing to evaluate the pre-to-post intercept-shift difference-in-difference (the interaction term between group and time period). For the NHPPES analysis, we conducted pre-to-post significance testing using the intercept shift only. For the primary HCAHPS analysis, initial power calculations suggested that a 0% to 1% pre-to-post change in the surgery group would give us approximately 80% power to detect a 0.55% to 1.25% difference-in-difference with a two-sided alpha of 0.05. Significance testing was intended to be exploratory in nature; therefore, no adjustments for multiple comparisons were made. Multivariable models for both outcomes evaluated age, gender, race, marital status, illness severity, preferred language, and admission through the emergency department. The HCAHPS analysis also evaluated education level, overall health, and discharge disposition. Models were simplified to contain variables significant at the 0.05 level, with Wald tests conducted between full and reduced models. Variables excluded in this process were entered back into the model one at a time, and variables that caused approximately a 10% change in the coefficient of interest were retained in the final model. Statistical analysis was conducted using Stata v13.1 (StataCorp LP, College Station, TX).
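For concreteness, a minimal sketch of the piecewise difference-in-difference model described above, written in Stata; every variable name here is an assumption for illustration, not taken from the study data:

    * month      = survey month, centered at the intervention start
    * post       = 1 if surveyed after the intervention, 0 otherwise
    * month_post = month*post, allowing a separate slope post-intervention
    * hosp       = 1 for hospitalist (intervention) patients, 0 for surgical
    * md_id      = billing physician identifier (cluster for robust SEs)
    gen month_post = month * post
    logit always_courtesy i.hosp##i.post c.month c.month_post ///
        i.hosp#c.month i.hosp#c.month_post, vce(cluster md_id)
    * i.hosp#i.post is the pre-to-post intercept-shift difference-in-difference
    * tested in the HCAHPS analysis; the NHPPES models drop the hosp terms.
    margins hosp#post    // average marginal percent "always" by group and period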

Results

Of 50 hospitalists, 42 (84%) participated in the physician communication coaching simulation. Of 94 resident physicians, 86 (90%) participated. Hospitalists were 47% female and 53% male, and nearly half (48%) had 0–3 years of attending experience; only 16% had more than 10 years of experience. The surgical attending clinicians were 16% female and 84% male; we did not have information on years of experience for surgical attendings. The survey participation rate for medicine and surgical patients in the HCAHPS cohort was 31% and 46% pre-intervention and 30% and 39% post-intervention, respectively. The NHPPES physician-specific survey participation rate was 22% in both the pre- and post-intervention periods.

The HCAHPS sample included 5020 surveyed patients (Table 1). The pre-intervention cohort (3720 patients) was surveyed between January 2011 and April 2013, and the post-intervention cohort (1300 patients) was surveyed between June 2013 and January 2014. Of these patients, 33.4% were in the surgical control cohort and 66.7% in the hospitalist cohort. Hospitalist and surgical HCAHPS-surveyed patients had similar baseline characteristics. On average, hospitalist patients were slightly older (62.5 years versus 58.5 years), with a slightly lower proportion male (46.9% versus 53.2%), a lower proportion white (82.8% versus 89.1%), a lower proportion with a college degree (16.4% versus 26.3%), and a lower proportion speaking English at home (85% versus 93%). Additionally, hospitalist patients were admitted through the ED more often (87.9% versus 27.5%) and had greater disease burden (8.8% versus 4.1% with a severity of illness score of 4), but were equally likely to be discharged home or home with services. The HCAHPS hospitalist pre- and post-intervention cohorts were similar in all baseline characteristics.

Table 1. HCAHPS patient characteristics by provider group over time period.

https://doi.org/10.1371/journal.pone.0180294.t001

Pre- and post-intervention patients in the NHPPES survey cohort were similar (Table 2), although post-intervention patients were slightly younger (58.7 versus 63 years), slightly more often male (53.4% versus 48.4%), less severely ill (37.8% versus 47.6% with a severity of illness score of 3 or 4), and more likely to have the interview conducted in Spanish (7.7% versus 2.9%).

After the intervention, the intercept shift for percent “always” (the degree of change from pre-intervention percent “always” responses) for the HCAHPS questions of “treating patients with courtesy,” “explaining things in a way patients could understand,” and “overall teamwork” showed no significant differences between surgical control and hospitalist intervention patients, either before or after adjustment (adjusted p = 0.899, p = 0.890, and p = 0.438, respectively) (Table 3).

Table 3. HCAHPS intercept shift by provider group over time period.

https://doi.org/10.1371/journal.pone.0180294.t003

NHPPES adjusted and unadjusted percent excellent responses increased significantly for the questions of “keeping patient informed” (adjusted intercept shift 9.9%, p = 0.019), “overall teamwork” (adjusted intercept shift 11%, p = 0.037), and “using words the patient could understand” (adjusted intercept shift 14.8%, p = 0.001) (Table 4). Post-intervention percent excellent responses also increased for the questions addressing “physicians’ explanations of treatments” (adjusted intercept shift 6.9%, p = 0.210) and “treating patients with courtesy” (adjusted intercept shift 3.3%, p = 0.453), but these changes were not statistically significant. Post-intervention percent excellent responses for all questions analyzed trended back toward pre-intervention levels over time (Fig 1).

Table 4. Non-HCAHPS physician specific patient experience survey (NHPPES) intercept shift by time period.

https://doi.org/10.1371/journal.pone.0180294.t004

Fig 1. HCAHPS trends in “Percent Always” and NHPPES trends in “Percent Excellent”.

https://doi.org/10.1371/journal.pone.0180294.g001

Discussion

A simulation-based coaching method for hospitalist and internal medicine resident physician communication, focused on specific “etiquette-based” communication behaviors through a deliberate practice model with structured feedback, was not associated with significantly improved HCAHPS physician communication patient experience scores.

The lack of significant improvement on HCAHPS scores is consistent with results from prior physician communication training programs, most of which showed no effect.[10,13,15,16,32] O’Leary et al. (2013) implemented a three-session physician communication skills training program based on AIDET principles for 61 hospitalist physicians in an academic medical center but found no significant improvements in HCAHPS doctor communication domains.[11] However, the O’Leary study also notably showed no significant pre-post improvement in non-HCAHPS (Press Ganey) physician communication questions, while our intervention showed significant improvement in some physician communication questions from a similar non-HCAHPS (NHPPES) patient experience tool. Similarly, a systematic review of physician communication trainings focused on shared decision making found that only 40% of the included studies showed improved patient satisfaction when providers were trained on communication skills.[13]

Why, then, did we see no significant changes in the physician communication domains of HCAHPS scores compared with surgical controls, while our secondary analysis did reveal improvement in similar domains? There are several possible explanations for these findings. First, HCAHPS scores were designed to assess and evaluate patient satisfaction performance in aggregate, at the hospital or system level.[33,36] Although often erroneously used to describe the performance of individual physicians or groups of physicians, our results may illustrate that the HCAHPS survey was not designed as an evaluation tool for individual provider patient satisfaction performance.[37] In fact, in an analysis of 420 patients admitted to a hospital medicine service, the discharging hospitalist accounted for only 34% of all physician encounters.[38] Notably, despite other physicians accounting for the majority of patient care encounters, most performance improvement analyses would attribute HCAHPS outcome data to the discharging hospitalist. Research also shows that specialist physicians strongly influence patients’ overall perceptions of physicians.[39] We also do not know whether, for example, hospitalist communication improved while trainees did not similarly benefit. These factors make it difficult to accurately assess the association between HCAHPS physician communication outcomes and individual physician communication behavior or physician communication interventions. A second possible explanation for the difference between the HCAHPS and NHPPES results is that the HCAHPS analysis examined differences over time compared with a control group, while the secondary analysis used a weaker, pre-post design (because we lacked an adequate control group). Therefore, while our secondary analysis of NHPPES data suggests that our intervention may have had an effect, we can only conclude that further controlled studies assessing our intervention using a physician-specific patient experience survey tool (not HCAHPS) are needed.

Interestingly, despite the limitations of using HCAHPS to assess the effectiveness of local-level QI activities, a small, single-center, non-randomized IM resident education program focused on improving physician-related HCAHPS scores did show significant improvements. That intervention linked resident didactic education to individualized patient experience score feedback, monthly recognitions, and tangible incentives.[31] Similarly, our secondary analysis showed improvement in patient-reported “excellence” post-intervention, with 10% to 15% increases in percent “excellent” for questions related to keeping patients informed, teamwork, and using understandable language. However, the “bump” observed was short-lived, suggesting that physician communication reverts to its prior state after a single interventional period. Unsurprisingly, this suggests that personalized physician communication coaching may have immediate effects that diminish without deliberate reinforcement. This conclusion is supported by Banka et al.’s (2015) study, which found significant improvement in HCAHPS scores during a physician communication intervention and attributed the improvements to ongoing reinforcement activities such as patient satisfaction score feedback, monthly recognition, and incentives for high patient satisfaction scores.[31] One possible explanation for this finding is regression toward the mean, a common phenomenon in improvement work requiring behavioral change, which occurs if concerted efforts are not made to maintain the educational gain. This leads us to conclude that no single intervention is enough to maintain behavioral improvements over time. Instead, continual reinforcement, perhaps through ongoing simulation training and through revisiting the specified skills in meetings and emails, is likely critical to maintaining meaningful improvements.[40] However, our intervention represents a typical “real-world” operational quality improvement project in which an initial strategy is rolled out but, often due to resource limitations, a similar maintenance strategy is not undertaken. Our study highlights for health systems the critical nature of the “maintenance” phase when determining operational strategies aimed at behavior change.

We believe our intervention, while offering a model of physician communication education and coaching based on evidence-based learning methods, may have lacked critical change management techniques of real-time data feedback, tangible behavioral incentives, and reinforcement of the specified communication skills. Unsurprisingly, systematic review shows that multi-faceted interventions appear to be most effective in quality improvement.[13] However, deciphering which element(s) contribute most to the effect is difficult, and understanding which elements are key in a multi-faceted intervention is critical when adapting successful approaches to health systems with different goals.[13] Therefore, we designed our study to examine the effects of a single intervention. Unfortunately, a messier “real-world” bundled approach (i.e., a “physician communication engagement bundle”) of tactics may be necessary to see significant improvement in HCAHPS scores, even if it becomes difficult to decipher which element had the greatest return on investment. Interestingly, more is not always better: a systematic review of 43 randomized study interventions on physician communication trainings focused on shared decision making found that short-term training (less than 10 hours) is equivalent to longer training.[13] This suggests that some repetition is necessary but that much more may not be needed to enhance physician communication skills.

Educational interventions on physician communication show limited effect on patient experience.[10,13–16,31,32] However, studies do show that the performance of basic, structured communication behaviors is associated with patient experience outcomes.[12,41] Our results, combined with prior research, still do not fully quantify the utility of structured/scripted communication methodologies as a means to improve patient-reported satisfaction with physician communication.[11,41–45] Evidence of scripted communication education improving patient satisfaction remains scarce, with most studies being very limited in scope and methodology.[41–45] However, these basic yet critical skills of “etiquette-based” physician-patient communication are rarely performed,[12] and their absence can lead to serious communication gaps between physicians and their patients.[30]

One small pilot study of 246 emergency room encounters by medical students found that the students infrequently (0.4% of encounters) used all targeted communication elements (the AIDET framework) but that the use of certain elements (acknowledging a patient by name, explaining that other providers would see the patient) was associated with an increase in patient satisfaction.[41] However, while the effect of scripted patient communication tools on patient experience remains of indeterminate significance, there is ample evidence supporting the educational utility of our intervention’s learning framework. The use of simulation to teach clinical communication skills is well supported in the literature,[16–25] as is the utility of teaching communication skills using a model of deliberate practice with structured feedback.[17,18,25,27] We believe our intervention’s communication skills education, based on the structured AIDET mnemonic combined with focused feedback, mirrors a model of deliberate practice. The structure of the education is supported by a meta-analysis finding that simulation-based medical education with deliberate practice is superior to traditional clinical medical education in achieving specific clinical skill acquisition goals.[46] Likewise, focusing the educational framework on structured “etiquette-based” clinical communication skills is warranted, since these behaviors are associated with higher patient experience scores.[12,41] Further research is necessary to fully evaluate the effects of this educational framework (utilizing both simulation and deliberate practice) on patient-reported outcomes. For example, qualitative studies could examine the aspects of communication behaviors that physicians perceived were changed by the training, and controlled studies could assess our model’s learning framework using a physician-specific patient experience measure. Furthermore, while much research examines the effects that improved “complex” clinician communication skills (such as goals-of-care discussions) have on patient experience,[16,20,47] researchers should continue to test the effects on patient experience of the most basic “etiquette-based” elements of physician communication behavior.

Our study has several limitations. We conducted the intervention at a single academic hospital, limiting its generalizability; however, our secondary findings suggest that similar (but repeated) interventions should be studied at more sites, which would allow for control and intervention sites. Because the surveys were administered by telephone, most patients who completed them had been discharged home. Similarly, we were unable to assess patient-level differences between survey respondents and non-respondents, and we could not determine the number of surveys administered to the patients of each physician. These factors limit the generalizability of the study results, although such limitations are prevalent in most studies utilizing hospital patient satisfaction survey results. The NHPPES survey was administered only to hospitalist patients, and we did not have a comparison group for these patients; this poses an avenue ripe for further study. Our study was observational and thus subject to selection bias and confounding. We controlled for identifiable confounders such as illness severity, education, and health status, but it is possible that additional unidentified factors affected our results. Additionally, we were unable to control for the fact that non-participating hospital medicine physicians provided some level of hospital care for some of the surveyed patients, which may have attenuated our intervention’s effect size. We are limited by our use of post-hospitalization surveys, which tend to have a low response rate. Our response rate of 31% for HCAHPS patients is similar to the 2013 HCAHPS national average of 33%.[48] However, the NHPPES response rate appeared slightly lower at 22%, which could theoretically introduce bias to our results. Finally, we did not conduct qualitative interviews or use other methods to determine how the intervention changed physician behavior; we plan to assess this in future studies.

Our research shows the limited utility of a simulation-based, deliberate practice physician communication coaching method with structured feedback for improving physician-related HCAHPS scores. It also highlights the limitations of using HCAHPS to quantify the effects of physician-level patient experience improvement strategies. The secondary analysis results demonstrate potential merit for the educational method focused on “etiquette-based” communication behaviors. Further study of this educational method utilizing controls and a period of behavioral reinforcement tactics (a physician communication bundle) needs to be undertaken. Additional research needs to identify validated tools to assess patient perceptions of physician communication as they relate to local physician-level quality improvement activities.[37] Without easily executed and validated tools to assess patient experience outcomes at a more granular level, we will never be able to design, implement, achieve, and reinforce meaningful change in this arena. Finally, our results suggest that a broader “bundled approach” of multi-faceted physician communication improvement tactics with maintenance strategies may yield more consistent and long-lasting effects.

Supporting information

S1 Appendix. Patient Experience Analysis Questions.

https://doi.org/10.1371/journal.pone.0180294.s001

(DOCX)

S2 Appendix. Observed Structured Clinical Encounter Scenarios.

https://doi.org/10.1371/journal.pone.0180294.s002

(DOCX)

S3 Appendix. Physician Communication Skills Coaching and Assessment Tool.

https://doi.org/10.1371/journal.pone.0180294.s003

(DOC)

Acknowledgments

We would like to acknowledge the following for their contributions to the data analysis: Alice Ehresman, RN, MBA; Michael Ehresman, MBA; and Richard Brzostek.

We would like to acknowledge the following for their contribution to the study design: John Brooling, MD, Behdad Besharatian, MD, Eimer Kitt, MD, Prasanna Durariraj, MD, Dev Basu, MD, Wei Boon Ooi, MD

Author Contributions

  1. Conceptualization: AS TL AK.
  2. Data curation: AK AS TL.
  3. Formal analysis: AK.
  4. Funding acquisition: TL AS.
  5. Investigation: AS AK RS CB CL JP RH SI EB.
  6. Methodology: AS AK RS CB CL JP RH SI EB TL.
  7. Project administration: AS EB TL.
  8. Resources: AS EB.
  9. Software: AS AK EB TL.
  10. Supervision: AS EB TL.
  11. Visualization: AK.
  12. Writing – original draft: AS.
  13. Writing – review & editing: AS AK RS CB CL JP RH SI EB TL.

References

  1. FY-2013-Program-Frequently-Asked-Questions-about-Hospital-VBP-3-9-12.pdf [Internet]. http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/hospital-value-based-purchasing/Downloads/FY-2013-Program-Frequently-Asked-Questions-about-Hospital-VBP-3-9-12.pdf
  2. Lindenauer PK, Lagu T, Ross JS, Pekow PS, Shatz A, Hannon N, et al. Attitudes of hospital leaders toward publicly reported measures of health care quality. JAMA Intern Med. 2014;174: 1904–1911. pmid:25286316
  3. Doyle C, Lennox L, Bell D. A systematic review of evidence on the links between patient experience and clinical safety and effectiveness. BMJ Open. 2013;3. pmid:23293244
  4. Sequist TD, Glahn TV, Li A, Rogers WH, Safran DG. Measuring chronic care delivery: patient experiences and clinical performance. Int J Qual Health Care. 2012;24: 206–213. pmid:22490300
  5. Sequist TD, Schneider EC, Anastario M, Odigie EG, Marshall R, Rogers WH, et al. Quality monitoring of physicians: linking patients’ experiences of care to clinical quality and outcomes. J Gen Intern Med. 2008;23: 1784–1790. pmid:18752026
  6. Jha AK, Orav EJ, Zheng J, Epstein AM. Patients’ perception of hospital care in the United States. N Engl J Med. 2008;359: 1921–1931. pmid:18971493
  7. Jackson CA, Clatworthy J, Robinson A, Horne R. Factors associated with non-adherence to oral medication for inflammatory bowel disease: a systematic review. Am J Gastroenterol. 2010;105: 525–539. pmid:19997092
  8. Isaac T, Zaslavsky AM, Cleary PD, Landon BE. The relationship between patients’ perception of care and measures of hospital quality and safety. Health Serv Res. 2010;45: 1024–1040. pmid:20528990
  9. Teutsch C. Patient-doctor communication. Med Clin North Am. 2003;87: 1115–1145. pmid:14621334
  10. Brown JB, Boles M, Mullooly JP, Levinson W. Effect of clinician communication skills training on patient satisfaction. A randomized, controlled trial. Ann Intern Med. 1999;131: 822–829. pmid:10610626
  11. O’Leary KJ, Darling TA, Rauworth J, Williams MV. Impact of hospitalist communication-skills training on patient-satisfaction scores. J Hosp Med. 2013;8: 315–320. pmid:23554016
  12. Tackett S, Tad-y D, Rios R, Kisuule F, Wright S. Appraising the practice of etiquette-based medicine in the inpatient setting. J Gen Intern Med. 2013;28: 908–913. pmid:23423452
  13. Dwamena F, Holmes-Rovner M, Gaulden CM, Jorgenson S, Sadigh G, Sikorskii A, et al. Interventions for providers to promote a patient-centred approach in clinical consultations. Cochrane Database of Systematic Reviews. John Wiley & Sons, Ltd; 2012. http://onlinelibrary.wiley.com/doi/10.1002/14651858.CD003267.pub2/abstract
  14. Raper SE, Gupta M, Okusanya O, Morris JB. Improving communication skills: a course for academic medical center surgery residents and faculty. J Surg Educ. 72: e202–e211. pmid:26183787
  15. Zabar S, Hanley K, Stevens DL, Ciotoli C, Hsieh A, Griesser C, et al. Can interactive skills-based seminars with standardized patients enhance clinicians’ prevention skills? Measuring the impact of a CME program. Patient Educ Couns. 2010;80: 248–252. pmid:20053518
  16. Curtis JR, Back AL, Ford DW, Downey L, Shannon SE, Doorenbos AZ, et al. Effect of communication skills training for residents and nurse practitioners on quality of communication with patients with serious illness: a randomized trial. JAMA. 2013;310: 2271–2281. pmid:24302090
  17. Gelfman LP, Lindenberger E, Fernandez H, Goldberg GR, Lim BB, Litrivis E, et al. The effectiveness of the Geritalk communication skills course: a real-time assessment of skill acquisition and deliberate practice. J Pain Symptom Manage. 2014;48: 738–744.e6. pmid:24681183
  18. Hawkins RE, Margolis MJ, Durning SJ, Norcini JJ. Constructing a validity argument for the mini-Clinical Evaluation Exercise: a review of the research. Acad Med J Assoc Am Med Coll. 2010;85: 1453–1461. pmid:20736673
  19. Okuda Y, Bryson EO, DeMaria S, Jacobson L, Quinones J, Shen B, et al. The utility of simulation in medical education: what is the evidence? Mt Sinai J Med N Y. 2009;76: 330–343. pmid:19642147
  20. Jackson VA, Back AL. Teaching communication skills using role-play: an experience-based guide for educators. J Palliat Med. 2011;14: 775–780. pmid:21651366
  21. Maguire P, Fairbairn S, Fletcher C. Consultation skills of young doctors: I—Benefits of feedback training in interviewing as students persist. Br Med J Clin Res Ed. 1986;292: 1573–1576. pmid:3719282
  22. Fallowfield L, Lipkin M, Hall A. Teaching senior oncologists communication skills: results from phase I of a comprehensive longitudinal program in the United Kingdom. J Clin Oncol Off J Am Soc Clin Oncol. 1998;16: 1961–1968. pmid:9586916
  23. Back AL, Arnold RM, Baile WF, Tulsky JA, Barley GE, Pea RD, et al. Faculty development to change the paradigm of communication skills teaching in oncology. J Clin Oncol Off J Am Soc Clin Oncol. 2009;27: 1137–1141. pmid:19171703
  24. Gysels M, Richardson A, Higginson IJ. Communication training for health professionals who care for patients with cancer: a systematic review of training methods. Support Care Cancer. 2005;13: 356–366. pmid:15586302
  25. Smith S, Hanson JL, Tewksbury LR, Christy C, Talib NJ, Harris MA, et al. Teaching patient communication skills to medical students: a review of randomized controlled trials. Eval Health Prof. 2007;30. pmid:17293605
  26. Duvivier RJ, van Dalen J, Muijtjens AM, Moulaert VR, van der Vleuten CP, Scherpbier AJ. The role of deliberate practice in the acquisition of clinical skills. BMC Med Educ. 2011;11: 101. pmid:22141427
  27. Aspegren K. BEME Guide No. 2: Teaching and learning communication skills in medicine—a review with quality grading of articles. Med Teach. 1999;21: 563–570. pmid:21281175
  28. Back AL, Arnold RM, Baile WF, Fryer-Edwards KA, Alexander SC, Barley GE, et al. Efficacy of communication skills training for giving bad news and discussing transitions to palliative care. Arch Intern Med. 2007;167: 453–460. pmid:17353492
  29. Arora V, Gangireddy S, Mehrotra A, Ginde R, Tormey M, Meltzer D. Ability of hospitalized patients to identify their in-hospital physicians. Arch Intern Med. 2009;169: 199–201. pmid:19171817
  30. O’Leary KJ, Kulkarni N, Landler MP, Jeon J, Hahn KJ, Englert KM, et al. Hospitalized patients’ understanding of their plan of care. Mayo Clin Proc. 2010;85: 47–52. pmid:20042561
  31. Banka G, Edgington S, Kyulo N, Padilla T, Mosley V, Afsarmanesh N, et al. Improving patient satisfaction through physician education, feedback, and incentives. J Hosp Med. 2015;10: 497–502. pmid:26014339
  32. Moore PM, Rivera Mercado S, Grez Artigues M, Lawrie TA. Communication skills training for healthcare professionals working with people who have cancer. Cochrane Database Syst Rev. 2013; CD003751. pmid:23543521
  33. Goldstein E, Farquhar M, Crofton C, Darby C, Garfinkel S. Measuring hospital care from the patients’ perspective: an overview of the CAHPS Hospital Survey development process. Health Serv Res. 2005;40: 1977–1995. pmid:16316434
  34. Studer Group. Acknowledge, introduce, duration, explanation, and thank you [Internet]. http://www.studergroup.com/aidet
  35. Hosmer DW, Lemeshow S. Applied logistic regression. John Wiley and Sons; 2000.
  36. hcahpsbrief.pdf [Internet]. http://essentialhospitals.org/wp-content/uploads/2013/12/hcahpsbrief.pdf
  37. Torok H, Ghazarian SR, Kotwal S, Landis R, Wright S, Howell E. Development and validation of the tool to assess inpatient satisfaction with care from hospitalists. J Hosp Med. 2014;9: 553–558. pmid:24888242
  38. O’Leary KJ, Cyrus RM. Improving patient satisfaction: timely feedback to specific physicians is essential for success. J Hosp Med. 2015;10: 555–556. pmid:26212220
  39. Wild DMG, Kwon N, Dutta S, Tessier-Sherman B, Woddor N, Sipsma HL, et al. Who’s behind an HCAHPS score? Jt Comm J Qual Patient Saf Jt Comm Resour. 2011;37: 461–468.
  40. Baker R, Camosso-Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, et al. Tailored interventions to overcome identified barriers to change: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2010; CD005470. pmid:20238340
  41. Turner JS, Pettit KE, Buente BB, Humbert AJ, Perkins AJ, Kline JA. Medical student use of communication elements and association with patient satisfaction: a prospective observational pilot study. BMC Med Educ. 2016;16: 150. pmid:27209065
  42. Scott J. Utilizing AIDET and other tools to increase patient satisfaction scores. Radiol Manage. 2012;34: 29–33-35.
  43. Yoder E. Moving customer service scores with scripting and managing up techniques. Radiol Manage. 2007;29.
  44. Handel DA, Fu R, Daya M, York J, Larson E, John McConnell K. The use of scripting at triage and its impact on elopements. Acad Emerg Med Off J Soc Acad Emerg Med. 2010;17: 495–500. pmid:20536803
  45. Boehm FH. Teaching bedside manners to medical students. Acad Med J Assoc Am Med Coll. 2008;83: 534. pmid:18520454
  46. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med J Assoc Am Med Coll. 2011;86: 706–711. pmid:21512370
  47. Gysels M, Richardson A, Higginson IJ. Communication training for health professionals who care for patients with cancer: a systematic review of training methods. Support Care Cancer Off J Multinatl Assoc Support Care Cancer. 2005;13: 356–366. pmid:15586302
  48. Dec_13_Jan_14_PublicReport_Apr_12_Mar_13_discharges_states.pdf [Internet]. http://hcahpsonline.com/files/Dec_13_Jan_14_PublicReport_Apr_12_Mar_13_discharges_states.pdf