
Decentralizing evidence-based decision-making in resource limited setting: A case of SNNP region, Ethiopia

  • Misganu Endriyas ,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Supervision, Visualization, Writing – original draft, Writing – review & editing

    misganuendrias@yahoo.com

    Affiliation SNNPR Health Bureau, Hawassa, Ethiopia

  • Abraham Alano,

    Roles Conceptualization, Funding acquisition, Methodology, Writing – original draft

    Affiliation SNNPR Health Bureau, Hawassa, Ethiopia

  • Emebet Mekonnen,

    Roles Conceptualization, Funding acquisition, Project administration

    Affiliation SNNPR Health Bureau, Hawassa, Ethiopia

  • Aknaw Kawza,

    Roles Conceptualization, Funding acquisition, Writing – original draft

    Affiliation SNNPR Health Bureau, Hawassa, Ethiopia

  • Fisha Lemango

    Roles Conceptualization, Methodology, Validation, Writing – original draft

    Affiliation SNNPR Health Bureau, Hawassa, Ethiopia

Abstract

Background

Access to and use of accurate, valid, reliable, timely, relevant, legible and complete information is vital for safe and reliable healthcare. Although the study area has been implementing a standardized Health Management Information System (HMIS), information was lacking on how well health structures were utilizing that information, and this study was designed to assess HMIS data utilization.

Methods

A facility-based retrospective study was conducted in the Southern Nations Nationalities and People's Region (SNNPR) in April 2017. We included data from 163 sample facilities. Data use was evaluated by reviewing eight items from the performance monitoring system, covering activities from problem identification to the monitoring of proposed action plans. Each item reviewed was recoded as yes or no, and the items were summed to judge overall performance.

Results

About half (52%) of woredas, 26.2% of health centers (HCs), 25% of hospitals and 6.2% of health posts (HPs) reviewed their performance monthly, but only 20% of woredas, 6.2% of HCs, 1.5% of HPs and no hospital prepared action plans after reviewing performance. A summary of the eight items assessed showed that the majority of facilities (87.5% of hospitals, 81.5% of HPs and 70.8% of HCs) were poor in data utilization.

Conclusions

Only about half of woredas and fewer than one-fifth of health facilities were utilizing HMIS data, so much remains to be done to catch up with the country's information revolution plan. Lower levels of the health care system should be supported in evidence-based decision-making, and progress should be monitored routinely, quantitatively and/or qualitatively.

Background

A health management information system (HMIS) is a process of recording, storing, retrieving and processing health data for decision-making [1–4]. Access to and use of accurate, valid, reliable, timely, relevant, legible and complete information is vital for safe and reliable healthcare [5]. Improving the HMIS helps decision-makers to detect and control health problems, monitor and evaluate performance, and promote equity, empowering individuals and communities with health-related information and strengthening the evidence base for effective health policies [4, 6–12].

Globally, significant human and financial resources have been invested in the collection of data from different sources, including facility- and community-based sources, and knowledge and understanding of its role have improved. However, this information is often not used by key stakeholders, and as a result many health programs fail to fully connect evidence to the decisions needed to respond to priority needs [13]; data use is still very weak in most low- and middle-income countries (LMICs) [14].

The Paris Declaration and initiatives such as the Health Metrics Network, the Millennium Development Goals, and the Sustainable Development Goals have prompted LMICs to make the development of well-performing HMIS a high priority. WHO and the MEASURE Evaluation project have supported many LMICs in Africa and Asia, and improvements in HMIS have been documented [14].

Ethiopia has been implementing HMIS at all levels of the health service delivery system to ensure information use for evidence-based health planning and decision-making [15]. Ethiopia also carried out an HMIS reform focusing on rationalizing and standardizing indicators, data collection and reporting forms and procedures, and institutionalizing HMIS data quality assurance and information use mechanisms [16]. In addition, the information revolution is one of the four transformation agendas in the current health sector transformation plan, which focuses on quality and equitable distribution of health service delivery for all [17].

In the Southern Nations Nationalities and People's Region (SNNPR), the regional health bureau has been implementing an electronic health management information system (eHMIS) and electronic integrated disease surveillance reporting (eIDSR) in collaboration with partners working in the area of HMIS. As part of data quality assurance, performance review teams are organized from the health bureau down to health facilities; they regularly review performance, and there is a structure for reporting to higher levels and giving feedback to lower structures. However, designing and implementing an information system does not by itself guarantee information utilization [18]. There was therefore a need for information on how well these structures were utilizing information, and this study was designed to assess HMIS data utilization in SNNPR, Ethiopia.

Methods

A facility-based retrospective document review of the HMIS performance monitoring system was conducted in SNNPR, Ethiopia, in 2017. Administratively, the region was divided into 14 zones, one city administration and four special woredas [19]. In the region, zones are divided into woredas (districts). A special woreda is an administrative structure equivalent to a district but not contained in a zone and directly accountable to the region. The city administration is the capital of the region and is divided into sub-cities. All administrative structures are finally divided into "kebeles", the smallest administrative units. So the administrative hierarchy from higher to lower is region, zone, woreda and kebele, or region, special woreda and kebele; the city administration is structured as region, city administration, sub-city and kebele.

In 2017, the region had 57 hospitals of all types (specialized, general and primary), 736 health centers and 3,865 community-level health posts. In Ethiopia, a health post is a community-level (kebele-level) health service delivery point providing a set of mainly preventive and community services. Services are usually provided by two health extension workers who received basic training after completing high school and were recruited by the government. A health center is a health facility that delivers standard primary health care, both preventive and curative services, and supports on average five catchment health posts.

The study included an assessment of data quality and utilization in a sample of 163 institutions of different types: 65 health posts, 65 health centers, 8 hospitals and 25 woreda health offices. The sample size was estimated using a sample size formula for facility surveys and allocated to zones and special woredas proportionally, considering the number of functional facilities. Even though the number of community-level health posts was high, we assessed one health post from among the health posts reporting to each selected health center. Facilities were selected using multi-stage sampling: first, woredas were selected, and at the second stage, facilities were selected using simple random sampling. For data quality, the accuracy of selected data elements from routine HMIS reports was assessed. To do this, recounts from registers (or tallies in the case of health posts) were compared with reports over the same period, and the difference was expressed as a verification factor. Details about the region and the procedure are available in the data accuracy assessment report [20].
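The verification-factor comparison described above can be illustrated with a short sketch. This is not the authors' code; it assumes the convention commonly used in routine-data quality assessments of dividing the recounted value from the source document by the reported value, with 1.0 indicating perfect agreement — the paper itself does not spell out the formula.

```python
def verification_factor(recounted: int, reported: int) -> float:
    """Compare a recount from the source document (register or tally)
    with the value in the routine HMIS report for the same period.

    Assumed convention: recounted / reported. A factor of 1.0 means the
    report matches the register; below 1.0 suggests over-reporting and
    above 1.0 suggests under-reporting.
    """
    if reported == 0:
        raise ValueError("reported value must be non-zero")
    return recounted / reported

# Hypothetical data element: 95 infant immunizations recounted in the
# register against 100 reported for the quarter.
print(round(verification_factor(95, 100), 2))  # 0.95 (slight over-reporting)
```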

We used data from October to December 2016, as this was the latest completed quarter of the budget year. Nurses and health officers holding BSc degrees were recruited to collect data, considering prior experience in HMIS and data collection, and were trained (with a field test) for three days on the tools and procedures, including ethics. Daily supervision was done by supervisors who had MSc degrees and experience in HMIS and data collection. All collected data were examined for completeness and consistency, and checklists that were incomplete or inconsistent were re-administered. Data were entered, cleaned and analyzed using IBM SPSS for Windows version 20. Descriptive statistics were used to describe variables with counts and percentages.

The MEASURE Evaluation guide suggests steps to use routine data to improve programs by providing guidance on linking the questions of interest to program managers and providers to existing data; analyzing, graphing, and interpreting data; and continuing to monitor key indicators to inform improvements. The steps indicated are: identifying questions of interest, prioritizing interests, identifying data sources, transforming data, interpreting information, designing solutions and taking actions, and monitoring performance [21].

To assess the utilization of data, we reviewed the activities of the performance review team (PRT). These included problem identification, problem prioritization, preparing an action plan, monitoring the implementation of the action plan, data visualization and assessing data quality. In the study area, the PRT is a team comprising decision-makers and technical personnel that reviews performance on a monthly basis, prepares action plans, communicates them to the concerned bodies and monitors the implementation of action plans. To assess these issues and judge overall data utilization, we summed performance across 8 items: availability of PRT minutes, minutes addressing high and low performances, problem prioritization (action plan), actions (activity reports based on the action plan), data presentation (an up-to-date infant immunization monitoring chart and an up-to-date malaria monitoring chart where malaria cases were reported) and data quality assessment.

The availability of minutes written during PRT meetings was assessed, and the minutes were examined to see whether the discussions covered high and low performances to identify lessons and gaps for planning. Immunization and malaria monitoring charts were assessed as part of data visualization for ease of capturing information. The immunization chart shows monthly and cumulative infant immunization coverage for the budget year, and the malaria monitoring chart shows the trend of malaria incidence compared with the expected threshold. The action plan shows the priority activities identified for attention towards the target, including the responsible person and time frame. After an action plan is prepared, it is communicated to the responsible person, and its implementation is followed up by reviewing reports from the responsible persons. Finally, data quality assessment is a key component of data use by which the data at hand are examined for accuracy. To assess data quality (in terms of accuracy), reports are compared with the source document over the same period. The source document can be a registration book, a tally sheet or the reports from which the aggregate was compiled, depending on the data elements selected.

Each observed item was recoded as yes (1) or no (0) based on its availability at the time of assessment. To recode PRT performance items as available (yes, 1), we required two or more demonstrations in the quarter, while display items such as the infant immunization and malaria monitoring charts were checked for being up to date. Finally, we categorized a score of 75% or more as good utilization, 50–74% as fair utilization and <50% as poor utilization.
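The scoring rule above can be sketched as follows. The item names are illustrative placeholders, not the study's exact labels; only the recoding to 0/1, the sum over eight items, and the 75%/50% cut-offs come from the text.

```python
def utilization_category(items: dict) -> tuple:
    """Score one institution from eight binary items (1 = observed at
    assessment, 0 = not) and apply the study's cut-offs:
    >=75% good, 50-74% fair, <50% poor utilization."""
    score = 100.0 * sum(items.values()) / len(items)
    if score >= 75:
        return score, "good"
    if score >= 50:
        return score, "fair"
    return score, "poor"

# Hypothetical facility with 3 of the 8 items in place (item names assumed)
facility = {
    "prt_minutes": 1, "minutes_cover_high_low": 1, "action_plan": 0,
    "action_reports": 0, "epi_chart_up_to_date": 1,
    "malaria_chart_up_to_date": 0, "dqa_conducted": 0, "feedback_given": 0,
}
print(utilization_category(facility))  # (37.5, 'poor')
```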

Ethical clearance was obtained from the SNNPR Regional Health Bureau Ethical Review Board. After ethical clearance was granted, an official letter was written to each study area. Verbal consent was also obtained from each individual respondent and data handler after a thorough explanation of the purpose, benefits, risks and confidentiality of the study. The information obtained was kept anonymous and confidential.

Results

The study covered 25 woreda health offices, 73 health facilities (65 health centers and 8 hospitals) and 65 health posts, a total of 163 institutions from 14 zones, one city administration and one special woreda.

Evidence-based decision-making system

As part of the evidence-based decision-making system, the items assessed were summarized into problem identification, problem prioritization, preparing an action plan, monitoring the implementation of the action plan, data display and assessing data quality.

A. Problem identification.

As part of problem identification, we considered the presence of organized performance review team minutes documented during review meetings and whether their contents addressed high and low performance figures. All woredas, 98.5% of HCs, 87.5% of hospitals and 53.8% of HPs could show lists of the members of the performance review team. Three monthly performance review minutes were expected from each facility over the three months. Table 1 shows the number of minutes in the quarter and whether their contents addressed high and low performance values. Half (50.8%) of HPs had no minutes, while only 6.2% had the complete three minutes for the quarter, of which only 1.5% addressed high and low performances. Compared to facilities, woredas performed better: about half (52%) had the complete three minutes addressing both high and low performances, and only eight percent had no minutes in the quarter.

Table 1. Problem identification by public health facilities, SNNPR, 2017.

https://doi.org/10.1371/journal.pone.0236637.t001

B. Problem prioritization.

After reviewing performance, the performance review team is expected to plan actions for the observed (identified) gaps. Even though 64% of woredas, 69.2% of HCs, 50% of hospitals and 30.8% of HPs reported that they prepare action plans, fewer than half of facilities had actually prepared them (Fig 1). More specifically, 20% of woredas, 0% of hospitals, 6.1% of HCs and 1.5% of HPs prepared the complete three action plans in the quarter.

C. Action on identified gaps.

After an action plan is prepared, it should be communicated to the responsible person, and the activities performed should be monitored by the performance review team to ensure implementation of the agreed action plan. Sixteen (64%) woredas, 39 (60%) HCs, 3 (37.5%) hospitals and 18 (27.7%) HPs reported that they implement and monitor the action plans prepared. Implementation of the action plan was measured by reviewing the minutes of the subsequent meeting for content discussing the implementation status of the previous action plan and/or actions reported by the responsible person. Overall, 48% of woredas, 52.3% of HCs, 62.5% of hospitals and 81.5% of HPs had no data indicating implementation of the action plan, while only 24% of woredas, 3.1% of HCs, 0% of hospitals and 3.1% of HPs showed the complete three reports on the previous action plan or minutes reviewing the previous action plan (Table 2).

Table 2. Preparing action plan for identified gaps by public health facilities, SNNPR, 2017.

https://doi.org/10.1371/journal.pone.0236637.t002

D. Data display.

Data display helps concerned bodies easily grasp information on performance. In the study area, the following data are usually included during routine performance assessments, among other key performance measurement indicators. Three-fifths (60%) of woredas, 58.5% of HCs, 50% of hospitals and 36.9% of HPs displayed an up-to-date EPI chart (Table 3).

Table 3. Data display by public health facilities, SNNPR, 2017.

https://doi.org/10.1371/journal.pone.0236637.t003

E. Data quality assessment.

For proper decision-making, the data in use should be checked for quality. One method of checking data quality is conducting routine self-assessments of data quality, usually monthly for health facilities and quarterly for woredas. Eight (32%) woredas, 53 (81.5%) HCs, 8 (100%) hospitals and 65 (100%) HPs reported that they routinely conduct data quality assessments. The number of data quality assessments observed in the quarter is presented in Fig 2.

Fig 2. Percent of facilities conducting data quality assessment.

https://doi.org/10.1371/journal.pone.0236637.g002

Overall summary

In general, considering that data utilization comprises problem identification, prioritization, taking action and monitoring actions, an overall data utilization score was computed from the 8 items above for ease of judgment. The mean scores of woredas, HCs, hospitals and HPs were 55%, 36%, 26% and 29%, respectively. The majority of facilities (87.5% of hospitals, 81.5% of HPs and 70.8% of HCs) were poor in data utilization, scoring below 50% (Fig 3).

Fig 3. Percentage of public health facilities utilizing HMIS data, SNNPR, 2017.

https://doi.org/10.1371/journal.pone.0236637.g003

Discussion

This study assessed data utilization at different levels of the public health system in the study area. The results showed that about half of woredas and fewer than one-fifth of facilities were utilizing data properly.

The ultimate goal of HMIS is using information to solve problems, which requires building an information use culture over time [14] and improving the health system’s ability to respond to health needs at all levels [22]. In addition, evidence-based decision making has become a benchmark of best practice [23]. To improve the quality of care provided, health systems need to utilize data at all levels, from providers to local and national-level decision-makers [24].

To use data as information, the data must be analyzed and interpreted; in the study area this is usually done by the performance review team (PRT). The PRT is a team comprising decision-makers and technical personnel responsible for data analysis and informing decision-makers. The review process is expected to be conducted every month, but only 13 (52%) woredas, 17 (26.2%) HCs, 2 (25%) hospitals and 4 (6.2%) HPs performed it monthly, and only 20% of woredas, 6.2% of HCs, 0% of hospitals and 1.5% of HPs prepared action plans after the review. A summary of the 8 items from the performance monitoring system showed that the majority of facilities (87.5% of hospitals, 81.5% of HPs and 70.8% of HCs) were poor in data utilization (Fig 3). Even though Ethiopia has been implementing a standardized HMIS for data capturing, reporting, analysis and use, lower structures are not using it effectively. This shows that designing and implementing an information system may not realize the goal of information utilization, as noted in the case of the Malawi HMIS: no matter how good the design of an information system, very little improvement was noted in the use of information for rationalizing decisions [18].

Weak HMIS are a critical challenge to reaching health-related goals because health system performance cannot be adequately monitored [25]. Improving access to and use of data for decision-making, particularly at subnational levels, has been reported to hold significant potential for health system strengthening [26]. Studies, including those conducted in developed countries, have reported that even though participants desire data use, relatively few local health departments practice it, associating this with leadership, workforce capacity (numbers and skills), resources, funding and program mandates, political support, and access to data and program models suitable to community conditions [27]. These factors that influence data quality and use are summarized into technical, behavioral and organizational factors [28, 29].

In most LMICs, health system decentralization has increased the decision-making responsibility of sub-national management teams [24]. A literature review assessing data quality and use in LMICs showed that information use is low and indicated that individual training efforts focus primarily on imparting data analysis skills [12]. Studies done in Nigeria [30] and Tanzania [31] also reported low information use, adding technical skill gaps as a challenge. Limited resources, time constraints, and negative attitudes (or resistance) are among the barriers to implementing supports for evidence-informed decision-making [32].

Even though studies assessing data use have been done in Ethiopia, it was difficult to compare levels of data use with the current study, as the focus of measurement differed among studies. One study [33] focused on individual-level HMIS data utilization and reported a good level of use, while another [34] focused on data presentation and reported a low level of utilization.

In addition to decentralized decision-making, implementation research done in Ghana reported that several district action plans offered a statistical basis to justify program interventions and funding requests [22]. In a resource-limited setting like the study region, local evidence-based problem identification and prioritization are critical. We therefore believe that the lower levels of the health system should be empowered to ensure data use. Engaging frontline staff and managers in improving data collection and its use has been highlighted as an important way of improving HMIS [25].

One study reported that evidence-informed decision-making is facilitated by roles that actively promote research use within the organization, ties to researchers and opinion leaders outside the organization, a technical infrastructure that provides access to research evidence (such as databases), and participation in training programs that enhance staff capacity [35]. Another study demonstrated the potential for improving evidence-based decision-making capacity among local health department practitioners using a train-the-trainer approach involving diverse partners [36].

Even though we measured data use considering key components of a monitoring and evaluation system for evidence-based decision-making, the study was limited in linking data use to level of performance; that is, we did not examine whether the institutions fulfilling our measurement criteria were performing better or not.

Conclusions

Only about half of woredas and fewer than one-fifth of health facilities were utilizing HMIS data, so much remains to be done to catch up with the country's information revolution plan. Lower levels of the health care system should be supported in evidence-based decision-making, and progress should be monitored routinely, quantitatively and/or qualitatively.

References

  1. World Health Organization. Developing health management information systems: A practical guide for developing countries. World Health Organization; 2004.
  2. Krishnan A, Nongkynrih B, Yadav K, Singh S, Gupta V. Evaluation of computerized health management information system for primary health care in rural India. BMC health services research. 2010;10:310. pmid:21078203
  3. World Health Organization. Monitoring the building blocks of health systems: A handbook of indicators and their measurement strategies. World Health Organization; 2010.
  4. World Health Organization. Toolkit on monitoring health systems strengthening: health information systems. World Health Organization; 2008.
  5. Health Information and Quality Authority. Guidance on developing Key Performance Indicators and Minimum Data Sets to Monitor Healthcare Quality. Health Information and Quality Authority; 2013.
  6. Mossialos E, Papanicolas I, Smith PC. Performance measurement for health system improvement: experiences, challenges and prospects. World Health Organization; 2008.
  7. WHO. Improving data quality: a guide for developing countries. WHO; 2003.
  8. WHO. Assessment of health facility data quality: Data quality report of Cambodia. World Health Organization; 2012.
  9. Zodpey SP, Negandhi HN. Improving the quality and use of routine health data for decision-making. Indian journal of public health. 2016;60(1):1–3. pmid:26911210
  10. Kumar M, Mostafa J, Ramaswamy R. Federated health information architecture: Enabling healthcare providers and policymakers to use data for decision-making. Health information management: journal of the Health Information Management Association of Australia. 2017:1833358317709704.
  11. Cohen DJ, Keller SR, Hayes GR, Dorr DA, Ash JS, Sittig DF. Integrating Patient-Generated Health Data Into Clinical Care Settings or Clinical Decision-Making: Lessons Learned From Project HealthDesign. JMIR human factors. 2016;3(2):e26. pmid:27760726
  12. Kumar M, Gotz D, Nutley T, Smith JB. Research gaps in routine health information system design barriers to data quality and use in low- and middle-income countries: A literature review. The International journal of health planning and management. 2018;33(1):e1–e9. pmid:28766742
  13. MEASURE Evaluation. Strengthening an Organization's Capacity to Demand and Use Data. MEASURE Evaluation; 2011.
  14. Lippeveld T. Routine Health Facility and Community Information Systems: Creating an Information Use Culture. Glob Health Sci Pract. 2017;5(3):338–40. pmid:28963169
  15. Ministry of Health [Federal Democratic Republic of Ethiopia]. National Health Information System Road Map. Ministry of Health; 2012.
  16. Belay H, Azim T, Kassahun H. Assessment of Health Management Information System (HMIS) Performance in SNNPR, Ethiopia. SNNP Regional Health Bureau and MEASURE Evaluation; 2013.
  17. Federal Ministry of Health [Ethiopia]. Information Revolution Roadmap. Federal Ministry of Health; 2016.
  18. Chet NC, Christon MM, Jaap K, Humphrey BM, Thokozani CS, Ferdinand MK, et al. Design and implementation of a health management information system in Malawi: issues, innovations and results. Oxford University Press; 2005.
  19. SNNP Region Overview: Southern Nations, Nationalities and Peoples Regional State Health Bureau; [cited 2016 April 25]. Available from: http://www.snnprhb.gov.et/index.php?option=com_content&view=article&id=9:sexteneded&catid=2:vision-mission&Itemid=39.
  20. Endriyas M, Alano A, Mekonnen E, Ayele S, Kelaye T, Shiferaw M, et al. Understanding performance data: health management information system data accuracy in Southern Nations Nationalities and People's Region, Ethiopia. BMC health services research. 2019;19(1):175. pmid:30885204
  21. Judice NR. Seven Steps to Use Routine Information: A Guide for HIV/AIDS Program Managers and Providers to Improve HIV/AIDS Programs. MEASURE Evaluation; 2009.
  22. MEASURE Evaluation. Data Demand and Information Use Case Study Series: Using HIV/AIDS Information to Improve Local HIV/AIDS Programs in Ghana. MEASURE Evaluation; 2012.
  23. Forrestal EJ. Foundation of evidence-based decision making for health care managers, part 1: systematic review. The health care manager. 2014;33(2):97–109. pmid:24776828
  24. Wagenaar BH, Hirschhorn LR, Henley C, Gremu A, Sindano N, Chilengi R, et al. Data-driven quality improvement in low- and middle-income country health systems: lessons from seven years of implementation experience across Mozambique, Rwanda, and Zambia. BMC health services research. 2017;17(Suppl 3):830. pmid:29297319
  25. Mutale W, Chintu N, Amoroso C, Awoonor-Williams K, Phillips J, Baynes C, et al. Improving health information systems for decision making across five sub-Saharan African countries: Implementation strategies from the African Health Initiative. BMC health services research. 2013;13 Suppl 2:S9.
  26. Nove A, Hulton L, Martin-Hilber A, Matthews Z. Establishing a baseline to measure change in political will and the use of data for decision-making in maternal and newborn health in six African countries. International journal of gynaecology and obstetrics. 2014;127(1):102–7.
  27. Sosnowy CD, Weiss LJ, Maylahn CM, Pirani SJ, Katagiri NJ. Factors affecting evidence-based decision making in local health departments. American journal of preventive medicine. 2013;45(6):763–8. pmid:24237920
  28. Li M, Brodsky I, Geers E. Barriers to Use of Health Data in Low- and Middle-Income Countries: A Review of the Literature. MEASURE Evaluation; 2018.
  29. Aqil A, Hozumi D, Lippeveld T. Tools for Data Demand and Use in the Health Sector: Performance of Routine Information Systems Management (PRISM) Tools. MEASURE Evaluation; 2011.
  30. Adewuyi AA, Akinlo A. Decision Maker Perceptions in Nigeria: An Assessment of Data Use Constraints. MEASURE Evaluation; 2005.
  31. Harrison T, Bakari B. Assessment of data use constraints in Tanzania: Decision makers' perceptions. MEASURE Evaluation; 2008.
  32. Ellen ME, Leon G, Bouchard G, Ouimet M, Grimshaw JM, Lavis JN. Barriers, facilitators and views about next steps to implementing supports for evidence-informed decision-making in health systems: a qualitative study. Implementation science: IS. 2014;9:179. pmid:25476735
  33. Dagnew E, Woreta SA, Shiferaw AM. Routine health information utilization and associated factors among health care professionals working at public health institutions in North Gondar, Northwest Ethiopia. BMC health services research. 2018;18(1):685. pmid:30180897
  34. Abajebel S, Jira C, Beyene W. Utilization of health information system at district level in Jimma zone, Oromia regional state, South West Ethiopia. Ethiop J Health Sci. 2011;21(Suppl 1):65–76. pmid:22435010
  35. Ellen ME, Leon G, Bouchard G, Lavis JN, Ouimet M, Grimshaw JM. What supports do health system organizations have in place to facilitate evidence-informed decision-making? A qualitative study. Implementation science: IS. 2013;8:84.
  36. Jacobs JA, Duggan K, Erwin P, Smith C, Borawski E, Compton J, et al. Capacity building for evidence-based decision making in local health departments: scaling up an effective training approach. Implementation science: IS. 2014;9:124.