
A Systematic Review of Reporting Tools Applicable to Sexual and Reproductive Health Programmes: Step 1 in Developing Programme Reporting Standards

  • Anna Kågesten ,

Contributed equally to this work with: Anna Kågesten, Özge Tunçalp

    akaagesten@jhu.edu

    Affiliation Department of Population, Family and Reproductive Health, Johns Hopkins School of Public Health, Baltimore, Maryland, United States of America

  • Özge Tunçalp ,

Contributed equally to this work with: Anna Kågesten, Özge Tunçalp

    Affiliation WHO Department of Reproductive Health and Research, including UNDP/UNFPA/UNICEF/WHO/World Bank Special Programme of Research, Development and Research Training in Human Reproduction, World Health Organization, Geneva, Switzerland

  • Moazzam Ali ,

    ‡ These authors also contributed equally to this work.

    Affiliation WHO Department of Reproductive Health and Research, including UNDP/UNFPA/UNICEF/WHO/World Bank Special Programme of Research, Development and Research Training in Human Reproduction, World Health Organization, Geneva, Switzerland

  • Venkatraman Chandra-Mouli ,

    ‡ These authors also contributed equally to this work.

    Affiliation WHO Department of Reproductive Health and Research, including UNDP/UNFPA/UNICEF/WHO/World Bank Special Programme of Research, Development and Research Training in Human Reproduction, World Health Organization, Geneva, Switzerland

  • Nhan Tran ,

    ‡ These authors also contributed equally to this work.

    Affiliation Implementation Research Platform, Alliance for Health Policy and Systems Research, World Health Organization, Geneva, Switzerland

  • A. Metin Gülmezoglu

    ‡ These authors also contributed equally to this work.

    Affiliation WHO Department of Reproductive Health and Research, including UNDP/UNFPA/UNICEF/WHO/World Bank Special Programme of Research, Development and Research Training in Human Reproduction, World Health Organization, Geneva, Switzerland

Abstract

Background

Complete and accurate reporting of programme preparation, implementation and evaluation processes in the field of sexual and reproductive health (SRH) is essential to understand the impact of SRH programmes, as well as to guide their replication and scale-up.

Objectives

To provide an overview of existing reporting tools and identify core items used in programme reporting with a focus on programme preparation, implementation and evaluation processes.

Methods

A systematic review was completed for the period 2000–2014. Reporting guidelines, checklists and tools applicable to reporting on programmes targeting SRH outcomes were included, irrespective of study design. Two independent reviewers screened the titles and abstracts of all records. Full texts were assessed in duplicate, followed by data extraction on the focus, content area, year of publication, validation and description of reporting items. Data were synthesized using an iterative thematic approach, in which items related to programme preparation, implementation and evaluation in each tool were extracted and aggregated into a consolidated list.

Results

Of the 3,656 records screened by title and abstract, full texts were retrieved for 182 articles, of which 108 were excluded. Seventy-four full-text articles corresponding to 45 reporting tools were retained for synthesis. The majority of tools were developed for reporting on intervention research (n = 15), randomized controlled trials (n = 8) and systematic reviews (n = 7). We identified a total of 50 reporting items across three main domains and corresponding sub-domains: programme preparation (objective/focus, design, piloting); programme implementation (content, timing/duration/location, providers/staff, participants, delivery, implementation outcomes); and programme evaluation (process evaluation, implementation barriers/facilitators, outcome/impact evaluation).

Conclusions

Over the past decade a wide range of tools have been developed to improve the reporting of health research. Development of Programme Reporting Standards (PRS) for SRH can fill a significant gap in existing reporting tools. This systematic review is the first step in the development of such standards. In the next steps, we will draft a preliminary version of the PRS based on the aggregate list of identified items, and finalize the tool using a consensus process among experts and user-testing.

Introduction

Reporting of the key implementation elements of programmes in the field of sexual and reproductive health (SRH) is essential to understand the impact of the programmes, as well as to guide future replication and scale-up. Indeed, readers of a programme report or publication need clear and complete information about the programme components and their development, implementation and evaluation to be able to assess the programme's quality as well as replicate its model [1]. However, the reality is that many programmes report on results and impacts without describing how, when, where and under what conditions programmes were developed and implemented [2]. In a systematic review of comprehensive adolescent health programmes inclusive of SRH services, Kågesten et al. [3] found substantial inconsistencies in the depth and scope of programme component descriptions. In both the peer-reviewed and grey literature, many publications and reports lacked a clear description of programme activities and their implementation. Consequently, programmes may demonstrate impact without providing details as to how results were obtained and how components can be replicated. The lack of an adequate description of implementation processes is not unique to programme reporting, but is widely recognized in relation to the reporting of clinical trials and other research designs. For example, Chalmers and Glasziou [4] estimated that over 30% of clinical trials and over 50% of planned study outcomes were not sufficiently described in publications, representing “billions of dollars” in avoidable reporting waste. Further analyses showed that between 40% and 89% of biomedical interventions were non-replicable because of inadequate description of intervention components [5].

The key underlying reason for varying quality and levels of detail is the absence of standards for programme reporting in SRH. In 1996, the lack of adequate reporting on randomized clinical trials prompted the development of the Consolidated Standards of Reporting Trials (CONSORT) [6], and subsequent statements have been developed for reporting on study designs beyond randomized controlled trials, such as non-randomized evaluations [7] and qualitative studies [8]. However, scholars increasingly emphasize the need for greater clarity on what and how to report in relation to programme preparation, implementation and evaluation processes, to better facilitate replication and scale-up irrespective of the study design used [2,9].

In response to this gap, the World Health Organization (WHO) Department of Reproductive Health and Research, including the UNDP/UNFPA/UNICEF/WHO/World Bank Special Programme of Research, Development and Research Training in Human Reproduction (HRP), in partnership with the Alliance for Health Policy and Systems Research hosted by the WHO, initiated a consultative process to develop Programme Reporting Standards (PRS) for use by programme implementers and researchers in the field of SRH. The overall goal is to improve the quality of programme reporting so as to allow others to replicate a programme, as well as to better understand and document the successes of, and barriers to, its implementation. In line with the recommendations for developing reporting guidelines provided by Moher et al. [10], the current systematic review is the first step in the development of the PRS. The objectives of the systematic review are two-fold: 1) to provide an overview of available reporting guidelines and tools that have been used, or are suitable to use, for SRH programmes; and 2) to identify core items used in programme reporting, with a focus on programme preparation, implementation and evaluation processes, for inclusion in a draft tool.

Defining key terms

Our primary interest for the present review is the reporting of programmes, whether by researchers or programmers. According to the Dictionary of Epidemiology [11], a programme is a “(formal) set of procedures to conduct an activity, e.g. control of malaria”, whereas an intervention study involves an “intentional change in some aspect of the status of subjects, e.g. introduction of a preventive or therapeutic regimen or an intervention designed to test a hypothesized relationship”. A programme may or may not be interventional in nature. However, because these terms are often taken to mean the same thing, we used the terms programme and intervention interchangeably to refer to a formal set of prevention, promotion and/or intervention activities [1]. We further used the term programme components in reference to the elements or activities that comprise a programme.

A key challenge in reviewing literature on reporting tools is the highly varying terminology used to describe programmes [1]. When describing individual studies, we therefore strove to retain the terminology used in the original publications. Finally, we used the terms items and reporting items interchangeably to refer to the items included in reporting checklists or other tools (for example, the CONSORT 2010 statement has 25 items). Our goal was to identify a set of core items for potential inclusion in a PRS tool focused on SRH.

Materials and Methods

We used a modified version of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement [12] to conceptualize and carry out the current systematic review. Each step of the review was specified in a protocol for the overall PRS project (the protocol was not published but it is available in S1 Text).

Eligibility criteria

For the purpose of this review, we included any study or article that described a reporting guideline or tool that has been used, or would be suitable to use, for reporting on programmes in the field of SRH. In line with Moher et al. [13], we defined a reporting guideline as a “checklist, flow diagram, or explicit text to guide authors in reporting a specific type of research”. Because the focus of our review extended beyond research reporting, we also included checklists or guiding texts developed for programme reporting outside of academia (e.g. by implementing organizations and donors). For simplicity, from here on we refer to all guidelines, checklists and other guiding specifications as “tools”. Finally, we included articles that outlined narrative recommendations for programme reporting, even if these did not present official tools. All included articles had to describe a tool or provide unique recommendations relevant to programme reporting, and be published between January 2000 and September 2014. We did not limit the search by programme or study design, in order to capture as many relevant tools as possible. No language restrictions were applied.

As mentioned above, reporting tools had to have been used, or be applicable to use, for reporting on programmes targeting SRH outcomes. In line with the WHO’s mandate on SRH [14], such outcomes include but are not limited to: maternal mortality and morbidity, abortion, sexually transmitted infections and HIV prevention and treatment including mother-to-child transmission, adolescent pregnancy, family planning, safe abortion care, pregnancy and childbirth care, postnatal care of mother and newborn, and prevention and management of gender-based violence. By ‘applicable to use’, we mean tools used in the wider field of public health and medicine that may be relevant or suitable for SRH programmes even though they were not developed specifically for such outcomes (given that many of the issues central to programme reporting are not unique to the field of SRH). Two reviewers (AK, ÖT) evaluated which tools were applicable for inclusion. We excluded: 1) tools that were minor modifications of an already established tool; 2) studies that merely assessed the quality of reporting or reviewed existing reporting tools; and 3) comments or editorials about a tool (unless these elaborated on items not otherwise included in existing tools).

Information sources and search strategy

We searched six electronic databases: PubMed, Scopus, PsycINFO, Embase, MEDLINE and Global Health for the period January 2000 through September 2014. All database searches were run during the week of 1 September 2014. We developed a core search strategy combining MeSH terms with key words for use in PubMed. The strategy was built in three blocks: reporting tool/guideline AND programme/intervention AND SRH/health, and further adapted according to the standards and relevant MeSH terms for each database. The full search strategy for each database is available in S1 Table.

For the grey literature, we conducted a focused search to identify reporting tools used by donors. The selection of donors was based on those providing support to the HRP. Website searches of implementing organizations in the field of SRH were beyond the scope of the current review; such organizations will, however, be included in the next steps of the PRS development. Other sources of data included the reference lists of key articles and background documents [1,2,4,5,10,13,15], one example being a review of reporting guidelines in health research [13]. We also searched the library of the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) network (http://www.equator-network.org/library/), a resource bank of reporting guidelines. The latter search was focused on tools for reporting on interventions and implementation.

Study selection

Following the search process, two reviewers (AK, ÖT) divided all records and independently screened the titles and abstracts. Full texts were obtained for all articles that passed the initial screening. The same two reviewers (AK, ÖT) assessed all full texts, in duplicate, with inconsistencies resolved through discussion. Full texts were included if they met all inclusion criteria; all reasons for exclusion were recorded.

Data extraction

Data were extracted using a standardized template across the following domains (see S2 Table for a detailed summary of the extracted data; a schematic sketch follows the list):

  • Background details (author(s), year(s) of publication, journal(s) or other sources).
  • Focus of tool (e.g. for reporting on a specific study design).
  • Content area (e.g. for reporting in a specific field such as HIV).
  • Number and description of reporting items included in the tool, or a summary of the recommended items for reporting if described in narrative format.
  • Number and description of reporting items specific to programme preparation, implementation and evaluation.
  • Validation (piloting or other modes of testing or validating the tool).

Each tool could have one or more sources; that is, data concerning the same tool could be extracted from different journal publications. One member of the research team (AK) extracted data from each included article. A second reviewer (ÖT) verified the extracted data for a random sample of 20% of the included tools; inter-rater consistency (proportion of agreement) was over 95%.

Synthesis of results

We applied an iterative, thematic approach to the synthesis of textual data from the included tools. Based on the number of items specific to programme preparation, implementation and evaluation, tools were initially ranked as being of high (all items relevant, or overall focus of tool), moderate (some items even if not the focus of the tool, or narrative discussion of relevant reporting themes) or low (one item or less, not the focus of the tool) relevance. In the first step, all extracted items were reviewed for their applicability to programme reporting and aggregated into a compiled list. This list included the original item, a brief description and its corresponding tool. In the next step, we used an inductive coding process in which each item was coded according to its programme reporting domain (e.g. implementation outcome) and potential sub-domains (e.g. fidelity). We conducted iterative reviews of the extracted items to identify and refine domains and sub-domains, during which items and codes that were similar or redundant were merged. Items judged by the reviewers as inapplicable to programme preparation, implementation and evaluation processes were removed; this judgement was based on items already included in existing guidelines for reporting on study designs and results (e.g. CONSORT or non-randomized alternatives). The final list of items was organized according to their main corresponding domain and sub-domain. Because the main focus of the review was to provide a narrative description of items, the reporting of analytical comparative measures such as odds ratios was not applicable.

Assessment of quality

Similar to previous systematic reviews of reporting guidelines [13], we did not appraise the methodological quality of the tools, including risk of bias within and across studies. The rationale was that the review sought to describe existing programme reporting tools rather than assess the quality of reporting, or the effectiveness or impact of programmes. As part of the synthesis process, we assessed whether existing tools had been piloted or widely used, based on their reported use in different geographical settings or endorsement by organizations and/or journals.

Results

Characteristics of included tools

We screened the titles and abstracts of 3,656 records. Full texts were retrieved for 182 articles, of which 108 were excluded; all reasons for exclusion were recorded. In total, 74 full-text articles were retained for data extraction (Fig 1).

Fig 1. PRISMA 2009 Flowchart of screening and data extraction process.

The majority of articles (96%) were published in peer-reviewed journals, the most common being BMC Medical Education or BMC Medical Research Methodology (8%), PLoS Medicine (7%) and Journal of Clinical Epidemiology (7%). The included articles corresponded to 45 tools, which were retained for synthesis (Table 1).


Table 1. Overview of included tools, by relevance to the current systematic review.


The majority of tools were developed for reporting on intervention research (n = 15), randomized controlled trials (n = 8) and systematic reviews (n = 7). Other tools focused on observational studies (n = 3), diagnostic accuracy (n = 2), qualitative studies (n = 2), survey research (n = 1), general study designs (n = 1), determinants of practice (n = 1), patient/public involvement in research (n = 1), programme evaluation and monitoring (n = 1), programme reporting (n = 1), evaluation studies (n = 1) and implementation research (n = 1).

Reporting tools covered a wide range of content areas such as behaviour change, health informatics, mobile or e-health, equity, nursing and complex interventions. Of the included tools, three described reporting items specific to SRH (for example, details on sexual partners or HIV status) [27–30]. The majority of tools presented a checklist of core items for reporting (ranging from 6 to 58 items), while others (n = 8) used a narrative description of essential reporting elements. About half of the tools did not include items or topics specific to the description of programme preparation, implementation or evaluation. However, some tools listed items, or included a narrative description of topics, that could be indirectly related to these processes, such as a description of unexpected events that in turn may affect implementation.

Overall, 21 tools were ranked as having high relevance to the present review, 11 as moderate and 13 as low relevance, in line with the criteria described earlier. Four tools are especially worth mentioning because of their high relevance and recent publication. First, the Template for Intervention Description and Replication (TIDieR) [41], published in 2014, provides an itemized checklist for reporting on intervention studies, including items such as the intervention name, rationale, materials and procedures (how, by whom, when and where delivery occurred), as well as the dose, modifications and fidelity to the intervention. Second, in 2013 the Workgroup for Intervention Development and Evaluation Research (WIDER) [42] outlined a number of recommendations for describing the development, content, setting, mode of delivery, intensity, duration and fidelity of behaviour change interventions. Third, in 2013 Peters et al. [9] proposed a set of guiding questions for reporting implementation research, including the description of implementation strategies, context, complexity and real-world conditions. The implementation terminology presented as part of this framework [9] was also used to organize the findings of the current review. Finally, in 2003, Davidson et al. [20] provided eight recommendations for minimal detail in the reporting of behavioral medicine interventions, including content/elements, provider, format, setting, recipient, intensity, duration and fidelity.

Description of items

A total of 226 items related to programme preparation, implementation and evaluation processes were extracted and further consolidated into 50 items for potential inclusion in a PRS tool for SRH programmes. Items that were similar across multiple tools were merged, and the wording of items was changed accordingly. Where applicable, the wording was also changed from intervention to programme to better correspond to the purposes of the current review. The final list of items, their descriptions, corresponding domains and sub-domains, and sources are presented in Table 2.

Table 2. Reporting items related to programme preparation, implementation and evaluation.


The items were organized according to three main domains: 1) programme preparation, 2) programme implementation, and 3) programme evaluation processes. A number of corresponding sub-domains were also identified. The following sections provide a brief description of each domain and sub-domain.

Programme preparation.

Three sub-domains were identified that related to programme preparation or planning. These include the programme’s objective/focus (overall goal, anticipated impact and target population); how the programme was designed (organization(s) and donors involved in developing the programme, theory of change or logic model, the process of designing programme activities, and the existence of a manual/protocol and implementation strategy); and piloting (whether and how activities were piloted, along with results from the pilot).

Programme implementation.

Six sub-domains emerged related to programme implementation. Programme content refers to the actual content of programme activities, described with enough detail to allow replication; the complexity, number, level and innovation of activities; and the materials used and where to locate them. Timing, duration and location include items describing when and where programme activities were delivered, and the dose and intensity of activities. Programme providers/staff refer to who conducted the activities, as well as the training, characteristics, responsibilities and reflexivity of delivering staff. Programme participants cover who the actual recipients were, how participants were recruited, and any preparation prior to the start of activities. Furthermore, programme delivery items describe how the programme activities were delivered, the materials used, and efforts to ensure fidelity of both participants and staff. Finally, programme implementation outcomes refer to the actual acceptability and adoption of the programme; its coverage/reach, feasibility, modification, fidelity and reasons for low fidelity; and its appropriateness, implementation costs, reversibility, sustainability and unexpected events, among other items.

Programme evaluation.

With regard to programme evaluation processes, three sub-domains were identified. Process or implementation evaluation includes items describing process evaluation methods; how the implementation process might have affected results; contextual/external events; and ethical considerations affecting implementation. Implementation barriers and facilitators relate to factors hindering or facilitating implementation, as well as appraised strengths and limitations of the overall programme. Finally, impact/outcome evaluation items describe the process of evaluating programme outcomes (differentiating between effectiveness, efficacy and cost savings) or any upcoming evaluation plans, and whether the programme had any unexpected negative or differential effects.

Discussion

While there is growing evidence about “what works” to improve SRH outcomes, less is known about “how” to implement, replicate and scale up programmes [9]. Many programmes describe outcomes and results but do not provide enough detail to allow others to understand what exactly was done, the underlying evidence, and the lessons learnt from implementation barriers and/or facilitators. The current systematic review sought to provide an overview of tools that may be used for reporting on SRH programmes, and to identify core items for programme reporting.

We found that over the past decade a wide range of tools have been developed to improve the reporting of health research. Most of the identified tools were essentially guidelines for reporting on research study designs and results, and included few or no items relevant to programme reporting. A number of tools were, however, of greater relevance. In particular, recent tools such as TIDieR [41] and WIDER [42] may substantially improve the reporting of interventions using both randomized and non-randomized designs. For example, the TIDieR checklist [41] is a comprehensive list of items for reporting on how, where, when, by whom and with what fidelity interventions were implemented, thus moving beyond the single item in guidelines such as CONSORT that asks for “sufficient” detail on the intervention to allow replication.

Nevertheless, these tools were developed specifically for intervention research and reporting in peer-reviewed journals. Although increasing numbers of programme evaluations are published in peer-reviewed journals, and dedicated journals exist for this purpose (e.g. Global Health: Science and Practice), it is probably fair to assume that a significant proportion of programme reports are published on the web or in print outside peer-reviewed journals. Deficiencies remain in the above frameworks for programme reporting, specifically in relation to the reporting of implementation strategies and outcomes. As recently noted by Glasziou et al. [5], there is a need for improved reporting beyond peer-reviewed journals that focuses “more broadly at the multiple and various forms in which research processes and findings are reported”. Many programmes operate under complex, real-world conditions, making it difficult to communicate exactly what is being done, and how, in a timely and consistent manner. Coordinators, implementers and managers, as well as researchers, thus need a standardized way of documenting and reporting implementation strategies and outcomes throughout the course of a programme so that others can learn from their experiences. As a result, there is a need for guidance on complete and accurate reporting of programme preparation, implementation and evaluation processes in real-world contexts.

The current systematic review is the first step in the development of PRS for SRH, in which we sought to provide a consolidated list of the types of items included in existing tools. While there was substantial diversity in the focus, scope and relevance of the tools reviewed, we identified 50 items related to the description of programme preparation, implementation and evaluation processes. Additional items and themes may be identified and suggested during subsequent steps of the PRS development. Specifically, in line with the recommendations by Moher et al. [10], we will conduct a Delphi consensus exercise with a panel of experts to review and add to the list of items. This will be followed by a face-to-face consultative meeting to further refine and discuss the items, finalize the PRS and plan for its implementation. Finally, the PRS will be pilot-tested for user feasibility in different SRH programmes supported by the WHO. The specific purpose of the final PRS will be to help programme staff and researchers write reports and communicate key elements of how a programme was prepared, implemented and evaluated. The intended users of the PRS include programme staff and implementers writing reports to donors or for dissemination to external audiences. The PRS may also serve as a guide on what to include in peer-reviewed publications about programmes and their implementation processes. Finally, the PRS tool may function as a guide for upfront programme planning and implementation by outlining the essential elements that need to be reported on.

While every effort was made to undertake a comprehensive, systematic search of the relevant literature, the review has some limitations. Because the review was restricted to peer-reviewed literature and selected grey literature sources published in the last 15 years, it is possible that we missed relevant programme reporting tools published before or after this timeframe, or that are not available to the public. One example is the Standards for Reporting Implementation Studies of Complex Interventions (StaRI) [84], which was published recently (2015) and therefore not captured by the current systematic search. We attempted to minimize this risk of bias by piloting our search strategy and implementing it in a number of databases relevant to SRH. We also searched the EQUATOR database, which provides a comprehensive listing of available reporting tools. Despite the use of a structured, piloted data extraction form, it is possible that we overlooked some items. Finally, frameworks on developing a scale-up strategy, such as the WHO ExpandNet tool [85] and the SURE checklist for identifying factors affecting the implementation of a policy option [86], were beyond the scope of the current review based on its inclusion and exclusion criteria. We acknowledge the importance and utility of these tools and frameworks for programmes, and they may therefore inform the subsequent steps of the PRS development.

As far as we know, this is the first systematic review of tools and items relevant to reporting of SRH programmes. The review thus fills an important gap in the literature on programme reporting.

Conclusions

Over the last decade a number of tools for the reporting of research have been published. Recent initiatives have focused on improving the reporting of intervention research through guidelines such as TIDieR [41] and WIDER [42]. However, few tools include specific elements related to the description of programme preparation and implementation, and we could not locate any standardized tools for reporting on programmes in the field of SRH. Development of PRS for SRH programmes can therefore fill a significant gap in existing reporting tools. Specifically, the availability of PRS can help improve descriptions of programme preparation, implementation and evaluation processes, which in turn can guide replication and scale-up of successful models. This systematic review is the first step in the development of such standards; in the next steps, we will draft a preliminary version of the PRS based on the aggregate list of identified items, and finalize the tool using a consensus process among experts and user-testing.

Acknowledgments

The authors wish to thank Ms Dina Khan for assisting with the quality check process of extracted data and Ms Annette Peters for editing assistance.

Author Contributions

Conceived and designed the experiments: AK ÖT. Performed the experiments: AK ÖT. Analyzed the data: AK ÖT. Contributed reagents/materials/analysis tools: AK ÖT MA VCM NT AMG. Wrote the paper: AK ÖT MA VCM NT AMG.

References

  1. Durlak JA, DuPre EP (2008) Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol 41: 327–350. pmid:18322790
  2. Michie S, Fixsen D, Grimshaw JM, Eccles MP (2009) Specifying and reporting complex behaviour change interventions: the need for a scientific method. Implement Sci 4: 40. pmid:19607700
  3. Kågesten A, Parekh J, Tuncalp O, Turke S, Blum RW (2014) Comprehensive adolescent health programs that include sexual and reproductive health services: a systematic review. Am J Public Health 104: e23–36. pmid:25320876
  4. Chalmers I, Glasziou P (2009) Avoidable waste in the production and reporting of research evidence. Lancet 374: 86–89. pmid:19525005
  5. Glasziou P, Altman DG, Bossuyt P, Boutron I, Clarke M, Julious S, et al. (2014) Reducing waste from incomplete or unusable reports of biomedical research. The Lancet 383: 267–276.
  6. Begg C, Cho M, Eastwood S, Horton R, Moher D, Olkin I, et al. (1996) Improving the quality of reporting of randomized controlled trials. The CONSORT statement. JAMA 276: 637–639. pmid:8773637
  7. Des Jarlais DC, Lyles C, Crepaz N (2004) Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: the TREND statement. Am J Public Health 94: 361–366. pmid:14998794
  8. Tong A, Sainsbury P, Craig J (2007) Consolidated criteria for reporting qualitative research (COREQ): A 32-item checklist for interviews and focus groups. International Journal for Quality in Health Care 19: 349–357. pmid:17872937
  9. Peters DH, Adam T, Alonge O, Agyepong IA, Tran N (2013) Implementation research: what it is and how to do it. BMJ 347. pmid:24259324
  10. Moher D, Schulz KF, Simera I, Altman DG (2010) Guidance for developers of health research reporting guidelines. PLoS Med 7: e1000217. pmid:20169112
  11. Last JM, Spasoff RB, Harris SG, editors (2000) A dictionary of epidemiology. 4th ed. New York: Oxford University Press.
  12. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gotzsche PC, Ioannidis JP, et al. (2009) The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med 6: e1000100. pmid:19621070
  13. Moher D, Weeks L, Ocampo M, Seely D, Sampson M, Altman DG, et al. (2011) Describing reporting guidelines for health research: a systematic review. J Clin Epidemiol 64: 718–742. pmid:21216130
  14. WHO Sexual and Reproductive Health. The Department of Reproductive Health and Research including HRP.
  15. Peters D, Tran N, Adam T (2013) Implementation Research in Health: A practical guide. Alliance for Health Policy and Systems Research, World Health Organization.
  16. Wells M, Williams B, Treweek S, Coyle J, Taylor J (2012) Intervention description is not enough: evidence from an in-depth multiple case study on the untold role and impact of context in randomised controlled trials of seven complex interventions. Trials 13: 95. pmid:22742939
  17. Mayo-Wilson E (2007) Reporting implementation in randomized trials: proposed additions to the consolidated standards of reporting trials statement. Am J Public Health 97: 630–633. pmid:17329641
  18. Mayo-Wilson E, Grant S, Hopewell S, Macdonald G, Moher D, Montgomery P (2013) Developing a reporting guideline for social and psychological intervention trials. Trials 14: 242. pmid:23915044
  19. Montgomery P, Grant S, Hopewell S, Macdonald G, Moher D, Michie S, et al. (2013) Protocol for CONSORT-SPI: An extension for social and psychological interventions. Implementation Science 8.
  20. Davidson KW, Goldstein M, Kaplan RM, Kaufmann PG, Knatterud GL, Orleans CT, et al. (2003) Evidence-based behavioral medicine: what is it and how do we achieve it? Ann Behav Med 26: 161–171. pmid:14644692
  21. Möhler R, Bartoszek G, Köpke S, Meyer G (2012) Proposed criteria for reporting the development and evaluation of complex interventions in healthcare (CReDECI): Guideline development. International Journal of Nursing Studies 49: 40–46. pmid:21924424
  22. Mohler R, Bartoszek G, Meyer G (2013) Quality of reporting of complex healthcare interventions and applicability of the CReDECI list—a survey of publications indexed in PubMed. BMC Medical Research Methodology 13: 125. pmid:24138207
  23. Phillips AC, Lewis LK, McEvoy MP, Galipeau J, Glasziou P, Hammick M, et al. (2013) Protocol for development of the guideline for reporting evidence based practice educational interventions and teaching (GREET) statement. BMC Medical Education 13.
  24. Phillips AC, Lewis LK, McEvoy MP, Galipeau J, Glasziou P, Hammick M, et al. (2014) A systematic review of how studies describe educational interventions for evidence-based practice: Stage 1 of the development of a reporting guideline. BMC Medical Education 14.
  25. Phillips AC, Lewis LK, McEvoy MP, Galipeau J, Glasziou P, Hammick M, et al. (2014) A Delphi survey to determine how educational interventions for evidence-based practice should be reported: stage 2 of the development of a reporting guideline. BMC Med Educ 14: 159. pmid:25081371
  26. Montgomery P, Underhill K, Gardner F, Operario D, Mayo-Wilson E (2013) The Oxford Implementation Index: a new tool for incorporating implementation data into systematic reviews and meta-analyses. J Clin Epidemiol 66: 874–882. pmid:23810026
  27. Thomas CW, Smith BD, Wright-DeAgüero L (2006) The Program Evaluation and Monitoring System: A Key Source of Data for Monitoring Evidence-Based HIV Prevention Program Processes and Outcomes. AIDS Education and Prevention 18: 74–80. pmid:16987090
  28. O'Neill J, Tabish H, Welch V, Petticrew M, Pottie K, Clarke M, et al. (2014) Applying an equity lens to interventions: using PROGRESS ensures consideration of socially stratifying factors to illuminate inequities in health. J Clin Epidemiol 67: 56–64. pmid:24189091
  29. Kavanagh J, Oliver S, Lorenc T (2008) Reflections on developing and using PROGRESS-Plus. Equity Update.
  30. Flores SA, Crepaz N (2004) Quality of study methods in individual- and group-level HIV intervention research: critical reporting elements. AIDS Educ Prev 16: 341–352. pmid:15342336
  31. Roen K, Arai L, Roberts H, Popay J (2006) Extending systematic reviews to include evidence on implementation: methodological work on a review of community-based initiatives to prevent injuries. Soc Sci Med 63: 1060–1071. pmid:16574289
  32. Conn VS, Groves PS (2011) Protecting the power of interventions through proper reporting. Nurs Outlook 59: 318–325. pmid:21840555
  33. Armstrong R, Waters E, Moore L, Riggs E, Cuervo LG, Lumbiganon P, et al. (2008) Improving the reporting of public health intervention research: Advancing TREND and CONSORT. Journal of Public Health 30: 103–109. pmid:18204086
  34. Harrington NG, Noar SM (2012) Reporting standards for studies of tailored interventions. Health Education Research 27: 331–342. pmid:22156230
  35. Bird VJ, Le Boutillier C, Leamy M, Williams J, Bradstreet S, Slade M (2014) Evaluating the feasibility of complex interventions in mental health services: Standardised measure and reporting guidelines. British Journal of Psychiatry 204: 316–321. pmid:24311549
  36. Chalmers I, Bracken MB, Djulbegovic B, Garattini S, Grant J, Gülmezoglu AM, et al. (2014) How to increase value and reduce waste when research priorities are set. The Lancet 383: 156–165.
  37. Davidoff F, Batalden P, Stevens D, Ogrinc G, Mooney S (2008) Publication guidelines for quality improvement in health care: evolution of the SQUIRE project. Qual Saf Health Care 17 Suppl 1: i3–9. pmid:18836063
  38. Ogrinc G, Mooney SE, Estrada C, Foster T, Goldmann D, Hall LW, et al. (2008) The SQUIRE (Standards for QUality Improvement Reporting Excellence) guidelines for quality improvement reporting: explanation and elaboration. Qual Saf Health Care 17 Suppl 1: i13–32. pmid:18836062
  39. Stein KF (2010) Quality improvement research: Utilization of the SQUIRE Guidelines. Journal of the American Psychiatric Nurses Association 16: 337–338.
  40. Flottorp SA, Oxman AD, Krause J, Musila NR, Wensing M, Godycki-Cwirko M, et al. (2013) A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci 8: 35. pmid:23522377
  41. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. (2014) Better reporting of interventions: Template for intervention description and replication (TIDieR) checklist and guide. BMJ (Online) 348.
  42. Albrecht L, Archibald M, Arseneau D, Scott SD (2013) Development of a checklist to assess the quality of reporting of knowledge translation interventions using the Workgroup for Intervention Development and Evaluation Research (WIDER) recommendations. Implement Sci 8: 52. pmid:23680355
  43. Abraham C, Johnson BT, de Bruin M, Luszczynska A (2014) Enhancing reporting of behavior change intervention evaluations. J Acquir Immune Defic Syndr 66 Suppl 3: S293–299. pmid:25007199
  44. Altman DG, Schulz KF, Moher D, Egger M, Davidoff F, Elbourne D, et al. (2001) The revised CONSORT statement for reporting randomized trials: explanation and elaboration. Ann Intern Med 134: 663–694. pmid:11304107
  45. Moher D, Hopewell S, Schulz KF, Montori V, Gotzsche PC, Devereaux PJ, et al. (2010) CONSORT 2010 Explanation and Elaboration: Updated guidelines for reporting parallel group randomised trials. J Clin Epidemiol 63: e1–37. pmid:20346624
  46. Moher D, Schulz KF, Altman D (2001) The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomized trials. JAMA 285: 1987–1991. pmid:11308435
  47. Schulz KF, Altman DG, Moher D (2010) CONSORT 2010 Statement: Updated guidelines for reporting parallel group randomised trials. Trials 11.
  48. Egger M, Jüni P, Bartlett C (2001) Value of flow diagrams in reports of randomized controlled trials. Journal of the American Medical Association 285: 1996–1999. pmid:11308437
  49. Altman DG, Moher D, Schulz KF (2012) Improving the reporting of randomised trials: The CONSORT Statement and beyond. Statistics in Medicine 31: 2985–2997. pmid:22903776
  50. Boutron I, Moher D, Altman DG, Schulz KF, Ravaud P (2008) Methods and processes of the CONSORT group: Example of an extension for trials assessing nonpharmacologic treatments. Annals of Internal Medicine 148: W-60–W-66. pmid:18283201
  51. Eysenbach G (2011) CONSORT-EHEALTH: Improving and standardizing evaluation reports of web-based and mobile health interventions. Journal of Medical Internet Research 13: 25–34.
  52. Baker TB, Gustafson DH, Shaw B, Hawkins R, Pingree S, Roberts L, et al. (2010) Relevance of CONSORT reporting criteria for research on eHealth interventions. Patient Educ Couns 81 Suppl: S77–86. pmid:20843621
  53. Zwarenstein M, Treweek S, Gagnier JJ, Altman DG, Tunis S, Haynes B, et al. (2008) Improving the reporting of pragmatic trials: an extension of the CONSORT statement. BMJ 337: a2390. pmid:19001484
  54. Staniszewska S, Brett J, Mockford C, Barber R (2011) The GRIPP checklist: strengthening the quality of patient and public involvement reporting in research. Int J Technol Assess Health Care 27: 391–399. pmid:22004782
  55. Moher D, Liberati A, Tetzlaff J, Altman DG (2009) Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med 151: 264–269, w264. pmid:19622511
  56. Welch V, Petticrew M, Tugwell P, Moher D, O'Neill J, Waters E, et al. (2012) PRISMA-Equity 2012 Extension: Reporting Guidelines for Systematic Reviews with a Focus on Health Equity. PLoS Medicine 9.
  57. Welch V, Petticrew M, Ueffing E, Benkhalti Jandu M, Brand K, Dhaliwal B, et al. (2012) Does consideration and assessment of effects on health equity affect the conclusions of systematic reviews? A methodology study. PLoS One 7: e31360. pmid:22427804
  58. Welch VA, Petticrew M, O'Neill J, Waters E, Armstrong R, Bhutta ZA, et al. (2013) Health equity: evidence synthesis and knowledge translation methods. Systematic reviews 2: 43. pmid:23799964
  59. Burford BJ, Welch V, Waters E, Tugwell P, Moher D, O'Neil J, et al. (2013) Testing the PRISMA-Equity 2012 Reporting Guideline: The Perspectives of Systematic Review Authors. PLoS ONE 8.
  60. Proudfoot J, Klein B, Barak A, Carlbring P, Cuijpers P, Lange A, et al. (2011) Establishing guidelines for executing and reporting internet intervention research. Cognitive Behaviour Therapy 40: 82–97. pmid:25155812
  61. Fernald D, Harris A, Deaton EA, Weister V, Pray S, Baumann C, et al. (2012) A standardized reporting system for assessment of diverse public health programs. Preventing Chronic Disease 9.
  62. Rigby M, Talmon J, Brender J, Ammenwerth E, De Keizer NF, Nykanen P (2009) Linking informaticians and end users—Using the STARE-HI evaluation reporting framework as a unifying design approach. In: Adlassnig KP, editor. Medical Informatics in a United and Healthy Europe: 2009 European Federation for Medical Informatics. pp. 66–70.
  63. Talmon J, Ammenwerth E, Brender J, de Keizer N, Nykanen P, Rigby M (2009) STARE-HI—Statement on reporting of evaluation studies in Health Informatics. Int J Med Inform 78: 1–9. pmid:18930696
  64. Wells GA, Shea B, Higgins JPT, Sterne J, Tugwell P, Reeves BC (2013) Checklists of methodological issues for review authors to consider when including non-randomized studies in systematic reviews. Research Synthesis Methods 4: 63–77. pmid:26053540
  65. Husereau D, Drummond M, Petrou S, Carswell C, Moher D, Greenberg D, et al. (2013) Consolidated Health Economic Evaluation Reporting Standards (CHEERS)—explanation and elaboration: a report of the ISPOR Health Economic Evaluation Publication Guidelines Good Reporting Practices Task Force. Value Health 16: 231–250. pmid:23538175
  66. Husereau D, Drummond M, Petrou S, Carswell C, Moher D, Greenberg D, et al. (2013) Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. Value Health 16: e1–5. pmid:23538200
  67. Boutron I, Moher D, Tugwell P, Giraudeau B, Poiraudeau S, Nizard R, et al. (2005) A checklist to evaluate a report of a nonpharmacological trial (CLEAR NPT) was developed using consensus. J Clin Epidemiol 58: 1233–1240. pmid:16291467
  68. Tong A, Flemming K, McInnes E, Oliver S, Craig J (2012) Enhancing transparency in reporting the synthesis of qualitative research: ENTREQ. BMC Medical Research Methodology 12.
  69. Ramsey S, Willke R, Briggs A, Brown R, Buxton M, Chawla A, et al. (2005) Good Research Practices for Cost-Effectiveness Analysis Alongside Clinical Trials: The ISPOR RCT-CEA Task Force Report. Value in Health 8: 521–533. pmid:16176491
  70. Tooth L, Ware R, Bain C, Purdie DM, Dobson A (2005) Quality of reporting of observational longitudinal research. Am J Epidemiol 161: 280–288. pmid:15671260
  71. Bennett C, Khangura S, Brehaut JC, Graham ID, Moher D, Potter BK, et al. (2010) Reporting guidelines for survey research: an analysis of published guidance and reporting practices. PLoS Med 8: e1001069. pmid:21829330
  72. Niazkhani Z, Pirnejad H, Aarts J, Adams S, Bal R (2011) Reporting qualitative research in health informatics: REQ-HI recommendations. Stud Health Technol Inform 169: 877–881. pmid:21893872
  73. Matsumoto M, Bowman R, Worley P (2012) A guide to reporting studies in rural and remote health. Rural Remote Health 12: 2312. pmid:22950574
  74. Bossuyt PM, Reitsma JB, Bruns DE, Gatsonis CA, Glasziou PP, Irwig LM, et al. (2003) Towards complete and accurate reporting of studies of diagnostic accuracy: The STARD Initiative. Ann Intern Med 138: 40–44. pmid:12513043
  75. Bossuyt PM, Reitsma JB, Bruns DE, Gatsonis CA, Glasziou PP, Irwig LM, et al. (2003) The STARD statement for reporting studies of diagnostic accuracy: explanation and elaboration. Ann Intern Med 138: W1–12. pmid:12513067
  76. Simel DL, Rennie D, Bossuyt PM (2008) The STARD statement for reporting diagnostic accuracy studies: application to the history and physical examination. J Gen Intern Med 23: 768–774. pmid:18347878
  77. Smidt N, Rutjes AW, van der Windt DA, Ostelo RW, Bossuyt PM, Reitsma JB, et al. (2006) Reproducibility of the STARD checklist: an instrument to assess the quality of reporting of diagnostic accuracy studies. BMC Med Res Methodol 6: 12. pmid:16539705
  78. Little J, Higgins JP, Ioannidis JP, Moher D, Gagnon F, von Elm E, et al. (2009) Strengthening the reporting of genetic association studies (STREGA): an extension of the strengthening the reporting of observational studies in epidemiology (STROBE) statement. J Clin Epidemiol 62: 597–608.e594. pmid:19217256
  79. MacPherson H, Altman DG, Hammerschlag R, Li Y, Wu T, White A, et al. (2010) Revised STandards for Reporting Interventions in Clinical Trials of Acupuncture (STRICTA): extending the CONSORT statement. Acupunct Med 28: 83–93. pmid:20615861
  80. MacPherson H, White A, Cummings M, Jobst K, Rose K, Niemtzow R (2002) Standards for reporting interventions in controlled trials of acupuncture: The STRICTA recommendations. STandards for Reporting Interventions in Controlled Trials of Acupuncture. Acupunct Med 20: 22–25. pmid:11926601
  81. Vandenbroucke JP, von Elm E, Altman DG, Gotzsche PC, Mulrow CD, Pocock SJ, et al. (2007) Strengthening the Reporting of Observational Studies in Epidemiology (STROBE): explanation and elaboration. PLoS Med 4: e297. pmid:17941715
  82. von Elm E, Altman DG, Egger M, Pocock SJ, Gotzsche PC, Vandenbroucke JP, et al. (2007) The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. PLoS Med 4: e296. pmid:17941714
  83. MacPherson H, Jobst KA (2010) Improving the Reporting of Interventions in Clinical Trials of Acupuncture: The Updated and Revised STRICTA. Journal of Alternative and Complementary Medicine 16: 929–930.
  84. Pinnock H, Epiphaniou E, Sheikh A, Griffiths C, Eldridge S, Craig P, et al. (2015) Developing standards for reporting implementation studies of complex interventions (StaRI): a systematic review and e-Delphi. Implementation Science 10: 42. pmid:25888928
  85. WHO, ExpandNet (2011) Beginning with the end in mind. Planning pilot projects and other programmatic research for successful scaling up. Geneva: WHO.
  86. The SURE Collaboration (2011) SURE Guides for preparing and using evidence-based policy briefs. Available: http://global.evipnet.org/SURE-Guides/