
Open Access

ROSES RepOrting standards for Systematic Evidence Syntheses: pro forma, flow-diagram and descriptive summary of the plan and conduct of environmental systematic reviews and systematic maps

  • Neal R. Haddaway1,
  • Biljana Macura1 (corresponding author),
  • Paul Whaley2 and
  • Andrew S. Pullin3
Neal R. Haddaway and Biljana Macura contributed equally to this work.

Environmental Evidence (the official journal of the Collaboration for Environmental Evidence) 2018, 7:7

https://doi.org/10.1186/s13750-018-0121-7

Received: 6 September 2017

Accepted: 22 November 2017

Published: 19 March 2018

Abstract

Reliable synthesis of the various rapidly expanding bodies of evidence is vital for the process of evidence-informed decision-making in environmental policy, practice and research. With the rise of evidence-based medicine and increasing numbers of published systematic reviews, criteria for assessing the quality of reporting have been developed. First QUOROM (Lancet 354:1896–1900, 1999) and then PRISMA (Ann Intern Med 151:264, 2009) were developed as reporting guidelines and standards to ensure medical meta-analyses and systematic reviews are reported to a high level of detail. PRISMA is now widely used by a range of journals as a pre-submission checklist. However, because it was developed for systematic reviews in healthcare, PRISMA has limited applicability for reviews in conservation and environmental management. We highlight 12 key problems with the application of PRISMA to this field, including an overemphasis on meta-analysis and no consideration of other synthesis methods. We introduce ROSES (RepOrting standards for Systematic Evidence Syntheses), a pro forma and flow diagram designed specifically for systematic reviews and systematic maps in the field of conservation and environmental management. We describe how ROSES solves the problems with PRISMA. We outline the key benefits of our approach to designing ROSES, in particular the level of detail and inclusion of rich guidance statements. We also introduce the extraction of meta-data that describe key aspects of the conduct of the review. Collated together, this summary record can help to facilitate rapid review and appraisal of the conduct of a systematic review or map, potentially speeding up the peer-review process. We present the results of initial road testing of ROSES with systematic review experts, and propose a plan for future development of ROSES.

Keywords

PRISMA, Quality appraisal, CEESAT, DART, AMSTAR, QUOROM

Background

Reliable synthesis of the various rapidly expanding bodies of evidence is vital for the process of evidence-informed decision-making in environmental policy, practice and research [1–4]. Methods for systematic evidence syntheses (including systematic reviews and maps) are becoming an industry standard for cataloguing, collating and synthesising documented evidence [5]. Systematic reviews and maps are conducted through transparent and repeatable processes, maximising objectivity and attempting to minimise bias throughout the review [6]. Systematic review methods were translated from the field of healthcare to conservation and environmental management in 2006 as part of the emerging ‘evidence-based conservation’ movement [7–12]. Systematic reviews are frequently used to assess the effectiveness of management interventions or the effect of an anthropogenic action or natural impact [7, 9]. More recently, these methods have been used to answer broader questions that deal with complex systems, for example investigating how, and under which conditions, an intervention or an action may have the greatest effect.

In order to increase the value of reviews for policy and practice and to ensure that they comply with established standards and procedures, formal review coordinating bodies have been established across various disciplines, including Cochrane in healthcare, the Campbell Collaboration in social welfare, and the Collaboration for Environmental Evidence (CEE) in conservation and environmental management. These collaborations provide guidance and training, and endorse reviews through their registration and publication [6, 13, 14]. Whereas in other fields protocols may be published without peer-review (e.g. on protocol repository platforms), registration and peer-review of a CEE protocol are required and are handled through the formal CEE editorial process. Endorsed reviews are vetted by methodology experts and can therefore be trusted as more rigorous and thus more reliable. Nevertheless, substandard reviews remain more numerous (see [15, 16]), with flaws in planning and design (e.g. a protocol that is missing or lacks crucial details), conduct (e.g. a non-comprehensive search) and/or reporting (e.g. poor clarity or comprehensiveness in the write-up) [17, 18]. Without transparent reporting, even well-designed reviews will fail to show their methodological strengths, undermining their utility in decision-making contexts [17].

Systematic review methodology was first established in medicine in the 1990s to support well-informed decision-making for the health sector, initially focusing on synthesising quantitative evidence from randomised controlled trials [13]. Since then, systematic review methodology has spread across a range of fields, including software engineering, education, social welfare and international development, public and environmental health, and crime and justice [19–22], broadening not only the scope of topics but also the methodologies applied. Now, for example, it is standard practice to incorporate observational studies and qualitative research in systematic reviews.

With the rise of evidence-based medicine and increasing numbers of published systematic reviews, criteria for assessing the quality of reporting have been developed. In 1999, in response to growing evidence of a lack of clarity in the reporting of reviews in medicine, an international group of scientists developed reporting guidance for meta-analyses of randomised trials: the QUOROM (QUality Of Reporting Of Meta-analyses) statement [23]. A decade later, to reflect methodological changes and conceptual advances, the QUOROM statement was updated and extended into a new tool that set minimum standards for transparent and complete reporting of systematic reviews and meta-analyses. These updated standards, known as PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses), consist of a 27-item checklist and an easy-to-follow flow-diagram template that demonstrates the stages at which evidence is excluded during the conduct of a systematic review [24]. The PRISMA Statement is accompanied by the PRISMA Explanation and Elaboration document [25]. PRISMA is relevant not only for reporting systematic reviews that evaluate randomised trials, but also for reviews of non-randomised (observational and diagnostic) studies assessing the benefits and harms of interventions.

PRISMA reporting guidance has continued to develop (see [26]) and several extensions have been published so far, including PRISMA-Equity [27], an extension for abstracts [28], and a PRISMA for protocols [29, 30].

Along with its use by review authors as a pre-submission checklist, PRISMA is also used by journal editors and peer-reviewers to improve reporting standards across medical and general journals [31]. PRISMA has been widely accepted and endorsed by 5 editorial organisations, including Cochrane and the World Association of Medical Editors, and 180 bio-medical journals [32]. To assure global acceptance, the PRISMA statement has been published in multiple biomedical journals, and the checklist and flow diagram have been translated into a number of other languages, including Russian, Japanese and Korean [33]. Recently, as awareness of PRISMA has grown, reviewers have also looked to the PRISMA statement and checklist as a form of guidance: O’Leary et al. [34] found that some 25% of reviews in the field of marine biology referred to PRISMA as guidelines used to structure their conduct. Whilst PRISMA is, strictly speaking, a set of reporting standards and not true systematic review guidance, this demonstrates the appeal of systems like PRISMA in acting not only as a reporting standard but also as a primer for systematic review conduct.

PRISMA and environmental reviews

As systematic review methodology has been adapted into a variety of other disciplines, new evidence synthesis methods have been developed and the standard and quality of systematic reviews have improved. These developments and adaptations mean that PRISMA may not be a suitable template for many non-medical systematic reviews. We have identified 12 key issues associated with the application of PRISMA to environmental systematic reviews and maps (see Table 1). As such, environmental synthesists must identify a suitable alternative to PRISMA to facilitate peer-review, ensure adequate reporting of systematic reviews and systematic maps, and help raise standards in future reviews. We identify four categories of improvements that are necessary: (i) the ability to handle different types of environmental evidence and synthesis methods (including both quantitative and qualitative reviews); (ii) handling novel review outputs (i.e. systematic maps); (iii) consistency with CEE guidelines and adaptation to the field of environmental management and conservation; and, (iv) increasing the required level of detail and extracting key meta-data to facilitate quality appraisal and peer-review.
Table 1

Key problems relating to the application of PRISMA [24] to evidence synthesis in conservation and environmental management

Problem 1: Does not strictly require a protocol, referring only to registration (e.g. PRISMA [24] checklist #5: “Indicate if a review protocol exists, if and where it can be accessed (e.g., Web address), and, if available, provide registration information including registration number”)

Problem 2: Vagueness obscures the level of transparency needed for repeatability (e.g. no requirement to provide details of which databases are accessed via Web of Knowledge or ProQuest, PRISMA [24] checklist #7)

Problem 3: Suggested requirements for review conduct are minimal, affecting the overall comprehensiveness of the review (e.g. repeatable reporting of the search is required for “at least one database” only, PRISMA [24] checklist #8)

Problem 4: Heavy emphasis on meta-analysis (excludes narrative, qualitative and mixed synthesis methods): e.g. PRISMA [24] checklist #13 ‘Summary measures’: “State the principal summary measures (e.g., risk ratio, difference in means)”; PRISMA [24] checklist #14 ‘Synthesis of results’: “Describe the methods of handling data and combining results of studies, if done, including measures of consistency (e.g., I2) for each meta-analysis”; PRISMA [24] checklist #21 ‘Synthesis of results’: “Present results of each meta-analysis done, including confidence intervals and measures of consistency”

Problem 5: Loses the nuance of necessary methodological steps: e.g. no consistency or comprehensiveness checking, no requirement to avoid bias caused by reviewing and appraising one’s own research

Problem 6: Focuses on risk of bias rather than limitations to validity: ignores external validity and limitations that do not affect bias (e.g. PRISMA [24] checklist #15 and #22)

Problem 7: Focuses on medicine and health topics (e.g. PRISMA [24] checklist #20 ‘Results of individual studies’: “For all outcomes considered (benefits or harms), present, for each study: (a) simple summary data for each intervention group (b) effect estimates and confidence intervals, ideally with a forest plot”)

Problem 8: Non-matching terminology: e.g. separating ‘screening’ from ‘eligibility’ (see PRISMA [24] flow diagram)

Problem 9: Inappropriate use of the term ‘qualitative synthesis’ to mean ‘narrative synthesis’ (see PRISMA Statement [24])

Problem 10: The PRISMA checklist omits vital information present in the PRISMA [24] flow diagram: e.g. exclusions during critical appraisal

Problem 11: Often misused as methodological rather than reporting guidance [34]

Problem 12: The PRISMA checklist is only useful for authors and editors during submission: no further guidance is provided within the process of completion: it is purely a checklist

Firstly, PRISMA was not designed for reviews that involve narrative, qualitative or mixed methods rather than quantitative methods (e.g. the ‘Synthesis of results’ category: “Describe the methods of handling data and combining results of studies, if done, including measures of consistency (e.g., I2) for each meta-analysis”; and the ‘Results of individual studies’ category: “For all outcomes considered (benefits or harms), present, for each study: (a) simple summary data for each intervention group (b) effect estimates and confidence intervals, ideally with a forest plot”). There is also a growing community who believe that meta-analysis is not a standalone type of review, but rather a subset of quantitative synthesis tools for aggregating studies with numerical data (and in fact separate from other quantitative synthesis methods, such as meta-regression [35]). This is an important point, since meta-analyses alone are subject to the same biases as traditional literature reviews [36].

Secondly, systematic mapping has emerged as a very popular method for evidence synthesis, as a first step in the evidence synthesis pathway and as a means of highlighting knowledge clusters and gaps [14, 15]. PRISMA cannot easily be adapted for these methods, which rely more heavily on the earlier stages of the review process (searching and screening) and whose outputs are databases of evidence rather than full syntheses of study findings.

Thirdly, we see the need for several improvements and adaptations to PRISMA, some of which are highlighted in Table 1. Standards in systematic reviews have developed considerably in recent years (e.g. [37, 38]), and some of the suggested methods in PRISMA are perhaps no longer seen as true ‘gold standards’.

Finally, we see a timely opportunity to summarise and collect key methodological information during the review submission process that can facilitate appraisal and analysis of review methods, both for individual reviews and for systematic reviews as a whole. Systematic reviews and maps are, by their very nature, lengthy and complex documents that contain a large amount of necessary detail (e.g. 41 pages plus 251 pages across 10 additional files; [39]). As such, it can be challenging to rapidly locate vital methodological and summary details and statistics regarding the conduct of the review.

Whilst checklists are useful for ensuring conformity and completeness, PRISMA-style reporting could also be used for locating important information within a report. Along with a checklist that demonstrates the required information has been included in a report, reviewers could also record and present in a standard form important summary details, such as the number of search results obtained before duplicate removal. We see three major benefits of reporting such summary information along with a review protocol or report: (1) a summary form would make the review’s transparency more usable: although the review might be fully transparent, vital methodological information might be held within supplementary information, and finding that detail might take considerable time; (2) such a form would allow a rapid review of the robustness of the review’s methods, facilitating peer-review by collating information that can point to potential issues related to rigour or inconsistencies in reporting (for example, a particularly large number of search results identified across a small number of databases suggests an inefficient and non-comprehensive search strategy, as would a very low inclusion rate at title-level screening); (3) this information could facilitate an appraisal of systematic reviews as a corpus, allowing an assessment of standards in the conduct of reviews along with trends in choice of methodology. Examining the way in which systematic reviews have been undertaken is an important process of reflection, and allows the examination of barriers and limitations in how guidance can be practically applied. It also facilitates the improvement of standards over time by providing data that can be used to assess the effectiveness of quality management processes for research synthesis and to determine areas in which review methods may be susceptible to bias.
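
To illustrate how such collated summary information could support rapid checks of review rigour, the short Python sketch below applies the two example heuristics mentioned above to a hypothetical meta-data record. All field names and thresholds here are our own illustrative assumptions, not part of any ROSES specification.

  # Illustrative sketch only: field names and thresholds are hypothetical
  # assumptions, not taken from the ROSES pro forma.

  # A hypothetical collated meta-data record for a submitted review.
  review_metadata = {
      "databases_searched": 3,
      "records_before_deduplication": 18_500,
      "records_after_deduplication": 15_200,
      "included_at_title_screening": 90,
  }

  def flag_potential_issues(md: dict) -> list:
      """Return human-readable warnings that suggest further scrutiny."""
      warnings = []

      # Heuristic 1: many results from few databases may indicate an
      # inefficient, non-comprehensive search strategy.
      per_database = md["records_before_deduplication"] / md["databases_searched"]
      if md["databases_searched"] < 5 and per_database > 3_000:
          warnings.append(
              f"{md['records_before_deduplication']} records from only "
              f"{md['databases_searched']} databases: the search strategy "
              "may be inefficient or non-comprehensive."
          )

      # Heuristic 2: a very low inclusion rate at title-level screening
      # can likewise point to an imprecise search.
      inclusion_rate = (md["included_at_title_screening"]
                        / md["records_after_deduplication"])
      if inclusion_rate < 0.01:
          warnings.append(
              f"Title-screening inclusion rate of {inclusion_rate:.2%} is "
              "very low: the search may lack precision."
          )
      return warnings

  for warning in flag_potential_issues(review_metadata):
      print("WARNING:", warning)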

Aims and objectives

Here we introduce the ROSES pro forma and flow diagram, and summarise the process through which these were drafted and tested. The target audience for this work is: (i) current and future authors of systematic evidence syntheses (reviews); (ii) journals publishing reviews (including their editors and peer-reviewers); (iii) readers of reviews and the wider research community.

The overall aim of the ROSES initiative is to increase and maintain high standards in the conduct of systematic reviews and maps through increased transparency, and to facilitate the quality assurance of systematic reviews and maps. The key attributes of such a pro forma are that it should: (i) ensure all necessary content required by the current and future updates of CEE Guidelines (CEE [6]) is present and described in detail; (ii) prevent time- and resource-consuming bounces of manuscripts before peer-review; (iii) facilitate rapid identification of key conduct-related review information by journal editors, peer-reviewers and readers; and, (iv) raise and maintain high standards in the conduct of systematic reviews and maps through increased transparency.

Methods

Two authors (NRH and BM) critically reviewed the current versions of PRISMA relating to review protocols [30] and final review reports [24]. We identified a series of shortfalls in relation to the field of conservation and environmental science (see Table 1) and proposed solutions to these problems. These solutions involved the addition of information that we felt was lacking, the removal of out-dated or irrelevant information, and the inclusion of standard text on requirements and recommendations from the Collaboration for Environmental Evidence guidance [6] and the submission guidelines for authors from the CEE journal Environmental Evidence [40]. The suggested form was reviewed by all four authors and revised according to the collated comments. The protocol and review pro formas were then circulated to an external group of six systematic review experts from the field of environmental evidence synthesis. Feedback was collected via a semi-structured questionnaire or a 30-min interview, depending on expert availability. The pro formas were then revised according to these comments.

Key differences between ROSES and PRISMA

Our proposed pro formas for systematic review and map protocols and reports can be found in Additional files 1 and 2, respectively, along with the template flow diagram in Additional file 3. We have also provided an example of collated meta-data from the protocols of a recent systematic review and a systematic map (Additional file 4). We have called this pro forma ROSES (RepOrting standards for Systematic Evidence Syntheses). We will now summarise four key features of ROSES (fully outlined in Table 2).
Table 2

Key differences between ROSES and PRISMA in relation to the problems identified with PRISMA (see Table 1)

Difference 1: Tailored to environmental systematic reviews AND systematic maps (solution to Problems 1, 6, 7 and 8)

Difference 2: Higher standards of reporting (more detail requested) in the checklist, summary and flow diagram (solution to Problems 2, 5 and 10)

Difference 3: Higher standards of conduct, and clearer indication when standards are not fully met (solution to Problem 3)

Difference 4: Reduced emphasis on quantitative synthesis (e.g. meta-analysis), which is only reliable when applied to appropriate data in a sensible way (solution to Problem 4)

Difference 5: Accommodates other types of synthesis (narrative and qualitative synthesis) (solution to Problems 4 and 9)

Difference 6: Consistent and appropriate terminology (e.g. resolving the confusion of ‘qualitative synthesis’ and ‘narrative synthesis’ in the PRISMA flow diagram) (solution to Problems 8 and 9)

Difference 7: Corrects PRISMA’s focus on bias rather than internal and external validity (solution to Problem 6)

Difference 8: Provides baseline methodological guidance and suggestions as well as acting as a reporting standard (solution to Problem 11)

Difference 9: Inclusion of meta-data (solution to Problem 12)

Firstly, ROSES has been adapted specifically for systematic reviews and maps in the field of conservation and environmental management. We have drafted ROSES as experienced systematic review and map authors, as authors of evidence synthesis methodological guidance, as quantitative and qualitative conservation and environmental researchers, and as editors of journals publishing systematic reviews. As a result, we feel that ROSES better reflects the nuances and heterogeneity across the situations in which we work as reviewers than does PRISMA. In particular, we have ensured that ROSES is specifically adapted for a variety of synthesis methods common to the field of environmental research, such that narrative and qualitative syntheses (i.e. synthesis of qualitative data) also benefit from the form.

Secondly, we have significantly increased the level of reporting detail by adding additional points and providing brief methodological guidance, making it clearer to authors and readers when a systematic review does not include a ‘gold standard’ step recommended by a review coordinating body such as CEE. One such step is bibliographic checking: the process by which the bibliographies of relevant reviews are screened for potentially relevant studies that might have been missed by even the most comprehensive of searches. Additionally, we see the need to promote the conduct of high quality systematic reviews and maps (a recognised problem in the field [41, 42]). It has been noted that PRISMA has been inappropriately referred to as ‘methodological guidance’ in previous systematic reviews [34], despite its brevity and limited detail: most importantly, it was designed for assessing standards of reporting, NOT review conduct. We have therefore increased the functionality of ROSES by including methodological guidance notes to help authors conduct a high quality review along with high quality reporting. However, ROSES should only be used as methodological guidance when considered together with the CEE guidelines.

Thirdly, we have departed from the format of PRISMA by not only providing reporting guidance in a checklist form, but also including requirements to record summary information (i.e. meta-data) that describes the key steps in the conduct of reviews. This increased resolution facilitates a rapid understanding of the robustness of the review methods and allows readers and peer-reviewers to identify potential issues with the review that require further investigation.

Fourthly, we have extended the methodological scope to follow trends in methodological development [43] and included reporting and methodological guidance for systematic maps.

Distinction between checklist and meta-data

In practice, we envisage that ROSES would be completed by review authors upon submission to an adopting journal (for example, the CEE journal Environmental Evidence), and that this would be compulsory. The checklist would demonstrate to peer-reviewers and editorial staff that the protocol or review report includes all the necessary information. This step would save substantial time and resources for the editorial manager, whose role is to ensure that manuscripts meet basic standards before being sent out for peer-review. It could also significantly reduce the time taken for manuscript consideration by the journal, of great benefit to reviewers and their stakeholders. Meta-data would be separated from the checklist upon submission, forming a summary page and a flow diagram (see Additional files 3, 4). These two files could be appended to the submission and sent to peer-reviewers to facilitate peer-review and improve the quality of methodological feedback. The ROSES checklist, flow diagram and summary page could then be published alongside the final protocol and review to give an overview of their rigour.
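
To make concrete how the separated meta-data could populate a flow diagram and be sanity-checked during editorial handling, here is a minimal, hypothetical Python sketch. The stage names are illustrative simplifications and do not reproduce the actual ROSES flow diagram template in Additional file 3.

  # Hypothetical sketch: derive flow-diagram counts from submitted
  # meta-data and check that they are internally consistent.
  # Stage names are illustrative, not the actual ROSES template.

  from dataclasses import dataclass

  @dataclass
  class FlowCounts:
      records_found: int
      duplicates_removed: int
      excluded_title_abstract: int
      excluded_full_text: int
      excluded_critical_appraisal: int  # a stage the PRISMA checklist omits

      def after_deduplication(self) -> int:
          return self.records_found - self.duplicates_removed

      def included_in_synthesis(self) -> int:
          return (self.after_deduplication()
                  - self.excluded_title_abstract
                  - self.excluded_full_text
                  - self.excluded_critical_appraisal)

      def check(self) -> None:
          # Counts must remain non-negative at every stage; a negative value
          # signals inconsistent reporting to bounce back to the authors.
          assert self.after_deduplication() >= 0, "more duplicates than records"
          assert self.included_in_synthesis() >= 0, "exclusions exceed records"

  counts = FlowCounts(
      records_found=15_000,
      duplicates_removed=3_000,
      excluded_title_abstract=11_200,
      excluded_full_text=650,
      excluded_critical_appraisal=50,
  )
  counts.check()
  print("Studies included in synthesis:", counts.included_in_synthesis())  # 100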

Digitisation of ROSES

The ROSES pro forma is intended to be completed and submitted along with final protocols and review reports. It will be made fully digital and interactive on a web-based platform that will allow meta-data to be automatically extracted and collated into a summary (similar to the example of collated meta-data provided in Additional file 4) and the flow diagram (see Additional file 3). The ROSES pro forma and flow diagram are available via a dedicated online platform that also contains background information and support, along with examples of use.

Benefits of ROSES

We see several benefits of ROSES as the first guidance for transparent reporting of systematic reviews and maps in the field of environmental management and conservation.

ROSES is designed to accommodate the diversity of methods applied to a wide variety of review subjects. The tool therefore necessarily reflects some of the heterogeneity and interdisciplinarity of topics within the conservation and environmental management field. We also see no restrictions on the application or adaptation of ROSES in other fields with a similar level of complexity of topics and methods.

ROSES focuses on the earlier and middle stages of the review process, i.e. searching, screening, data extraction and critical appraisal, whilst there is limited detail regarding synthesis. This is a necessary aspect of a form that aims to be applicable to a wide variety of synthesis methods. We believe this flexibility is a key strength of ROSES. Synthesis is a highly complex and context specific process in any systematic review, and developing a universally valid reporting standard for all possible forms of synthesis would be unwieldy and impractical. Rather than focus on what has perhaps been the most common form of synthesis across CEE reviews to date (i.e. meta-analysis), we have learnt from our experiences as systematic reviewers and anticipate a growing interest in alternative forms of synthesis, including narrative-only synthesis, framework synthesis, and mixed methods approaches. As such, we hope to produce tailored ‘add on’ reporting standards for key methods as we are able to do so, ensuring that each is produced with care and sufficient expertise to stand the test of time.

In comparison to existing reporting guidance (e.g. PRISMA), ROSES combines reporting with methodological advice and thus highlights ‘gold standard’ methods to support the production of higher quality protocols and reviews. Moreover, ROSES provides detailed and precise instructions with examples for all stages of the review process, including planning, conduct and reporting (e.g. “Detail the planned search strategy to be used, including: database names accessed, institutional subscriptions (or date ranges subscribed for each database), search options (e.g. ‘topic words’ or ‘full text’ search facility).”). This level of detail should leave no room for substandard reporting.

In addition to completing a simple checklist, authors are asked to extract key conduct-related information and submit that information along with their reports. This 1-page overview of the whole review process facilitates rapid editorial and peer-review decisions, allowing more transparent and constructive feedback to be provided by directly relating comments to specific required steps and necessary information. In addition, the structure provided within the ROSES collated meta-data could help peer-reviewers to get a rapid overview of the methods necessary for a CEE review. As a result of this support, we believe ROSES has the power to decrease the time between submission and final publication.

However, ROSES is considerably longer than the current PRISMA forms and will, therefore, undoubtedly take more time for authors to complete. Nevertheless, we believe that this time investment (a matter of several hours) would improve manuscripts, reduce the time required for manuscript consideration, and prevent unnecessary bouncing of manuscripts back to authors where necessary information is lacking. We strongly recommend that authors begin to complete the ROSES pro forma as early as possible: this will help to structure the planning and conduct of the review and facilitate submission.

Conclusions

ROSES provides the conservation and environmental management research synthesis community with a detailed set of reporting standards that have been tailored to the field. We have carefully considered systematic mapping, narrative and qualitative synthesis methods when producing ROSES, and we see these developments as being equally important for other fields where reporting standards in systematic reviews are needed. By increasing the resolution of checklist points and providing rich instructional information we have aimed to demonstrate the necessary level of rigour in a ‘gold standard’ review, and assist reviewers in attaining those standards. By splitting the pro forma into a checklist and a collation of meta-data, we hope to facilitate review and appraisal of the methods that have been proposed or used. Finally, extraction and storage of meta-data would allow the research community to examine methods in systematic reviews and maps across the field, helping to develop new methodologies and propose novel best practices for high quality syntheses.

As with PRISMA in the health sector, reporting standards will need continual refinement as methods and terminology evolve (see also an online database of reporting guidelines under development [44]). Additionally, just as with PRISMA, we see a need to develop extensions of the reporting guidelines into specific areas, such as ROSES extensions for synthesis methods (quantitative, qualitative and mixed-methods) or for abstracts. We believe that ROSES will benefit not only the CEE community, but also those who wish to conduct a review in a systematic way but do not have the resources for a full systematic review or map [45].

Notes

Declarations

Authors’ contributions

NH and BM conceived of the idea, developed the pro forma and drafted the manuscript. PW and AP edited the pro forma and manuscript. All authors read and approved the final manuscript.

Acknowledgements

We thank all road testers who volunteered their time to review ROSES and provide constructive feedback: Magnus Land, Gillian Petrokofsky, Barbara Livoreil, Ruth Garside, Christian Kohl and Jacqueline Eales. We would also like to thank two anonymous reviewers for their insightful comments.

Competing interests

The authors declare that they have no competing interests.

Availability of data and materials

Not applicable.

Consent for publication

Not applicable.

Ethics approval and consent to participate

Not applicable.

Funding

NH and BM are funded by Mistra EviEM.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Authors’ Affiliations

(1) Mistra EviEM, Stockholm Environment Institute, Stockholm, Sweden
(2) Lancaster Environment Centre, Lancaster University, Lancaster, UK
(3) Centre for Evidence-Based Conservation, School of Environment, Natural Resources and Geography, Bangor University, Bangor, UK

References

  1. Roberts PD, Stewart GB, Pullin AS. Are review articles a reliable source of evidence to support conservation and environmental management? A comparison with medicine. Biol Conserv. 2006;132:409–23.
  2. Vandenberg LN, Ågerstrand M, Beronius A, Beausoleil C, Bergman Å, Bero LA, et al. A proposed framework for the systematic review and integrated assessment (SYRINA) of endocrine disrupting chemicals. Environ Health. 2016;15:74.
  3. Whaley P, Halsall C, Ågerstrand M, Aiassa E, Benford D, Bilotta G, et al. Implementing systematic review techniques in chemical risk assessment: challenges, opportunities and recommendations. Environ Int. 2016;92–93:556–64.
  4. Woodruff TJ, Sutton P. The navigation guide systematic review methodology: a rigorous and transparent method for translating environmental health science into better health outcomes. Environ Health Perspect. 2014;122:1007–14.
  5. Haddaway NR, Pullin AS. The policy role of systematic reviews: past, present and future. Springer Sci Rev. 2014;14:179–83.
  6. CEE (Collaboration for Environmental Evidence). Guidelines for Systematic Review and Evidence Synthesis in Environmental Management. Version 4.2. 2013. http://environmentalevidence.org/wp-content/uploads/2014/06/Review-guidelines-version-4.2-final.pdf.
  7. Pullin AS, Stewart GB. Guidelines for systematic review in conservation and environmental management. Conserv Biol. 2006;20:1647–56.
  8. Sutherland WJ, Pullin AS, Dolman PM, Knight TM. The need for evidence-based conservation. Trends Ecol Evol. 2004;19:305–8.
  9. Pullin AS, Knight TM. Doing more good than harm—building an evidence-base for conservation and environmental management. Biol Conserv. 2009;142:931–4.
  10. Pullin AS, Knight TM. Effectiveness in conservation practice: pointers from medicine and public health. Conserv Biol. 2001;15:50–4.
  11. Stewart GB, Coles CF, Pullin AS. Applying evidence-based practice in conservation management: lessons from the first systematic review and dissemination projects. Biol Conserv. 2005;126:270–8.
  12. Fazey I, Salisbury JG, Lindenmayer D, Maindonald J, Douglas RM. Can methods applied in medicine be used to summarize and disseminate conservation research? Environ Conserv. 2004;31:190–8.
  13. Cochrane. About us. 2017. http://www.cochrane.org/about-us. Accessed 25 July 2017.
  14. The Campbell Collaboration. Vision, mission and principles—the Campbell Collaboration. 2017. https://www.campbellcollaboration.org/about-campbell/vision-mission-and-principle.html. Accessed 25 July 2017.
  15. Haddaway NR, Land M, Macura B. “A little learning is a dangerous thing”: a call for better understanding of the term “systematic review”. Environ Int. 2016;99:356–60.
  16. Haddaway NR. Response to “Collating science-based evidence to inform public opinion on the environmental effects of marine drilling platforms in the Mediterranean Sea”. J Environ Manag. 2017;203:612–4.
  17. Pussegoda K, Turner L, Garritty C, Mayhew A, Skidmore B, Stevens A, et al. Systematic review adherence to methodological or reporting quality. Syst Rev. 2017;6:131.
  18. O’Leary BC, Kvist K, Bayliss HR, Derroire G, Healey JR, Hughes K, et al. The reliability of evidence reviews in environmental science and conservation. Environ Sci Policy. 2016;64:75–82.
  19. The Steering Group of the Campbell Collaboration. Campbell Collaboration systematic reviews: policies and guidelines. Campbell Syst Rev. 2015;46:1.
  20. Petticrew M, Roberts H. Systematic reviews in the social sciences: a practical guide. London: Blackwell Publishing Ltd; 2006.
  21. Kitchenham B. Procedures for performing systematic reviews. Keele: Keele University; 2004. p. 33.
  22. Morgan RL, Thayer KA, Bero L, Bruce N, Falck-Ytter Y, Ghersi D, et al. GRADE: assessing the quality of evidence in environmental and occupational health. Environ Int. 2016;92–93:611–6.
  23. Moher D, Cook DJ, Eastwood S, Olkin I, Rennie D, Stroup DF. Improving the quality of reports of meta-analyses of randomised controlled trials: the QUOROM statement. Quality of Reporting of Meta-analyses. Lancet. 1999;354:1896–900.
  24. Moher D, Liberati A, Tetzlaff J, Altman D, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med. 2009;151:264.
  25. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Ioannidis JP, Clarke M, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. Ann Intern Med. 2009;151:W65–94.
  26. PRISMA. Extensions in development. 2015. http://www.prisma-statement.org/Extensions/InDevelopment.aspx. Accessed 26 July 2017.
  27. Welch V, Petticrew M, Tugwell P, Moher D, O’Neill J, Waters E, et al. PRISMA-Equity 2012 extension: reporting guidelines for systematic reviews with a focus on health equity. PLoS Med. 2012;9:e1001333.
  28. Beller EM, Glasziou PP, Altman DG, Hopewell S, Bastian H, Chalmers I, et al. PRISMA for abstracts: reporting systematic reviews in journal and conference abstracts. PLoS Med. 2013;10:e1001419.
  29. Shamseer L, Moher D, Clarke M, Ghersi D, Liberati A, Petticrew M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ. 2015;349:g7647.
  30. Moher D, Shamseer L, Clarke M, Ghersi D, Liberati A, Petticrew M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst Rev. 2015;4:1.
  31. PRISMA. Endorse PRISMA. 2015. http://www.prisma-statement.org/Endorsement/EndorsePRISMA.aspx. Accessed 25 July 2017.
  32. PRISMA. PRISMA endorsers. 2015. http://www.prisma-statement.org/Endorsement/PRISMAEndorsers.aspx#c. Accessed 25 July 2017.
  33. PRISMA. Available translations. 2015. http://www.prisma-statement.org/Translations/Translations.aspx. Accessed 25 July 2017.
  34. O’Leary BC, Bayliss HR, Haddaway NR. Beyond PRISMA: systematic reviews to inform marine science and policy. Mar Policy. 2015;62:261–3.
  35. O’Mara-Eves A, Thomas J. Ongoing developments in meta-analytic and quantitative synthesis methods: broadening the types of research questions that can be addressed. Rev Educ. 2016;4:5–27.
  36. Stegenga J. Is meta-analysis the platinum standard of evidence? Stud Hist Philos Biol Biomed Sci. 2011;42:497–507. https://doi.org/10.1016/j.shpsc.2011.07.003.
  37. Livoreil B, Glanville J, Haddaway NR, Bayliss H, Bethel A, Lachapelle FF, et al. Systematic searching for environmental evidence using multiple tools and sources. Environ Evid. 2017;6:23.
  38. Bayliss HR, Beyer FR. Information retrieval for ecological syntheses. Res Synth Methods. 2015;6:136–48.
  39. Pullin AS, Bangpan M, Dalrymple S, Dickson K, Haddaway NR, Healey JR, et al. Human well-being impacts of terrestrial protected areas. Environ Evid. 2013;2:19.
  40. Environmental Evidence. Preparing your manuscript. 2017. https://environmentalevidencejournal.biomedcentral.com/submission-guidelines/preparing-your-manuscript. Accessed 25 July 2017.
  41. Haddaway NR, Watson MJ. On the benefits of systematic reviews for wildlife parasitology. Int J Parasitol Parasites Wildl. 2016;5:184–91.
  42. Haddaway NR, Macura B. Species’ traits influenced their response to recent climate change. Nat Clim Change. 2017;7:205.
  43. James KL, Randall NP, Haddaway NR. A methodology for systematic mapping in environmental sciences. Environ Evid. 2016;5:7.
  44. The EQUATOR Network. Reporting guidelines under development. 2017. http://www.equator-network.org/library/reporting-guidelines-under-development/#52. Accessed 25 July 2017.
  45. Haddaway NR, Woodcock P, Macura B, Collins A. Making literature reviews more reliable through application of lessons from systematic reviews. Conserv Biol. 2015;29:1596–605.

Copyright

© The Author(s) 2018
