Open Access

Much at stake: the importance of training and capacity building for stakeholder engagement in evidence synthesis

  • Jacqualyn Eales1,
  • Neal R. Haddaway2 (corresponding author) and
  • J. Angus Webb3
Contributed equally
Environmental Evidence, the official journal of the Collaboration for Environmental Evidence. 2017;6:22

https://doi.org/10.1186/s13750-017-0101-3

Received: 30 March 2017

Accepted: 3 August 2017

Published: 4 September 2017

Abstract

Systematic reviews and maps are complex methods for synthesising evidence that involve specialist and resource-intensive activities. Systematic reviewers face challenges when attempting to clearly and precisely communicate their methods to end-users and other stakeholder groups. We propose that these challenges are likely to be a key causal factor in the generally low uptake of systematic reviews and maps by policy-makers and practitioners in environmental science and management. We argue that training and capacity building are inherently important components of systematic reviews and maps for all stakeholders: the reviewers themselves, the end-users of specific reviews, and the broader research and decision-making community. Training can help to build capacity for undertaking reviews and maps, and can help to explain complex methods to stakeholders. It is important for those wishing to undertake stakeholder engagement activities as part of a review, and it allows researchers and decision-makers to critique systematic reviews and maps based on their methods. Finally, training may be necessary to allow reviewers to prepare visualisations and communication media for presenting the findings of systematic reviews and maps. We conclude that a broad approach, in which every stakeholder engagement opportunity is viewed as a potential occasion for training and capacity building, is appropriate both within a specific review and across reviews as a community of practice in evidence synthesis. We call on systematic reviewers to improve networks across disciplines in relation to training, sharing experiences and course content, and ensuring a consistent approach to capacity building in the conduct and use of evidence syntheses.

Keywords

Education, Communication, Knowledge exchange, Expertise, Evidence synthesis skills, Review

Background

Systematic review methods were developed within the field of medicine in the 1980s and 1990s [1] in an attempt to improve the evidence base for clinical decision-making. The Cochrane Collaboration was established in 1992 to oversee the production of guidance in systematic review methods and the peer-review and endorsement of systematic review protocols and reports [1]. The methods were subsequently adapted for the field of conservation and environmental management [2], and the Collaboration for Environmental Evidence (CEE) was established in 2008 to coordinate standards for environmental systematic reviews. CEE has endorsed a number of training courses since its establishment (see recent examples in Table 1).
Table 1

Systematic review and map training endorsed by The Collaboration for Environmental Evidence undertaken in 2017 to date

| Course title | Course type^a | Location | Date | Provider |
|---|---|---|---|---|
| Systematic review and map methodology | Commissioned | Lund University, Lund, Sweden | 16–17 February 2017 | Mistra EviEM |
| Introduction to systematic reviews and maps | Commissioned | Pontifical Catholic University of Chile, Santiago, Chile | 3 April 2017 | Independent trainers |
| Systematic review and map methodology | Commissioned | Pontifical Catholic University of Chile, Santiago, Chile | 4–5 April 2017 | Independent trainers |
| Systematic review and map methodology | Commissioned | Global Evidence Synthesis Initiative, American University of Beirut, Beirut, Lebanon | 1–2 June 2017 | Mistra EviEM |
| Systematic review and map methodology | Closed | Stockholm Environment Institute, Nairobi, Kenya | 12–13 June 2017 | Mistra EviEM |
| Systematic review and map methodology | Closed | Stockholm Environment Institute, Bangkok, Thailand | 12–13 June 2017 | Mistra EviEM |

^a Open courses are those arranged by the providers with participation open to the public. Closed courses are those arranged by the providers with participation by invitation only. Commissioned courses are those arranged and funded by an external organisation.

In order to fully understand or conduct a systematic review or systematic map, review authors, researchers, end-users and decision-makers (hereafter included within the term stakeholders; [3]) require detailed and comprehensive knowledge across a suite of research and communication skills. As this skillset is rare, training is a necessary part of the effort to increase adoption of systematic synthesis methods in environmental science and management. We believe that this training gap is likely a key factor in the generally low uptake of systematic reviews and maps by policy-makers and practitioners. Indeed, ideas around the use of training have, until now, been rather traditional, considering training as useful purely for capacity building among those wishing to conduct a systematic review or map. Such a limited view of the role of training in increasing both the understanding and use of systematic review methods and results ignores the need to continually raise awareness about these methods across all stakeholder groups. To date, innovative and thoughtfully designed training has not been seen as a priority by the evidence synthesis community, and we propose that, although not traditionally thought of as part of stakeholder engagement, training and capacity building are inherently important components of systematic evidence synthesis.

Currently, guidance from CEE [4] and from the Campbell [5] and Cochrane [6] collaborations does not focus on the importance of training for effective engagement among the different stakeholder groups. This is because such guidance relates to the conduct of single systematic reviews or maps. Whilst training activities may well be linked to a specific review project, a strategic approach to training and capacity building is key to raising awareness and interest, and increasing the uptake of systematic reviews and maps as methods and as a reliable form of evidence in decision-making.

Fundamentally, training and capacity building increase direct and indirect communication among different stakeholder groups engaged with evidence syntheses. The two-way information flow that comes from effective communication can ensure that: an evidence synthesis concentrates on the issues of greatest importance; outputs can be understood by a wider audience; and benefits of evidence-based approaches are clear. These benefits include improved transparency, accountability, and accuracy, and reduced risk in decision-making. These points are all essential for helping to bridge the ‘knowing-doing gap’ that currently prevents the uptake of much applied research in environmental science and conservation [7].

Systematic review and map training challenges

Systematic review and map methods training inherently involves challenges, some of which are particularly apparent when the training is aimed at non-specialists or a non-research focused audience [3]. These challenges include:
  1. Explaining complex concepts in lay terms.

  2. Deciding between overview and methods training.

  3. Explaining relatively abstract concepts (e.g. critical appraisal and meta-analysis) without information overload.

  4. Determining when systematic review/map methods are appropriate (resources, timelines, staffing, desired output).

  5. Ensuring that participants appreciate that while robust evidence syntheses require greater resources than informal and ad hoc reviews, the payoff is in the reliability of results.

  6. The need for ongoing training as methods develop and improve.

  7. Making training cost-efficient.

  8. Tailoring training media to the situation (e.g. workshops or written media).

  9. Providing continued support for people who are conducting reviews.

  10. Ensuring an appreciation of the importance of course accreditation by a coordinating body (e.g. CEE).

In the following sections, we outline several types of training courses and activities, and how they can address these challenges.

Training providers

Courses accredited by the Collaboration for Environmental Evidence [8] have been written by trainers with experience in stakeholder engagement in evidence syntheses in the environmental sector. They are designed for a non-research focused audience and are updated with new methodological developments as they arise. The Campbell Collaboration provides and approves (primarily methods-focused) courses by affiliated trainers and maintains lists of both Campbell-approved and non-approved courses. These include training offered by the EPPI-Centre at University College London, ranging from 1-day workshops to an MSc course in systematic reviews for public policy and practice [9]. Since systematic reviews are well developed in the field of medicine, a wide range of training courses has long been advertised by the Cochrane Collaboration. These include specialised courses, for example on software to support meta-analysis [10]. Most courses are aimed at a research audience, and a stakeholder engagement component is not strongly evident; however, a 1-day course focusing on engaging stakeholders and audiences in research was offered by Cochrane Australia in June 2017 [11]. The Cochrane Collaboration offers training via Cochrane groups such as Cochrane South Asia [12], and also advertises training courses provided by affiliated or independent organisations, such as the York Health Economics Consortium, and academic institutions, such as Columbia University. Despite the wealth and breadth of experience in capacity building and training in all these fields, there has so far been no concerted effort to connect and learn from the expertise in systematic review training across disciplines.

Opportunities to improve stakeholder engagement through training

We identify five broad categories of training across evidence synthesis processes, from question formulation to communication of findings, where training is important for effective two-way communication among the full range of different stakeholder groups (Table 2). We discuss these below.
Table 2

Stakeholder training stages, beneficiaries and descriptions of the purposes of different training opportunities, along with suggestions of suitable training media

| Training summary and purpose | Stakeholders engaged in training | Stage of the evidence synthesis process | Suggested training media |
|---|---|---|---|
| 1. End-user and public engagement: providing skills relating to stakeholder analysis, conflict management, and participatory methods | Reviewers | Scoping and question formulation; communicating outcomes | In person or online (training courses) |
| 2. Systematic review and map methods: in-depth methodological training regarding each step of the systematic review and map process (question formulation, scoping, searching, screening, data extraction and coding, critical appraisal, synthesis, report writing) | Reviewers | Planning (scoping, protocol development) and conduct | In person or online (training courses) |
| 2. Systematic review and map methods: anything from a basic overview of systematic review and map methods to advanced details on methodology, provided as a valuable transferable skill | Students | Any (not linked to a specific review) | In person or online (presentations, training courses) |
| 3. Preparation of visualisations and communication media: production of readily digestible data visualisations | Reviewers | Report preparation | Written (technical summaries), in person or online (workshops, training courses) |
| 3. Preparation of visualisations and communication media: training in development of communication materials tailored for specific stakeholder groups, and in communication skills | Reviewers | Communication | Written (technical summaries), in person or online (workshops, training courses) |
| 4. Value of systematic review/map methods: advocacy of systematic review and map methods as a funded activity, source of evidence in decision-making, or research endeavour; explanation of limitations of traditional reviews relative to systematic review/map methods | All stakeholders (particularly review funders, prospective reviewers, policy stakeholders and practitioners) | Any (not linked to a specific review) | Written (flyers, factsheets, non-technical summaries), online (websites, videos), in person (presentations, workshops, short courses) |
| 4. Value of systematic review/map methods: giving an overview of the need for and methods involved in systematic reviews and maps | Subject experts, researchers, policy specialists, practitioners, review advisory groups | Question formulation | Written (flyers, factsheets, non-technical summaries), in person (workshops), online (websites, videos) |
| 4. Value of systematic review/map methods: giving an overview of the need for and detailed methods involved in systematic reviews and maps | Peer-reviewers | Peer-review (protocol and final report) | Written (technical summaries) |
| 4. Value of systematic review/map methods: giving an overview of the methods involved in systematic reviews and maps and how to interpret review findings | End-users (policy stakeholders, practitioners) | Communicating outcomes | Written (flyers, factsheets, non-technical summaries), online (websites, videos), in person (presentations, workshops) |
| 5. Technical critique of review methods: critical research commentaries in the academic literature can raise awareness in the research community regarding misunderstandings about systematic methods | Researcher community | Any (not linked to a specific review) | Written (research articles) |

Training reviewers to maximise benefits of stakeholder engagement

End-user and public engagement (point 1 in Table 2)

Reviewers, particularly those new to the methods, often lack sufficient skills to engage effectively with stakeholders. Researchers new to systematic review methods may not appreciate the nuances involved with stakeholder engagement for evidence syntheses (Challenge 5). These include explaining review methods in sufficient but not unnecessary detail (Challenge 1), predicting and managing potential conflicts between different stakeholders, and maintaining interest and enthusiasm throughout the process (Challenge 3) [3].

In addition, engaging with stakeholders is a complex process [3, 13], requiring careful planning to ensure balance and mitigation of any possible bias or undue influence from stakeholders on the systematic review or map [3]. Reviewers may need to undertake training in methods that can help manage stakeholder engagement. In particular, where conflict between different stakeholder groups arises, those facilitating engagement activities may find their role very challenging. Here, training in conflict management may prove useful [3, 13]. However, such training along with carefully planned stakeholder engagement can add significantly to costs, and reviewers must take care to remain within budget (Challenge 7). Due to the ‘hands-on’ nature of stakeholder engagement activities, this type of training is most likely to be effective in person via workshops and training courses (Challenge 8).

Systematic review and map methods (point 2 in Table 2)

Undertaking a systematic review or map is a time-consuming and challenging task that requires a range of specialist skills [4–6], including searching for evidence [14] and meta-analysis [15]. A systematic review or map should not be undertaken without specialist methods training if review authors wish to produce a reliable synthesis devoid of major limitations or bias [16, 17]. While the major systematic review coordinating bodies have been slow to recognise the benefits of training aimed specifically at stakeholder engagement, as described above, training in the technical aspects of systematic methodology is relatively common (see Box 1 for an example of a recent training course). These are often in the form of capacity-building workshops and training courses [9, 11, 12] that aim to provide a primer for those wishing to conduct an evidence synthesis. Whilst additional support for systematic reviewers is likely to be necessary (Challenge 9), these workshops aim to cover the methodological steps of a review or map in sufficient detail to allow participants to plan and conduct a review for themselves. An additional challenge that networks such as CEE aim to solve through active training working groups is the need for continued training as methods develop over time (Challenge 6). Methodology training is most likely to be effective in person via workshops and courses. Mentoring is also an option, which addresses the challenge of providing continued support throughout the review process.
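To illustrate why steps such as meta-analysis call for specialist training, the sketch below pools study effect sizes using inverse-variance fixed-effect weighting, one of the simplest meta-analytic models; it is a minimal teaching example, and the effect sizes and standard errors are hypothetical rather than drawn from any review cited here.

```python
import math

def fixed_effect_meta(effects, ses):
    """Inverse-variance fixed-effect pooling of study effect sizes.

    effects: per-study effect estimates (e.g. log response ratios)
    ses:     their standard errors
    Returns (pooled_effect, pooled_se).
    """
    weights = [1.0 / se ** 2 for se in ses]        # inverse-variance weights
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical effect sizes from three studies
effect, se = fixed_effect_meta([0.4, 0.6, 0.2], [0.1, 0.2, 0.15])
ci = (effect - 1.96 * se, effect + 1.96 * se)      # approximate 95% CI
```

Even this toy example hides decisions (choice of effect metric, fixed vs. random effects, handling of heterogeneity) that trainees must understand before running a real synthesis.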

Box 1. Summary of a CEE-endorsed 2-day methodology workshop in systematic review and map methods at Lund University in February 2017

Length of event: 2 days (09:00–17:00)

Description: This workshop aimed to introduce systematic reviewing and systematic mapping as methods for evidence synthesis. Participants were provided with an in-depth understanding of the activities that are necessary to maximise comprehensiveness, transparency, objectivity and reliability throughout the review process. This step-by-step course took time to explain the theory behind each part of the review process, and provided guidance, tips and advice for those wanting to undertake a full systematic review or map

Format: The course took the form of a series of interactive presentations (c. 7 h) and practical exercises (c. 7 h), including examples from recent relevant systematic review and map projects. Participants were encouraged to ask questions, and time was set aside for a question and answer session. Participants were also encouraged to use their own research in practical exercises. The course featured practical sessions run using the review management platform EPPI Reviewer [14]

Audience: PhD students and researchers in the Centre for Environmental and Climate Research, Lund University

Participants/trainers: 14/2

Type of course: Commissioned and funded (i.e. directly requested) by a senior researcher at Lund University

Certification: The course was endorsed by CEE, involving submission of presentations, a detailed programme and learning objectives for peer-review by experts in systematic review training. Certificates of completion were provided to participants

Trainers: The course was provided by two experienced systematic reviewers working at a CEE Centre in Stockholm (Mistra EviEM). One of the trainers has extensive experience of providing training in systematic review and map methods

Preparation of visualisations and communication media (point 3 in Table 2)

Systematic reviews and maps often identify large volumes of evidence and must attempt to summarise the collated evidence (in systematic maps [18]) or synthesise the findings of individual studies as a whole (in systematic reviews [4]). In order to make the results readily understandable, review authors often produce summaries that describe the evidence visually (e.g. forest plots, evidence atlases and heat maps [17, 19]). Such visualisations can often be challenging to produce and may require knowledge of specialist software. There may thus be a need for training in techniques and software for preparing evidence visualisations. Such training may be effective in written media, but may also lend itself well to pre-recorded videos, online instruction, or in-person workshops.
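As a small illustration of the kind of output a forest plot conveys, the following sketch renders one as plain text using only the Python standard library; real reviews would use dedicated plotting software, and the study names and numbers here are hypothetical.

```python
def ascii_forest_plot(labels, effects, ses, width=41):
    """Render a crude text forest plot: one row per study,
    'x' at the effect estimate, '-' spanning the 95% CI."""
    lows = [e - 1.96 * s for e, s in zip(effects, ses)]
    highs = [e + 1.96 * s for e, s in zip(effects, ses)]
    lo, hi = min(lows), max(highs)
    scale = (width - 1) / (hi - lo)                # map values to columns
    rows = []
    for lab, e, l, h in zip(labels, effects, lows, highs):
        line = [" "] * width
        for i in range(round((l - lo) * scale), round((h - lo) * scale) + 1):
            line[i] = "-"                          # confidence interval
        line[round((e - lo) * scale)] = "x"        # point estimate
        rows.append(f"{lab:>10} |{''.join(line)}|")
    return "\n".join(rows)

print(ascii_forest_plot(["Study A", "Study B", "Study C"],
                        [0.4, 0.6, 0.2], [0.1, 0.2, 0.15]))
```

The wide interval for the imprecise study and the narrow one for the precise study are immediately visible, which is exactly the intuition such visualisations (and training in producing them) aim to deliver.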

For end-users who are unfamiliar with long technical documents and even the visualisations described above, additional approaches to presenting the outcomes of systematic reviews and maps are necessary. We recommend that reviewers summarise their work in a variety of media, including technical summaries [e.g. 20], factsheets or policy briefs [e.g. 21], video briefs [e.g. 22], and infographics [e.g. 23]. Producing these summaries requires skills in science communication and media design, and reviewers may therefore benefit from ‘science translation’ training (point 3 in Table 2). For example, the American Association for the Advancement of Science coordinates such workshops [24]. Stakeholders and other non-research focused end-users are likely to respond best to presentations in an easily understood format, so this type of training will help to ensure that the outputs of evidence syntheses are disseminated widely, understood and used.

Training for stakeholders, education and outreach

Value of systematic reviews/map methods (point 4 in Table 2)

Many stakeholders wish to better understand the purpose and characteristics of systematic reviews and maps, but do not need to be able to conduct a review. In these cases, a basic understanding is likely to be sufficient (Challenge 2). Here, relevant training should provide an understanding of the benefits of systematic methods compared to informal narrative literature reviews, and the importance of the central tenets of comprehensiveness, transparency, repeatability and objectivity [4–6]. There is a general appreciation for the ‘added value’ associated with reviews that label themselves as ‘systematic’, but there is also a misunderstanding over what is required to make a review reliable [17, 25]. This kind of training would be suitable for potential commissioners of syntheses along with end-users (policy stakeholders and practitioners) wishing to integrate review findings into decision-making processes. Similarly, reviewers may wish to target end-users with specific training efforts in order to maximise the likelihood of use of a review’s findings. Box 2 summarises a recent training event provided to policy advisors forming part of the European Commission’s Science Advisory Mechanism. Such training can help to increase awareness of the limitations of traditional reviews and the benefits of systematic review methods.

Box 2. Details of a recent training event given by Mistra EviEM at the European Commission’s Science Advisory Mechanism (SAM) in Brussels in May 2017

Length of event: 2.5 h (09:30–12:00)

Description: This event introduced the work of EviEM to policy stakeholders working within the European Commission. In particular, it introduced systematic reviews and maps as rigorous methods for evidence synthesis, along with ways in which attendees can learn more about completed reviews and suggest topics of interest for EviEM to consider as future reviews

Format: This event took the form of a seminar lasting approximately 1 h, followed by a question and answer session

Audience: Policy-makers and science advisors from the SAM and related organisations

Participants/trainers: 25/2

Type of course: Invitation-only event funded by Mistra EviEM and coordinated by the SAM

Certification: This event did not receive formal endorsement from CEE

Trainers: This event was provided by representatives of EviEM, the CEE Centre based in Sweden. Both presenters have experience of evidence-based environmental management and of conducting and training in systematic reviews and maps

This type of training is also of value to researchers considering whether to undertake a review and to peer-reviewers assessing evidence syntheses submitted to academic journals. Training of stakeholders may encourage some to become actively involved in the production of future evidence syntheses [3, 13], and better enable stakeholders to respond to criticism when advocating integration of evidence synthesis outputs into policy or practice [3].

It may be challenging to provide a basic understanding of systematic methods in terms that do not require extensive background knowledge. Those providing training should carefully consider the tradeoff between simplicity and accuracy, and should also beware of overwhelming stakeholders with too much information at once (see “training challenges” above) [3]. Furthermore, any information provided should be carefully appraised by a communications expert to ensure it is free from complex terms and unnecessary jargon.

Technical critique of review methods (point 5 in Table 2)

Many syntheses call themselves systematic reviews, but fail to meet basic qualifying standards of what is considered to be a systematic review [17] as set out by systematic review coordinating bodies [4–6]. Training in how to critically appraise reviews can enable stakeholders to highlight common problems with non-systematic reviews. Tools for critical appraisal of reviews have been published for such purposes, for example CEESAT [26], which include assessments of limitations and susceptibility to bias, such as a lack of comprehensiveness and the presence of selection bias and vote-counting [16]. At present, stakeholders may not fully appreciate the potentially fatal characteristics of some non-systematic reviews. Having undertaken training in technical critique of review methods, participants can recognise and appreciate reliable reviews, justify the resources needed to obtain a higher level of reliability in reviews that follow systematic principles (Challenge 5), and appreciate the value of endorsing reviews with a coordinating body such as CEE (Challenge 10).

Training students in systematic review methods (point 2 in Table 2)

Training university students (undergraduate and postgraduate) in systematic review or map methods is a vital means of raising awareness and educating future decision-makers and researchers about the benefits of systematic approaches to evidence synthesis (see Box 3). Since students may wish to incorporate systematic review methods in their work, it is important to be pragmatic and recognise that systematic reviews or maps may not be appropriate within the restricted timeframes of many students’ secondary research theses (Challenge 4). Training in universities may make use of workshops, taught and self-led courses and online resources [27], and represents a mechanism by which training can be provided without the need for a direct funding source (Challenge 7).

Box 3. Example of previous training for postgraduate students in evidence synthesis, at Bangor University

Length of event: 6 weeks, 32 h of contact time

Description: This course introduced systematic reviews and maps alongside evidence-based environmental management from a decision-making perspective. The course focused on the major stages of systematic reviews, providing experience of practical aspects of each step in the methods

Format: The course combined fourteen 1–2 h lectures with five 3 h practical exercises. Students were assessed by submitting a systematic review protocol and video policy brief (formative assessment), and by completing a final exam paper (summative assessment)

Audience: Postgraduate students studying environmental science at master’s level at the School of Environment, Natural Resources and Geography at Bangor University between 2009 and 2014

Participants/trainers: 10–15/2 plus guest lecturers

Type of course: Closed university course

Certification: This event did not receive formal endorsement from CEE

Trainers: The course was organised and delivered by staff at the Centre for Evidence-Based Conservation

Conclusion

Systematic review and map methods are complex and nuanced means of synthesising the available evidence to improve decision-making. Because of their complexity, training is often needed at various stages of the planning, conduct and communication of reviews. Effective stakeholder engagement is a critical component for the success of systematic reviews and maps [3, 13], but to date, stakeholder engagement and training activities have largely been undertaken independently by the evidence synthesis community, and we believe this constrained thinking has limited the uptake of systematic reviews. We propose that every occasion where reviewers engage with stakeholders should be viewed as a potential training opportunity. This would provide a range of benefits, including raising awareness, acceptance and understanding of systematic reviews. We identify five main areas where training of reviewers and other stakeholders can not only build capacity for systematic review conduct but also provide a range of other benefits from stakeholder engagement.

Finally, there are ongoing efforts to improve networking between systematic review methodologists across disciplines (e.g. the Evidence Synthesis Technology Methods Group [28]). We call for similar efforts to connect those involved with training and systematic reviews across disciplines to share knowledge and experiences, improving our collective understanding of best practices in capacity building and raising awareness of the methods and their integration into decision-making. An evidence synthesis methods group that spans disciplines, including actors from CEE, The Campbell Collaboration, and The Cochrane Collaboration, is one such opportunity for networking and collaborative exchange. The increasing level of interest in training in systematic review and map methods (see recent examples in Table 1) suggests that we are at a critical time to consolidate and optimise efforts.

Notes

Declarations

Authors’ contributions

NRH and JE drafted the manuscript. All authors edited the draft. All authors read and approved the final manuscript.

Acknowledgements

The authors thank Mistra EviEM for covering publication fees.

Competing interests

The authors declare that they have no competing interests.

Availability of data and materials

Not applicable.

Consent for publication

Not applicable.

Ethics approval and consent to participate

Not applicable.

Funding

NRH is funded by Mistra EviEM.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open AccessThis article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Authors’ Affiliations

(1)
(2)
Mistra EviEM, Stockholm Environment Institute
(3)
Department of Infrastructure Engineering, The University of Melbourne

References

  1. Allen C, Richmond K. The Cochrane Collaboration: international activity within Cochrane review groups in the first decade of the twenty-first century. J Evid Based Med. 2011;4(1):2–7.
  2. Pullin AS, Stewart GB. Guidelines for systematic review in conservation and environmental management. Conserv Biol. 2006;20(6):1647–56.
  3. Haddaway N, Kohl C, da Silva NR, Schiemann J, Spök A, Stewart R, Sweet J, Wilhelm R. A framework for stakeholder engagement during systematic reviews and maps in environmental management. Environ Evid. 2017;6(1):11.
  4. CEE. Guidelines for systematic review and evidence synthesis in environmental management. Version 4.2. The Collaboration for Environmental Evidence; 2013.
  5. The Steering Group of the Campbell Collaboration. Campbell systematic reviews: policies and guidelines. The Campbell Collaboration; 2014.
  6. Higgins JP, Green S. Cochrane handbook for systematic reviews of interventions. New York: Wiley; 2011.
  7. Knight AT, Cowling RM, Rouget M, Balmford A, Lombard AT, Campbell BM. Knowing but not doing: selecting priority conservation areas and the research–implementation gap. Conserv Biol. 2008;22(3):610–7.
  8. CEE. The Collaboration for Environmental Evidence; 2017. http://www.environmentalevidence.org/. Accessed 14 Mar 2017.
  9. EPPI-Centre. Courses and seminars. Social Science Research Unit, UCL Institute of Education; 2016. http://eppi.ioe.ac.uk/cms/Default.aspx?tabid=168. Accessed 10 June 2017.
  10. Cochrane Training. Learn how to conduct, edit, and read systematic reviews; 2010. http://training.cochrane.org/. Accessed 10 June 2017.
  11. Cochrane Australia. Cochrane Australia Learning Week; 2017. http://learningweek.cochrane.org.au/. Accessed 10 June 2017.
  12. Cochrane South Asia. Training. The Cochrane Collaboration; 2017. http://southasia.cochrane.org/training. Accessed 10 June 2017.
  13. Cottrell E, Whitlock E, Kato E, Uhl S, Belinson S, Chang C, Hoomans T, Meltzer D, Noorani H, Robinson K. Defining the benefits of stakeholder engagement in systematic reviews. Report No. 14-EHC006-EF. Rockville (MD); 2014.
  14. Bayliss HR, Beyer FR. Information retrieval for ecological syntheses. Res Synth Methods. 2015;6(2):136–48.
  15. Stewart G. Meta-analysis in applied ecology. Biol Lett. 2010;6(1):78–81.
  16. Haddaway N, Woodcock P, Macura B, Collins A. Making literature reviews more reliable through application of lessons from systematic reviews. Conserv Biol. 2015;29(6):1596–605.
  17. Haddaway NR, Watson MJ. On the benefits of systematic reviews for wildlife parasitology. Int J Parasitol Parasit Wildlife. 2016;5(2):184–91.
  18. James KL, Randall NP, Haddaway NR. A methodology for systematic mapping in environmental sciences. Environ Evid. 2016;5(1):7.
  19. McKinnon MC. Map the evidence. Nature. 2015;528(7581):185.
  20. Bernes C, Carpenter SR, Gårdmark A, Larsson P, Persson L, Skov C, Speed JDM, Donk EV. Effects of biomanipulation on water quality in eutrophic lakes. Stockholm: Mistra EviEM; 2015. Contract No. EviEM Summary.
  21. Mistra EviEM. Removal of nitrogen and phosphorus in freshwater wetlands. Stockholm: Mistra EviEM; 2016.
  22. Hammar J. Wetlands as nutrient traps. Mistra EviEM; 2016. Video, 4:46 min.
  23. Lankow J, Ritchie J, Crooks R. Infographics: the power of visual storytelling. New York: Wiley; 2012.
  24. Center for Public Engagement with Science & Technology. Communicating Science Workshops. American Association for the Advancement of Science (AAAS); 2017. https://www.aaas.org/pes/communicating-science-workshops. Accessed 10 June 2017.
  25. Haddaway NR, Land M, Macura B. A little learning is a dangerous thing: a call for better understanding of the term systematic review. Environ Int. 2016;99:356–60.
  26. Woodcock P, Pullin AS, Kaiser MJ. Evaluating and improving the reliability of evidence syntheses in conservation and environmental science: a methodology. Biol Conserv. 2014;176:54–62.
  27. Ryan S, Scott B, Freeman H, Patel D. The virtual university: the internet and resource-based learning. Routledge; 2013.
  28. Evidence Synthesis Technology Methods Group; 2017. https://www.researchgate.net/project/Evidence-Synthesis-Technology-Methods-Group. Accessed 10 June 2017.

Copyright

© The Author(s) 2017