Simple item record

dc.contributor.author: Christie, A.P.
dc.contributor.author: Amano, T.
dc.contributor.author: Martin, P.A.
dc.contributor.author: Shackelford, G.E.
dc.contributor.author: Simmons, B.I.
dc.contributor.author: Sutherland, W.J.
dc.date.accessioned: 2024-08-19T07:22:51Z
dc.date.available: 2024-08-19T07:22:51Z
dc.date.issued: 2022-05-01
dc.identifier.citation: Journal of Applied Ecology 59(5): 1191-1197 (2022)
dc.identifier.uri: http://hdl.handle.net/10810/69302
dc.description.abstract: In Christie et al. (2019), we used simulations to quantitatively compare the bias of commonly used study designs in ecology and conservation. Based on these simulations, we proposed ‘accuracy weights’ as a potential way to account for study design validity in meta-analytic weighting methods. Pescott and Stewart (2022) raised concerns that these weights may not be generalisable and may still lead to biased meta-estimates. Here we respond to their concerns and demonstrate why developing alternative weighting methods is key to the future of evidence synthesis. We acknowledge that our simple simulation unfairly penalised randomised controlled trial (RCT) designs relative to before-after control-impact (BACI) designs, as we considered that the parallel trends assumption held for BACI designs. We point to an empirical follow-up study in which we more fairly quantify differences in bias between study designs. However, we stand by our main findings that before-after (BA), control-impact (CI) and After designs are quantifiably more biased than BACI and RCT designs. We also emphasise that our ‘accuracy weighting’ method was preliminary and welcome future research to incorporate more dimensions of study quality. We further show that over a decade of advances in quality effects modelling, which Pescott and Stewart (2022) omit, highlights the importance of research such as ours in better understanding how to quantitatively integrate data on study quality directly into meta-analyses. We further argue that the traditional methods advocated by Pescott and Stewart (2022; e.g. manual risk-of-bias assessments and inverse-variance weighting) are subjective, wasteful and potentially biased themselves. They also lack scalability for use in large syntheses that keep up to date with the rapidly growing scientific literature. Synthesis and applications. We suggest, contrary to Pescott and Stewart's narrative, that moving towards alternative weighting methods is key to future-proofing evidence synthesis through greater automation, flexibility and updating to respond to decision-makers' needs, particularly in crisis disciplines in conservation science where problematic biases and variability exist in study designs, contexts and metrics used. While we must be cautious to avoid misinforming decision-makers, this should not stop us investigating alternative weighting methods that integrate study quality data directly into meta-analyses. To reliably and pragmatically inform decision-makers with science, we need efficient, scalable, readily automated and feasible methods to appraise and weight studies to produce large-scale living syntheses of the future. © 2022 British Ecological Society.
dc.description.sponsorship: Author funding sources: T.A. was supported by the Grantham Foundation for the Protection of the Environment, the Kenneth Miller Trust and an Australian Research Council Future Fellowship (FT180100354); W.J.S., P.A.M. and G.E.S. were supported by Arcadia and The David and Claudia Harding Foundation; B.I.S. and A.P.C. were supported by the Natural Environment Research Council via the Cambridge Earth System Science NERC DTP (NE/L002507/1, NE/S001395/1); and B.I.S. was supported by a Royal Commission for the Exhibition of 1851 Research Fellowship.
dc.language.iso: eng
dc.publisher: Journal of Applied Ecology
dc.rights: info:eu-repo/semantics/embargoedAccess
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/3.0/es/
dc.subject: automation
dc.subject: bias adjustment
dc.subject: critical appraisal
dc.subject: dynamic meta-analyses
dc.subject: evidence synthesis
dc.subject: living reviews
dc.subject: quality effects modelling
dc.subject: risk of bias
dc.title: Innovation and forward-thinking are needed to improve traditional synthesis methods: A response to Pescott and Stewart
dc.type: info:eu-repo/semantics/article
dc.rights.holder: © 2022 British Ecological Society.
dc.rights.holder: Attribution-NonCommercial-ShareAlike 3.0 Spain
dc.relation.publisherversion: https://dx.doi.org/10.1111/1365-2664.14154
dc.identifier.doi: 10.1111/1365-2664.14154
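The abstract above contrasts conventional inverse-variance weighting with weighting schemes that also account for study-design quality. As a minimal illustrative sketch (the authors' published accuracy-weighting formula is not reproduced here; the quality score $a_i$ below is a hypothetical placeholder), the standard fixed-effect inverse-variance pooled estimate is

$$
\hat{\theta}_{\mathrm{IV}} = \frac{\sum_i w_i\, \hat{\theta}_i}{\sum_i w_i}, \qquad w_i = \frac{1}{v_i},
$$

where $\hat{\theta}_i$ and $v_i$ are the effect estimate and variance of study $i$. A quality-adjusted variant, in the spirit of accuracy weights or quality effects modelling, down-weights lower-validity designs by replacing the weights with, for example,

$$
w_i^{*} = \frac{a_i}{v_i}, \qquad a_i \in (0, 1],
$$

where $a_i$ would be larger for less biased designs (e.g. RCT, BACI) than for BA, CI or After designs, so that more bias-prone studies contribute less to the pooled estimate.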

