Simulation-based study design accuracy weights are not generalisable and can still lead to biased meta-analytic inference: Comments on Christie et al. 2019

This is a Preprint and has not been peer reviewed. The published version of this Preprint is available at https://doi.org/10.1111/1365-2664.14153. This is version 4 of this Preprint.


Authors

Oliver Pescott, Gavin Stewart

Abstract

Variable study quality is a challenge for all the empirical sciences, but perhaps particularly
for disciplines such as ecology where experimentation is frequently hampered by system
complexity, scale, and resourcing. The resulting heterogeneity, together with the necessity of combining the results of different study designs, is a fundamental issue for evidence synthesis. We welcome the recognition of this issue by Christie et al. (2019), and
their attempt to provide a generic approach to study quality assessment and meta-analytic
weighting through an extensive simulation study. However, we have reservations about the
true generality and usefulness of their derived study “accuracy weights”. First, the Christie et al. simulations rely on a single approach to effect size calculation, resulting in the odd conclusion that before-after-control-impact (BACI) designs are superior to randomised controlled trials (RCTs), which are normally considered the gold
standard for causal inference. Second, so-called “study quality” scores have long been
criticised in the epidemiological literature for failing to accurately summarise individual,
study-specific drivers of bias, and have been shown to be likely to retain bias and increase
variance relative to meta-regression approaches that explicitly model such drivers. We
suggest that ecological meta-analysts spend more time critically, and transparently,
appraising actual studies before synthesis, rather than relying on generic weights or weighting
formulas to solve assumed issues; sensitivity analyses and hierarchical meta-regression are
likely to be key tools in this work.
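
To make the contrast concrete, the sketch below (our illustration, not an analysis from either paper) simulates a hypothetical set of study-level effect sizes in Python and compares (a) pooling with a generic, design-based “accuracy weight” against (b) a simple fixed-effect meta-regression with study design as an explicit moderator. All numbers (the true effect, the design-related bias, the 0.5 down-weighting factor) are assumptions chosen for illustration; in practice a hierarchical (multi-level) meta-regression, for example via metafor's rma.mv in R, would be preferred to this stripped-down version.

```python
# Illustrative sketch only: generic design-based weighting vs. explicit meta-regression.
# All parameter values below are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

true_effect = 0.5           # hypothetical true effect
design_bias = 0.3           # assumed systematic bias in non-randomised studies
n_rct, n_obs = 10, 30       # hypothetical numbers of studies per design

# Study-level sampling variances, design indicators, and observed effect sizes
v = rng.uniform(0.02, 0.10, n_rct + n_obs)
design = np.array([1] * n_rct + [0] * n_obs)          # 1 = randomised, 0 = not
y = true_effect + design_bias * (1 - design) + rng.normal(0.0, np.sqrt(v))

# (a) Generic "accuracy" weighting: non-randomised studies get a fixed down-weight
accuracy = np.where(design == 1, 1.0, 0.5)            # hypothetical generic weights
w = accuracy / v                                      # combined with inverse-variance weights
pooled = np.sum(w * y) / np.sum(w)
print(f"Generic-weight pooled estimate: {pooled:.2f} (true effect = {true_effect})")

# (b) Fixed-effect meta-regression with design modelled explicitly
X = np.column_stack([np.ones_like(y), 1 - design])    # intercept + non-randomised indicator
W = np.diag(1.0 / v)                                  # inverse-variance weight matrix
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(f"Meta-regression: effect in randomised studies = {beta[0]:.2f}, "
      f"estimated design-related shift = {beta[1]:.2f}")
```

Because the generic weight only down-weights, rather than models, the design-related bias, the pooled estimate remains shifted away from the true effect, whereas the meta-regression separates the effect in randomised studies from the estimated design-related shift.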

DOI

https://doi.org/10.32942/osf.io/ytq7k

Subjects

Ecology and Evolutionary Biology, Life Sciences, Other Ecology and Evolutionary Biology

Keywords

causal inference, epidemiology, evidence synthesis, meta-analysis, meta-regression, multi-level modelling, quality scoring, study design

Dates

Published: 2020-07-08 04:51

Last Updated: 2020-10-07 04:46

License

CC-By Attribution-ShareAlike 4.0 International