Quality of conduct and reporting in rapid reviews: an exploration of compliance with PRISMA and AMSTAR guidelines. Kelly SE et al. Systematic Reviews 2016 5:79
This paper tackles an important issue for rapid reviews, one that is consistently raised about them: reproducibility and transparency. The objective of the study was to explore compliance with conduct and reporting guidelines in rapid reviews published or posted online during 2013 and 2014.
AMSTAR = A Measurement Tool to Assess Systematic Reviews
PRISMA = Preferred Reporting Items for Systematic Reviews and Meta-Analyses
The authors report, in relation to the checklists that they “… are reliable and practical instruments designed to help end-users discriminate between systematic reviews with a focus on quality of reporting and conduct [9, 10]. Both have become widely accepted by publishing agencies and evidence producers since 2011. Given the aim of rapid reviews to optimize to the extent possible a systematic process while synthesizing evidence and balancing the timely requirements of healthcare decision-making, it is feasible that these tools could also be applied to rapid reviews.”
They later report that compliance was poor, concluding:
“Transparency and inadequate reporting are significant limitations of rapid reviews. Scientific editors, authors and producing agencies should ensure that the reporting of conduct and findings is accurate and complete. Further research may be warranted to explore reporting and conduct guidelines specific to rapid reviews and how these guidelines may be applied across the spectrum of rapid review approaches.”
I was taken with this diagram, which highlights, to me, a distinct pattern of reporting quality across the AMSTAR checklist. These are surely areas for rapid review producers to improve on. Below is a similar one for PRISMA.
The authors express concern about the situation, and I have some sympathy with this, but I also want to gently challenge it on two fronts:
- Rapid reviews are evolving, much as systematic reviews were when they first reached critical mass (say, 20 years ago), so it seems a tad unfair to expect rapid reviews to be up to speed with the various checklists.
- The whole point of a rapid review is that it’s rapid. The more bureaucracy, the slower it becomes.
This paper matters because it hammers home the message about transparency, an issue that rapid review methods need to be acutely aware of as they develop.