In an earlier post I highlighted the editorial ‘All in the Family: systematic reviews, rapid reviews, scoping reviews, realist reviews, and more’. In the editorial the authors made the case that rapid, scoping and systematic reviews are all members of the ‘evidence synthesis’ family.
I contacted the authors with a couple of questions/observations:
- They highlighted that systematic reviews use systematic methods that can be tested scientifically. I pointed out that rapid reviews can also follow methods that can be tested in the same way.
- I then highlighted the problems with terms such as ‘systematic review’ and ‘rapid review’ – they are not helpful for consumers of the reviews – and wondered if there might be better ways to describe the different review methods.
I have received a response, and with permission from the author of the response (Paul Shekelle), I am posting it here:
“My own view is that a rapid review is a species of the genus systematic review.
I don’t view Cochrane as a “gold standard” for a systematic review. I do view the work done back in the 1990s by people, many of them affiliated with Cochrane, to be the “scientific method” for establishing the benefits of different types of methods for searching, evaluating, and synthesizing evidence – Kay Dickersin, David, Alejandro Jadad, Alessandro Liberati, Doug Altman, etc. I do view the widespread adoption of certain methods as “must-do” elements of the “gold standard” systematic review as misguided – things became dogma before their benefit-to-resource value was well established, e.g. routine “must-do” searching of non-English language literature, of EMBASE, etc. etc., and then the standardization of tools for RoB before establishing whether the questions being asked were psychometrically sound (e.g., reproducible), let alone associated with any evidence of bias in the results. What became the “must-do” things for a systematic review ended up making the systematic review take a long time and cost a lot of money. Ergo, this created an environmental niche that was unoccupied, and the rapid review evolved from the systematic review to fill the niche (an example of sympatric speciation). But a rapid review still needs to be systematic, in that it has a method of some known property in terms of the tradeoff between rapidity and validity.
So – perhaps this “thing” should have been called a ‘scientific review’ from the beginning (back in 1990), but I am sure that would have been rejected because the authors of narrative reviews consider themselves scientific. So “systematic” it is. But I don’t view “systematic” as meaning that a protocol must be filed, MEDLINE and EMBASE must be searched back to the beginning of time, gray literature must be searched, all non-English language articles must be included, all prior reviews must be reference mined, original authors must be contacted to gather additional information on included studies, all studies must be assessed using the Cochrane RoB tool, that the Hartung-Knapp estimator must be used instead of the DerSimonian and Laird estimator or instead of a fixed effects model with no estimator, etc. etc. etc. I view “systematic” as meaning that there is a system – a Methods section – that describes what was done and what was not done, and readers can then determine for themselves whether it is justified to leave the gray literature out, or to have used a fixed effects model, etc. etc., and whether the lack of a protocol increases the concern about cherry picking of outcomes, etc. And in that worldview, a rapid review is, or should be, a systematic review.”
I loved the response and I’m grateful for the time spent crafting it. A few observations spring from it, namely:
- The use of systematic and reproducible methods is vitally important – for whatever form of evidence synthesis you do.
- If you see systematic reviews as a family, then a better term for a rapid review – assuming it uses a systematic method – is a ‘rapid systematic review’.
- I loved the notion of a niche. As ‘mainstream’ systematic reviews become ever more complex, costly and – frankly – not fit for purpose, they have left a gap for a form of evidence synthesis more aligned to everyday needs – hence the increasing interest in rapid methods. I’ve mentioned previously that recent developments in the mainstream systematic review world risk the longevity of organisations such as Cochrane. The term ‘extinction’ fits well here!