Unpublished studies in stem cells

I spotted an interesting tweet earlier and replied; the exchange is below:

The paper in question is: Responsible Translation of Stem Cell Research: An Assessment of Clinical Trial Registration and Publications.

At the risk of being repetitive: reporting bias is hugely problematic. Omitting unpublished trials can massively affect the results of a systematic review [1, 2]. Yet Cochrane, arguably the ‘gold standard’ for systematic review production, has an unsystematic approach to unpublished studies [3]. Other groups are likely to be no better and probably worse. I don’t think it’s particularly controversial to suggest that most systematic reviews are really poor at using unpublished studies. This recent study [4] adds further weight to the argument.

I have said time and again that systematic reviews based on published journal articles can only be relied on to give a ballpark figure. If you’re happy with a ballpark figure, do it quickly. It starts to become immoral when you spend so long doing a systematic review (with all the cost, harm etc. implications) and then dress it up as something it’s not, i.e. a reliable estimate of the effect of an intervention.

A plea to systematic review producers: be more transparent. If you don’t believe you have included all unpublished studies, be honest and clear, and alert the consumer of the information that the results can only be counted on to be ballpark.

NOTE/UPDATE: To be clear, my motivation here is to improve the system of evidence synthesis. If we assume SRs are ‘fine, but take a bit too much time’ we’re not focusing on the key issue. The key issue is how we can best arrive at reliable answers. The notion of reliability is relative, ranging from low reliability (using a poor method), through moderate reliability (where I would class most SRs and probably rapid reviews), to high reliability (using unpublished studies and evidence from more than just journal articles, e.g. Clinical Study Reports).

From a moral perspective of maximising gain and reducing waste, I have yet to see any evidence that SRs based on journal articles are cost-effective. This is controversial from the perspective of preserving the status quo. However, if one steps away, it’s an entirely legitimate question to ask. As scientists, we should hold no belief or paradigm above scepticism.

References

  1. Turner EH, et al. Selective publication of antidepressant trials and its influence on apparent efficacy. N Engl J Med. 2008;358(3):252-60.
  2. Hart B, et al. Effect of reporting bias on meta-analyses of drug trials: reanalysis of meta-analyses. BMJ. 2012;344:d7202.
  3. Schroll JB, et al. Searching for unpublished data for Cochrane reviews: cross sectional study. BMJ. 2013;346:f2231.
  4. Hartling L, et al. Grey literature in systematic reviews: a cross-sectional study of the contribution of non-English reports, unpublished studies and dissertations to the results of meta-analyses in child-relevant reviews. BMC Med Res Methodol. 2017;17:64.

 
