Evidence synthesis is a key element of evidence-based medicine. However, it is labour intensive, meaning that many trials are never incorporated into robust evidence syntheses and that many existing syntheses are out of date. To overcome this, a variety of techniques are being explored, including automation technology. Here, we describe a fully automated evidence synthesis system for intervention studies: one that identifies all the relevant evidence, assesses it for reliability, and collates it to estimate the relative effectiveness of an intervention. Techniques used include machine learning, natural language processing and rule-based systems, and results are presented using modern visualisation techniques. We believe this to be the first publicly available automated evidence synthesis system: an evidence mapping tool that synthesises evidence on the fly.
Available as free full-text here.
I’m delighted to get this published. I’m not driven by the ‘publish or perish’ agenda, so I tend not to write many articles. But, given the innovative nature of our work, I thought it was important to get it published in a peer-reviewed journal.
I’m also keen to put it out there as a counter-balance to the narrative that reduces ‘rapid reviews’ to ‘expedited systematic reviews’ – in other words, the standard systematic review approach, just with shortcuts. This process-focused approach has its place but can also be problematic.
My approach is – for a given evidence synthesis – what are we trying to achieve? In other words, what outcome are we aiming for? For instance, is the review being carried out to:
- Provide an accurate assessment of the effectiveness of an intervention? An important consideration here is how accurate it needs to be!
- Provide a comprehensive review of adverse events?
- Provide a list of previous studies to confirm that new research is needed?
Once you have clarity of outcome, the question – in my eyes – is: what is the quickest, cheapest and most efficient way of answering the topic question to the satisfaction of the consumer of the review? That may be an expedited review, but it may equally be met by novel techniques, such as those outlined in the paper. If the only ‘tool in the box’ is an expedited systematic review, we’re only partially solving the problem. It’s also amazingly boring!
Our work is ground-breaking but it’s still a small step. We’ve been using the system and gaining feedback on it for months now and we hope to build on that before the end of the year. Onwards and upwards!