Fewer, Smarter Trials
April 10, 2008
It is not hard to find technologies that the clinical trial community should, by all rights, use. But doesn’t, for whatever cultural or psychological reason.
Much like a batty collector of small porcelain figurines, ClinPage counts itself oddly pleased to have found yet another type of software that isn’t as widely used as it should be in the pharmaceutical industry. The tool in question facilitates a systematic review.
The systematic review is better known in academia. It’s a rigorous attempt to summarize and aggregate all of the published literature on a given scientific question. The U.S. Agency for Healthcare Research and Quality (AHRQ) is heavily into systematic reviews.
The best-known proponent is the Cochrane Collaboration, which publishes a handbook on how to conduct systematic reviews according to its guidelines.
TrialStat’s primary business is electronic data capture, but the company also has a history of familiarity with academia. In its spare time, the firm has worked with systematic review users who have helped it amass a 17 million-entry web database of curated citations for systematic reviews. The company’s web portal for these citations, called ESRNexus, has a cleaner, better user interface than PubMed, in our view. It allows specially formatted citations in the scientific literature to be reused by others.
An intriguing tangent connects electronic data capture (EDC) and software for systematic reviews: both have the potential to reduce transcription errors. Both systems can offer simple validation of data that is an improvement over handwritten or even Microsoft Office-based transfers from one source to another. The result is more accurate and actionable data.
The structured review citations have coding, carefully added by experts, that augments a plain-vanilla literature citation. How? The citation has additional nuggets of information. To see an example, search for “cabri” in publications on ESRNexus, sort the results by relevance, and click on the “View Coded Data” button. Academic researchers, needless to say, are more inclined than industry scientists to “tag” or “comment” on scientific articles in this rigorous, structured manner. But many non-profit users of the SRS software are uploading such citations to the TrialStat system for the benefit of the medical community at large.
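To make the idea of a coded citation concrete, here is a minimal sketch of what such a record might look like in code. This is our illustration only: the field names (`study_design`, `population`, `outcomes`) and the sample values are hypothetical, not ESRNexus’s actual schema.

```python
# Hypothetical sketch of a "coded" citation: a plain literature citation
# augmented with expert-added structured fields. Field names and values
# are illustrative, not the actual ESRNexus schema.
from dataclasses import dataclass, field

@dataclass
class CodedCitation:
    # The plain-vanilla bibliographic part:
    title: str
    journal: str
    year: int
    # Expert-added coding that a bare citation lacks:
    study_design: str = ""                        # e.g. "randomized controlled trial"
    population: str = ""                          # who was studied
    outcomes: list = field(default_factory=list)  # coded outcome measures

citation = CodedCitation(
    title="Example trial report",
    journal="Example Journal",
    year=1995,
    study_design="randomized controlled trial",
    population="adults with the condition of interest",
    outcomes=["mortality", "repeat intervention"],
)

# A reviewer can now filter or screen on the structured fields directly,
# instead of re-reading the paper to recover this information.
print(citation.study_design)  # prints "randomized controlled trial"
```

The point of the structure is reuse: once an expert has tagged a citation this way, every later review team can query the coded fields rather than redoing the extraction.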
TrialStat, meanwhile, recently inked a deal with researchers in Malaysia. Its government, in collaboration with the World Health Organization and researchers at McMaster University, will use SRS for systematic reviews in evidence-based medicine. “The group at McMaster is able to collaborate with the group in Malaysia and ensure they are delivering quality results. That would not have been feasible with any paper-based approach,” says Peter O’Blenis, TrialStat’s VP of product development.
TrialStat also inked a collaborative relationship with the International Society for Pharmacoeconomics and Outcomes Research. As part of the initiative, ISPOR special interest groups and task forces will get free access to SRS for use in nonprofit research.
What’s the challenge in a systematic review? Collaboration across geographical boundaries and areas of domain expertise. Even simply collecting the references in the literature can be cumbersome, says O’Blenis. Then there’s the matter of adjudicating them and exporting them into a publishable or editable form. Much of the industry is still doing this the way term papers were written in the 1950s. If they’re doing systematic reviews at all.
In a typical systematic review, O’Blenis estimates, there may be 500,000 pieces of paper. That is no doubt music to the ears of some paperoholic readers. O’Blenis describes the challenge gently: “Because of the volumes of information you’re dealing with, systematic reviews are very challenging. There are lots of systematic reviews that never get finished or drag on for years.”
One of the basic things the SRS software can do, he says, is facilitate two people looking at the same reference in the scientific literature and a) assessing it, and b) crossing it off their team’s list so that no one else on the project tackles it. “It allows you to reduce the time for data screening and extraction by about 60 percent. It’s an accelerator. All of the data is stored online,” says O’Blenis. “It allows users to eliminate 98 percent of the paper from the process.”
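The dual-review bookkeeping O’Blenis describes can be sketched in a few lines of code. This is a minimal illustration under our own assumptions, not TrialStat’s implementation: two reviewers independently screen each reference, agreements are crossed off the team’s list, and disagreements are queued for adjudication.

```python
# Minimal sketch (our illustration, not SRS code) of dual-reviewer screening:
# each reference is assessed independently by two reviewers; matching
# decisions are finalized so no one else repeats the work, while
# disagreements go to an adjudication queue.

def screen(references, decisions_a, decisions_b):
    """Merge two reviewers' include/exclude decisions, keyed by reference ID."""
    finished, conflicts = {}, []
    for ref in references:
        a, b = decisions_a.get(ref), decisions_b.get(ref)
        if a is None or b is None:
            continue              # still awaiting one reviewer's assessment
        if a == b:
            finished[ref] = a     # agreed: crossed off the team's list
        else:
            conflicts.append(ref) # disagreed: needs adjudication
    return finished, conflicts

refs = ["PMID:101", "PMID:102", "PMID:103"]
reviewer_a = {"PMID:101": "include", "PMID:102": "exclude", "PMID:103": "include"}
reviewer_b = {"PMID:101": "include", "PMID:102": "include", "PMID:103": "include"}

done, disputed = screen(refs, reviewer_a, reviewer_b)
print(done)      # {'PMID:101': 'include', 'PMID:103': 'include'}
print(disputed)  # ['PMID:102']
```

With every decision stored centrally, the team always knows which references remain, which is exactly the duplication-of-effort problem a paper-based process cannot solve.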
A thorough systematic review, he says, might cost $100,000 in staff time. It could also eliminate the need for some trials. “There have been cases in the systematic review literature where if you look at the literature, that trial was completely unnecessary,” he says. Yes, there are ethical issues around not doing systematic reviews. “You don’t want to subject more patients to a particular experiment than you need to,” he says.
O’Blenis believes the consensus around systematic reviews is clear. “The idea of using it to tune your trial makes a lot of sense,” he says. “The literature that is out there now is strongly advocating that every clinical trial should start with a systematic review. That will help guide the design of your trial. You won’t repeat the mistakes of other trials. You may have a smaller trial because you don’t need as much data to get a full data set.”
We did some clicking around, and learned that if physicians had bothered to do a systematic review of sudden infant death syndrome, they might have nipped it in the bud in the 1970s, not the 1990s. How many deaths could theoretically have been averted? 60,000.
If that’s not enough to get your attention, we’ll note that one peer-reviewed journal, The Lancet, now requires a systematic review as a condition for submitted articles about clinical trials. That parallels the way The Lancet and other journals mandate registration of trials in clinicaltrials.gov. In another twenty years or so, should regulatory bodies get around to insisting on systematic reviews themselves, you’ll be prepared. Forward-looking CROs, we conjecture, could start offering web-enabled systematic reviews as a dirt-cheap and wise technique to optimize clinical trial design.