Reanalyzing randomized trials by different methods, including one about scleroderma and another involving fibrositis, has changed their conclusions. The authors of this analysis argue for more openness when reporting trial data.
Ebrahim S, Sohani ZN, Montoya L, et al. Reanalyses of Randomized Clinical Trial Data. JAMA (2014) 312:1024-1032. doi:10.1001/jama.2014.9646.
Krumholz HM, Peterson ED. Editorial: Open Access to Clinical Trials Data. JAMA (2014) 312:1002-1003. doi:10.1001/jama.2014.9647.
Of the 37 published reports of randomized, controlled trials (RCTs) reanalyzed to verify their conclusions, 13 reached a different conclusion the second time around, this review points out. Two of these trials involved rheumatic conditions.
A 2001 study found methotrexate (MTX) ineffective for scleroderma. The investigators reanalyzed their data in 2009 and found that it was effective after all. (A randomized, controlled trial of methotrexate versus placebo in early diffuse scleroderma. Arthritis Rheum. 44:1351-1358, and Shifting our thinking about uncommon disease trials: the case of methotrexate in scleroderma. J Rheumatol. 36:323-329.) In the second study, the researchers used Bayesian analysis to overcome the shortcomings of the small data sets that plague clinical trials of uncommon diseases.
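To illustrate the general idea behind such a reanalysis (this is a minimal sketch, not the authors' actual model, and the trial counts below are invented for illustration): with a conjugate Beta prior on each arm's response rate, a Bayesian analysis can report the probability that treatment beats placebo, a summary that remains informative with tiny samples, rather than a dichotomous p < 0.05 verdict.

```python
import random

random.seed(0)

# Hypothetical small-trial counts (illustrative only, NOT data from the
# scleroderma trial): responders / total patients in each arm.
resp_tx, n_tx = 9, 15      # treatment arm
resp_pl, n_pl = 5, 14      # placebo arm

def posterior_draw(responders, n):
    """Draw one sample from the posterior response rate.

    With a uniform Beta(1, 1) prior, the conjugate update gives a
    Beta(1 + responders, 1 + non-responders) posterior.
    """
    return random.betavariate(1 + responders, 1 + (n - responders))

# Monte Carlo estimate of P(treatment response rate > placebo response rate).
draws = 100_000
wins = sum(posterior_draw(resp_tx, n_tx) > posterior_draw(resp_pl, n_pl)
           for _ in range(draws))
print(f"P(treatment better) ~ {wins / draws:.2f}")
```

Even when a frequentist test on such small numbers fails to reach significance, a posterior probability of this kind can still quantify how much the data favor the treatment, which is the spirit of the 2009 reanalysis.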
A 1989 study found that homeopathy was effective for treating fibrositis. The investigators gave their data to a different group, who found that it wasn’t effective. (Effect of homoeopathic treatment on fibrositis. Br Med J. 299:365-366, and Re-analysis of clinical trial of homoeopathic treatment in fibrositis. Lancet. 336:441-442.) The second analysis separated the composite pain and sleep end points.
By using different statistical or analytical approaches, and different definitions or measurements of outcomes, the reanalyses in about a third of these 37 reports reached interpretations and conclusions different from those of the original articles.
Because of the expense and effort, RCTs are seldom replicated. But increasingly, they are being reanalyzed. This is becoming easier as doctors insist that clinical trial data be placed in the public record.
Every study requires discretionary decisions, the editorial notes, and even highly qualified investigators will make reasonable but different decisions, which may result in significantly different conclusions. This can easily be seen in meta-analyses.
Patient-level data on Medtronic’s bone morphogenetic protein 2 trials were reevaluated in parallel by two expert organizations, but their conclusions differed in important ways.
US Food and Drug Administration regulatory filings showed a 9% discordance with the corresponding meeting presentations and journal articles. “Not unexpectedly, all were in the direction favoring the drug,” the editorial observes.
When ClinicalTrials.gov entries were compared with journal publications, the primary endpoint had changed in 15% of trials, and the primary outcome value (in some cases even the number of deaths) differed in 22%.
The authors are long-time advocates of reporting RCTs, including the raw data, in enough detail to replicate the analysis, and they feel they are winning the medical community over. In their article they respond to objections about patient confidentiality and about the use of data by others for commercial gain or scientific priority.