The CHDI Foundation, a charitable organization that funds a great deal of research into Huntington’s disease, is interested in reforming the scientific process.
The story comes from a paper by British neuroscientist Marcus Munafo and colleagues (the authors include CHDI staff), published in Nature Biotechnology a couple of months ago: Scientific rigor and the art of motorcycle maintenance.
Munafo et al. begin by pointing to the history of car manufacturing as an analogy for the scientific process. They set the scene:
In the 1970s the US automobile industry used production methods that relied upon cars being entirely assembled before checking for obvious defects, which resulted in many faulty cars – ‘lemons’ – rolling off the production line and into showrooms. Cars were built to be repaired later rather than to be reliable from sale.
This was how things had always been done. But then Japanese car manufacturers introduced quality-control procedures throughout the manufacturing process, rather than waiting until the cars were finished. This made their auto industry much more efficient, and it allowed Japan to lead the world automobile market for years. Munafo et al. say that science needs to be more Japanese because:
This is a fitting analogy for the current state of biomedical research, where the low reproducibility of key findings is now being widely discussed. Problems such as publication bias, low statistical power, data fabrication and questionable research practices are not new, but there is increasing concern that their scale has grown as competition for resources has intensified and, consequently, incentive structures have become distorted. Researchers are susceptible to systemic influences, such as the ‘publish or perish’ culture and the propensity for journals to prioritize ‘significant’ novel results… the current scientific career structure works against good scientific practice.
CHDI, mindful of these concerns, organized a meeting in London in September 2013 to explore these issues. Munafo et al.’s paper is a product of the discussions at that meeting, and CHDI is now exploring turning talk into action by changing its funding policies. For instance, on the issue of replication, Munafo et al. say that
CHDI is now considering introducing an option into its research funding agreements to pause publication of selected studies, solicit (and fund) replication through a mutually agreed upon independent laboratory, and then have the original and replication researchers publish jointly with appropriate recognition.
They note that ‘This approach clearly entails substantial cultural change’ but ‘ultimately one hopes this would be seen as beneficial to all interested parties’.
But replication alone is not enough (as I’ve argued myself): without pushing the analogy too far, this is essentially the 1970s US car manufacturer approach, which ‘will always be an inefficient, retrospective fix; unless we strive to ensure quality throughout the research process, too many lemons will still be produced.’ CHDI is reportedly ‘looking into ways to provide statistical and methodological training such as developing online Coursera courses that postdoctoral researchers in funded laboratories will have to complete – to develop expertise in future research leaders’.
CHDI may also create a review committee to advise on the scientific and statistical methodology of any newly funded project. Most interestingly, this will be used as a form of preregistration:
CHDI will also create a repository for protocols reviewed by the independent standing committee: upon study completion these will be made publicly available so that research findings can be judged against a priori hypotheses and planned statistical analyses
I think that CHDI’s interest in facilitating rigorous science is fantastic. As I’ve said previously, scientific funding bodies could play a huge role in reforming the scientific process. CHDI’s idea of publishing the protocols of each study to implement preregistration is especially close to my heart.
Perhaps they could go even further. CHDI could require that, upon publication of any funded research, a reference to the preregistered protocol is prominently displayed in the Methods section of all relevant papers. This is standard practice in the world of clinical trials to ensure that protocols don’t just go unread. CHDI could also encourage researchers to submit their work as Registered Reports.
On the issue of raw data sharing, CHDI has already set up a repository to encourage sharing of the (anonymized) results of CHDI-funded research, which is great. But perhaps it could require future grant holders to post their data, by making full payment of the grant conditional on data sharing.
For example, CHDI could withhold the final 5 or 10% of any grant money until the data funded by that grant have been posted. This model has been used by the NIHR HTA panel, which makes payment conditional on publication of the results (a measure designed to prevent publication bias). CHDI could adapt this to require not just publication but also data sharing. Its line could be: “In paying for this research, what we are paying for is data, and we want it to be open.”
Munafo M, Noble S, Browne WJ, Brunner D, Button K, Ferreira J, Holmans P, Langbehn D, Lewis G, Lindquist M, Tilling K, Wagenmakers EJ, & Blumenstein R (2014). Scientific rigor and the art of motorcycle maintenance. Nature Biotechnology, 32(9), 871–873. PMID: 25203032