Increasing Rigor in Huntington’s Disease Research

By Neuroskeptic | December 14, 2014 6:57 am

The CHDI Foundation, a charitable organization that funds a lot of research into Huntington’s disease, is interested in reforming the scientific process.

The story comes from a paper by British neuroscientist Marcus Munafo and colleagues (the authors include CHDI staff), published in Nature Biotechnology a couple of months ago: Scientific rigor and the art of motorcycle maintenance.


Munafo et al. begin by pointing to the history of car manufacturing as an analogy for the scientific process. They set the scene:

In the 1970s the US automobile industry used production methods that relied upon cars being entirely assembled before checking for obvious defects, which resulted in many faulty cars – ‘lemons’ – rolling off the production line and into showrooms. Cars were built to be repaired later rather than to be reliable from sale.

This was how things had always been done. But then Japanese car manufacturers introduced quality-control procedures throughout the manufacturing process, rather than waiting until the cars were finished. This made their auto industry much more efficient, and it allowed Japan to lead the world automobile market for years. Munafo et al. say that science needs to be more Japanese because:

This is a fitting analogy for the current state of biomedical research, where the low reproducibility of key findings is now being widely discussed. Problems such as publication bias, low statistical power, data fabrication and questionable research practices are not new, but there is increasing concern that their scale has grown as competition for resources has intensified and, consequently, incentive structures have become distorted. Researchers are susceptible to systemic influences, such as the ‘publish or perish’ culture and the propensity for journals to prioritize ‘significant’ novel results… the current scientific career structure works against good scientific practice.

Mindful of these concerns, CHDI organized a meeting in London in September 2013 to explore them. Munafo et al.’s paper is a product of the discussions at that meeting, and CHDI is now exploring turning the talk into action by changing its funding policies. For instance, regarding the issue of replication, Munafo et al. say that

CHDI is now considering introducing an option into its research funding agreements to pause publication of selected studies, solicit (and fund) replication through a mutually agreed upon independent laboratory, and then have the original and replication researchers publish jointly with appropriate recognition.

They note that ‘This approach clearly entails substantial cultural change’ but ‘ultimately one hopes this would be seen as beneficial to all interested parties’.

But replication alone is not enough (as I’ve argued myself): without pushing the analogy too far, it is essentially the 1970s US car manufacturer approach, one that ‘will always be an inefficient, retrospective fix; unless we strive to ensure quality throughout the research process, too many lemons will still be produced.’ CHDI is reportedly ‘looking into ways to provide statistical and methodological training such as developing online Coursera courses that postdoctoral researchers in funded laboratories will have to complete – to develop expertise in future research leaders’.

CHDI may also create a review committee to advise on the scientific and statistical methodology of any newly funded project. Most interestingly, this will be used as a form of preregistration:

CHDI will also create a repository for protocols reviewed by the independent standing committee: upon study completion these will be made publicly available so that research findings can be judged against a priori hypotheses and planned statistical analyses.

I think that CHDI’s interest in facilitating rigorous science is fantastic. As I’ve said previously, scientific funding bodies could play a huge role in reforming the scientific process. CHDI’s idea of publishing the protocols of each study to implement preregistration is especially close to my heart.

Perhaps they could go even further. CHDI could require that, upon publication of any funded research, a reference to the preregistered protocol is prominently displayed in the Methods section of all relevant papers. This is standard practice in the world of clinical trials to ensure that protocols don’t just go unread. CHDI could also encourage researchers to submit their work as Registered Reports.

On the issue of raw data sharing, CHDI has already set up a repository to encourage sharing of the (anonymized) results of CHDI-funded research, which is great. But perhaps they could require future grant holders to post their data, by making full payment of the grant conditional on data sharing.

For example, CHDI could withhold the final 5 or 10% of any grant money until the data funded by that grant has been posted. This model has been used by the NIHR HTA panel, which makes payment conditional upon publication of the results (a measure designed to prevent publication bias). CHDI could adapt this to require not just publication, but also data sharing. Their line could be: “In paying for this research, what we are paying for is data, and we want it to be open.”

Munafo M, Noble S, Browne WJ, Brunner D, Button K, Ferreira J, Holmans P, Langbehn D, Lewis G, Lindquist M, Tilling K, Wagenmakers EJ, & Blumenstein R (2014). Scientific rigor and the art of motorcycle maintenance. Nature Biotechnology, 32 (9), 871-3. PMID: 25203032

  • andrewkewley

    The case of preregistered protocols is interesting, given that in many biomedical and biochemistry studies deviation from the protocol tends to be the norm (due to unforeseen needs) rather than the exception.

    It can sometimes be difficult to determine whether these changes were a big deal or not, until replication is attempted.

  • feloniousgrammar

    Sounds like a plan. The Publish or Perish policy is, in my opinion, a creator of a lot of unnecessary noise and is a poor measure for academic value. I read recently about a professor who was not given tenure for this reason, though he had translated Dante’s works into Hungarian (if I remember correctly). The policy seems star-struck and stupid to me, especially since a lot of the best teachers are more concerned with teaching than publishing.

    Preregistration is a measure to keep ’em honest, and independent research is a must. It’s sad that, for many scientists, it’s a mistake to trust most studies, and the additional skewing provided by journalists is a sad fact.

    As a lay person, I have to rely on people like you, Neuroskeptic; and I’m glad you’re here to help us understand the current pitfalls of scientific research. Skepticism is a healthy and necessary ingredient in most fields; in industry-funded research it is critical.

    I do wish our nation would pay for research like it did in the sixties, and that more researchers could spend more time researching and exploring openly, instead of relying so much on blinded studies.

  • http://hormeticminds.blogspot.com/ Chaorder Gradient

    I could see a problem, though: unlike research findings, interchangeable car parts won’t argue back during quality-control attempts. No one was staking their career solely on a faulty axle.

