Malcolm MacIver is a bioengineer at Northwestern University who studies the neural and biomechanical basis of animal intelligence. He also consults for sci-fi films (e.g., Tron: Legacy) and was the science advisor for the TV show Caprica.
A few years ago, the world was aflame with fears about the virulent H5N1 avian flu, which infected several hundred people around the world and killed about 300 of them. The virus never acquired the ability to move between people, so it never became the pandemic we feared it might be. But recently virologists have discovered a way to mutate the bird flu virus that makes it more easily transmitted. The results were about to be published in Science and Nature when the U.S. government requested that the scientists and the journals withhold details of the method used to make the virus. The journals have agreed to this request. Because the information being withheld is useful to many other scientists, access to the redacted paragraphs will be provided to researchers who pass a vetting process currently being established.
As a scientist, I find the idea of having any scientific work withheld hard to accept. Then again, I work mostly on “basic science,” which is science-speak for “unlikely to matter to anyone in the foreseeable future.” In one area, though, my lab is developing new propulsion techniques for high-agility underwater robots, along with sensors that use weak electric fields to “see” in complete darkness or muddy water. This work, like a lot of engineering research, has the potential to be used in machines that harm people. I reassure myself of the morality of my efforts by the length of the chain of causation from my lab to such a device, which doesn’t seem much shorter than the chain for colleagues making better steels or more powerful engines. But having ruminated about my possible involvement with an Empire of Dark Knowledge, here’s my two cents on how to balance free speech and academic freedom against dangerous consequences.
Consider the following thought experiment: suppose there really is a Big Red Button to launch the nukes, one in the U.S., and one in Russia, each currently restricted to their respective heads of government. Launching the nukes will surely result in the devastation of humanity. I’m running for president, and as part of my techno-libertarian ideology, I believe that “technology wants to be free” and I decide to put my money where my slogan is by providing every household in the U.S. with their very own Big Red Button (any resemblance to a real presidential candidate is purely accidental).
If you think this is a good idea, the rest of this post is unlikely to be of interest. But, if you agree that this is an extraordinarily bad idea, then let’s continue.
Now, let’s not be so device-centric. Let’s imagine that instead of a Big Red Button, we have an idea whose implementation is equally fatal to the continuation of humanity. Once again, we should expend no less effort preventing this idea from spreading than we did for the household Big Red Buttons. Our efforts at containment might not work—there are lots of ways an idea can escape, from Wikileaks to disgruntled employees—but it would surely be immoral to intentionally publish this lethal idea so that anyone could exact destruction on a vast scale. All efforts to control it and prevent its release should be made. If it is a scientific idea, the science should not be published, and society should consider whether continued funding of such research is justifiable.
Although it goes against my instinct as a scientist to hide any scientific results, I think the preceding logic compels just that when scientists generate dangerous knowledge.
Critics of the move to censor the bird flu information say it has already been presented at conferences, and that censoring it will hold back progress on the very science we may need to prevent a future outbreak. I don’t find the “cat’s already out of the bag” argument convincing in this case, since presenting the result at a conference of specialists is far from putting it into a paper that can be downloaded anywhere in the world. Carl Zimmer presents a better, though still arguable, case that publishing the entire sequence wouldn’t pose an undue risk. But even if our containment of dangerous knowledge is really shoddy, stymying only the Homer Simpsons of the world, it still prevents a large number of Homer Simpsons from committing a “d’oh” heard around the world.
With respect to concerns about putting the brakes on the progress of science, our efforts to contain dangerous information should be proportional to how damaging its release could be. If the idea is literally one that would enable anyone to easily end humanity, then the controls should be very strict; one can imagine the horrifying possibility of having to quarantine the people who hold the dangerous information. Clearly, less is called for in the bird flu case, since it is harder to use this information to do harm. The vetting process currently being developed for the bird flu methods will surely not be perfect, but if it makes it harder for malevolent actors to get the information, then it is working to some extent.
So far, so obvious. What may be less obvious is how we should treat the censored scientists, for whom I have great empathy. Given that we may be holding back these scientists’ success for the benefit of society, serious efforts should be made to compensate them in proportion to the harm we are causing their careers. The withholding might have little effect: other bona fide virologists may easily pass the vetting process and be able to replicate and learn the methods in the new paper. On the other hand, the effects might be devastating. It usually takes many years to reach a result of the importance theirs appears to have, and many exciting avenues building on it may also have to go unpursued under threat of similar censorship. They may lose out on a great deal of prestige and funding that would otherwise be theirs.
One form of compensation would be funding to continue their current work, even though subsequent results may also be selectively withheld from publication. This would help because funding depends on the judgment of one’s peers, and the blackout may prevent those peers from seeing the crucial results. Another form of compensation would be to enable these scientists to change their research if they wish to. Yet, as any scientist can attest, it is very expensive to change research directions. When scientists in the life sciences start their careers, they are given what is called a “startup package” of between $500,000 and $2,000,000 to establish their research programs. Since we have decided to put fetters on the scientists involved, I would argue that a similar amount or more should be provided to those who change research direction to something less likely to be hidden behind a government cloak.
Image: Flu virion, courtesy of CDC