A smoldering debate about whether researchers should ever deliberately create superflu strains and other risky germs in the interest of science has flared once again.
Proponents of the work say that in order to protect the public from the next naturally occurring pandemic, they have to understand what risky infectious agents are capable of — and that means altering the microbes in experiments. Critics argue that the knowledge gained from making new strains of these germs isn't worth the risk, because a lab-made pathogen might escape the laboratory and start spreading among people.
Now, as scientists on both sides of the dispute have formed groups that have issued manifestos and amassed lists of supporters, it looks like the prestigious National Academy of Sciences will step in to weigh the risks and benefits.
A representative of the National Institutes of Health, which funds this research, says that NIH, too, is "giving deep consideration to the many views expressed by various highly respected parties" about the best way forward.
In a recent editorial in mBio, the journal's editor-in-chief, Dr. Arturo Casadevall, urged his colleagues to "lower the level of rhetoric and focus on the scientific questions at hand."
Scientists have passionate debates all the time, but it's usually about the meaning of some experimental result, says Casadevall, a microbiologist at the Albert Einstein College of Medicine in New York.
"What is different here is that we are facing a set of intangibles," he says. "And because they involve judgment calls at this point, people are often weighing the risks and the benefits very differently."
Dr. David Relman, a microbiologist at Stanford University, thinks the risks of making a new strain of flu virus that has the potential to cause a pandemic are very real.
"I don't think we have adequately involved the public," Relman says, "so that they understand the possible consequences of mistakes, or errors, or misadventures in performing this kind of science — the kinds of consequences that would result in many, many people becoming ill or dying."
Controversial work on lab-altered bird flu was halted for more than a year in a voluntary moratorium, after two labs generated new, more contagious forms of the bird flu virus H5N1. Eventually, after federal officials promised more oversight, the experiments started back up and the controversy quieted down. But key questions were never answered, Relman says.
"One of the big issues that has not been advanced over the last two years is a discussion about whether there are experiments that ought not to be undertaken and, if so, what they look like," he says, noting that scientists keep publishing more studies that involve genetically altered flu viruses. "You know, every time that one of these experiments comes up, it just ups the ante a bit. It creates additional levels of risk that force the question: Do we accept all of this?"
Last month, Relman met in Massachusetts with others who are worried. They formed the Cambridge Working Group and issued a statement saying that researchers should curtail any experiments that would lead to new pathogens with pandemic potential, until there's a better assessment of the dangers and benefits.
By coincidence, they released their official statement just as the public started hearing news reports of various laboratory errors, such as a forgotten vial of smallpox found in an old freezer, and mishaps involving anthrax and bird flu at the Centers for Disease Control and Prevention.
What's more, the unprecedented Ebola outbreak has reminded the public what it looks like when a deadly virus gets out of control.
All of this led a different band of scientists to also form a group — to publicly defend research on dangerous pathogens.
"There are multiple events that have come together in a rather unusual convergence," says Paul Duprex, a microbiologist at Boston University.
He sees the recent reports of lab mistakes as exceptions — they don't mean you should shut down basic science that's essential to protecting public health, he says.
"These viruses are out there. They cause disease; they have killed many, many people in the past," Duprex says. "We bring them to the laboratory to work with them."
Duprex helped form a group that calls itself Scientists for Science. The group's position statement emphasizes that studies on risky germs already are subject to extensive regulations. It says focusing on lab safety is the best defense — not limiting the types of experiments that can be done.
Whenever questions about safety are raised, Duprex says, scientists have one of two options. They can keep their heads down, do their experiments and hope it will all go away. Or, he says, they can proactively engage the public and provide an informed opinion.
His group has taken the latter approach, "because ultimately we're the people working with these things."
Each of these two groups of scientists now has a website, and each website features its own list of more than a hundred supporters, including Nobel Prize winners and other scientific superstars.
One thing that almost everyone seems to agree on is that, to move forward, there needs to be some sort of independent, respected forum for discussing the key issues.
The American Society for Microbiology has called on the National Academy of Sciences to take the lead. A representative of the Academy says NAS does plan to hold a symposium later this year. The details are still being worked out.
Tim Donohue, a microbiologist at the University of Wisconsin-Madison who is president of ASM, says a similar kind of debate happened back in the mid-1970s, when brand-new technologies for manipulating DNA forced scientists and the public to tackle thorny questions.
"And I think that is a productive exercise," Donohue says, "to have scientists and the public, sitting around the table, making sure each one understands what the benefits and risks are, and putting in place policies that allow these types of experiments to go on so that they are safe and so that society can benefit from the knowledge and innovation that comes out of that work."