After decades of effort, scientists have managed to create a vaccine that reduces the risk of HIV infection. It is only a partial success, reducing infection by about a third. But it is a breakthrough that looked as if it might never come. Especially since a group of leading HIV researchers made a concerted effort to stop this particular trial from going ahead.
Five years ago, 22 of the brightest names in HIV vaccine research wrote an open letter to the journal Science. The trial, to be carried out in Thailand, would cost $119m and involve two vaccines that had been shown to be useless. It would, they said, be a waste of time and resources. The lead signatory, Dennis Burton, an Aids researcher at the Scripps Research Institute in La Jolla, California, told the Associated Press, "Everything I've seen about the Thai trial suggests that it doesn't have a prayer."
Thankfully, Burton has been proved wrong. But he wasn't the first scientist to try to stop what looked pointless and turned out not to be. The Cambridge bosses of Francis Crick and James Watson told the pair to give up researching the structure of DNA because they had hit so many dead ends. The inventors of the transistor were also ordered to stop; in the end, John Bardeen and Walter Brattain hid their experiments on a trolley in a closet at Bell Labs. When no one was around, they would get it out and kick-start the electronics revolution in secret.
Scientists attempt to scuttle their own ship for good reasons: it is meant to be difficult to do extraordinary research. Science is a consensus about how things work. Making scientific progress means taking that consensus and proving it wrong – which is always going to be a tough task. In many ways, doing new science is like being at a wedding and getting to your feet when the vicar asks if there is any just cause or impediment. It requires an absurd level of conviction that, despite what everyone else thinks, you really are doing the right thing. Those who have built or subscribed to the consensus are rarely happy about such impudence – and it is their job to try to silence the dissent with experimental evidence.
Given that, you might wonder how science ever progresses. But that is the beauty of the system: unstoppable curiosity, coupled with sheer bloody-mindedness and a rhino-thick skin, can overcome the resistance. The stories of many Nobel laureates are of ridicule and persecution worn down by dogged persistence; the road to Stockholm is lined with jeering colleagues.
In 1970, for instance, a young researcher named Lynn Margulis applied for funding from the US National Science Foundation. Her research seemed to show that certain organisms could reproduce using genetic material that was not in the nucleus. This broke all the rules of biology, but it opened up an intriguing path. Perhaps complex cells had evolved by engulfing simpler, free-living cells, which kept their own genes and their own ways of reproducing. Margulis's grant proposal was turned down flat, with the additional warning that she should never apply to the NSF for funding again. She pursued the research by other means, and her idea, endosymbiotic theory, is now the standard textbook explanation for the origin of mitochondria and chloroplasts. It never brought her a Nobel prize, but it did earn her the US National Medal of Science.
Barbara McClintock's is a similar story, but she had to wait a lot longer for vindication. Her Nobel came in 1983, but she had the dangerous idea that earned it in the 1940s. She had noticed that every cell in an organism contains the same set of genes, and yet there are many different kinds of cells. At work in the cells, McClintock suggested, were controlling elements that dictated what kind of cell each would become. No one was having it: her peers were happier to believe the process was random, not controlled. After a decade, McClintock gave up and waited for others to "discover" the controlling elements. When you know you are blowing apart the mainstream view, you "must await the right time for conceptual change", she later said.
In the physical sciences, the ridiculed path to a Nobel prize seems to be rarer, but there are examples. In 1935, the Indian astrophysicist Subrahmanyan Chandrasekhar proposed that dying stars above a certain mass must collapse in on themselves – the idea that would eventually lead to black holes. Arthur Eddington, the most eminent astronomer of the day, didn't like Chandrasekhar or his idea, and labelled it preposterous. Chandrasekhar dedicated the rest of his life to proving Eddington wrong, picking up his Nobel in 1983.
Not everyone survives long enough to overcome the ridicule – or to collect a Nobel prize. (They are awarded only to the living.) Alfred Wegener, the first person to propose that the continents might drift, made his case in 1912. Within a few years, entire symposia had been organised around rubbishing this "ridiculous" idea. Wegener was finally vindicated at a meeting of the Royal Society in 1964. By then, he had been dead 34 years.
Wegener could take comfort in the fact that sometimes even a Nobel prize is not enough to silence your critics. The electrical engineer Hannes Alfvén won his Nobel in 1970, only after a decades-long fight against the physicists whose discipline he was invading. But even after his appearance on the platform in Stockholm, he was often still forced to publish in obscure journals. And when Stanley Prusiner won a Nobel prize in 1997 for showing that a protein lay behind scrapie, CJD and mad cow disease, the award brought out colleagues baying for blood. On the day of the announcement, the head of neuropathology at Yale complained the award would stifle other lines of inquiry – and that a virus would ultimately be shown to be responsible for the diseases. The director of the federal Rocky Mountain Lab in Montana said, "There is no direct proof that a protein alone is the infectious agent (or that a virus isn't involved)."
Having ridiculous ideas, of course, is not always a sign of brilliance. Many good scientists have tried to prove things that seem ridiculous to us now. There are the flat-Earthers and those who believed the Sun orbited the Earth; given limited data, those are not unreasonable beliefs, and the scientific process gradually removed them. Spontaneous generation – the idea that snakes could come from horse hairs, mice from cheese and maggots from meat – lasted from Aristotle's time well into the 18th century, long after Antonie van Leeuwenhoek, the Dutch pioneer of microscopy, had sent the Royal Society observations showing that fleas and weevils hatch from eggs like everything else.
Then there was the ether, a ghostly fluid that was supposed to fill all of space and provide a medium through which light could travel. In 1887, the scientific world was stunned when the Michelson-Morley experiment failed to detect it. Later in his career, Eddington tried to prove that the values of nature's fundamental constants provide a route to a theory describing the whole universe. They don't, and Eddington suffered merciless ridicule for his convictions. Hermann Bondi, Thomas Gold and Fred Hoyle's "steady state universe", the idea that the universe has always existed, won mainstream support before it was dispelled by the 1964 discovery of the cosmic microwave background – the leftover radiation from the big bang.
Does this long swim against the tide – whether ultimately vindicated or not – push a scientist over the edge? Sometimes. The pressure to do something new and different means that science is perhaps the one area where you really "don't have to be mad to work here, but it helps". A somewhat detached personality is not uncommon among scientists, and a few years of relentless persecution or ridicule can certainly be enough to tip an already singular mind into strange behaviours and beliefs. Celebrated scientists such as Margulis, Crick, Kary Mullis and Brian Josephson have, between them, voiced support for scientifically dubious notions such as telepathy, astrology, alien visitations and scepticism over whether HIV causes Aids.
But scientists who are ultimately successful in proving their ridiculous idea to be brilliant can at least get the last laugh. Barry Marshall's 2005 Nobel prize for spotting a link between gastric ulcers and infection with the bacterium Helicobacter pylori came after years of opposition, because the stomach was considered too acidic an environment for bacteria to survive in. Most of Marshall's papers were rejected; even the accepted papers were held up for no discernible reason. Eventually, Marshall resorted to infecting himself – he drank a culture swarming with H. pylori – to prove his point. When he collected his Nobel prize, the citation referred to a scientist who "with tenacity and a prepared mind challenged prevailing dogmas". Marshall was a little more explicit. "Before finishing I want to acknowledge all those scientists who failed to recognise HP," he said in his acceptance speech. "Without them I would have had a very different career."
Such perverse delight in taking the position of outsider, no matter what derision it brings, is what makes a good scientist great. And, as this week's breakthrough shows, we should be grateful that these masochistic maniacs are out there.
Michael Brooks is a consultant to New Scientist magazine and author of 13 Things That Don't Make Sense (Profile)