Quick: which infection kills up to 1.2 million people a year, most of them children, causes another 200 million episodes of disease annually, and has developed progressive drug resistance worldwide, yet somehow carries the boutique designation of an “orphan disease”?
It’s malaria, the mosquito-transmitted disease that destroys human red blood cells, causing fever, anemia, and more than one human death per minute. Malaria, though, is scarcely on the radar of Westerners unless they travel internationally: 90 percent of all cases occur in Africa, while most of the rest occur in hot, poor countries across Asia and South America.
After decades of no news, however, there is new hope for the control of malaria. The British pharmaceutical behemoth GlaxoSmithKline has announced that its preventive vaccine showed significant benefit in a large trial of 15,000 children in Africa: clinical episodes of disease were reduced by 46 percent. Now, 46 percent may sound like a paltry number, but it translates into hundreds of thousands of lives saved (assuming, of course, the numbers hold up under FDA scrutiny).
The natural history of malaria, from mosquito to disease to treatment, is among the most thoroughly studied in the world, at least by 18th and 19th century standards. Indeed, malaria had a major impact on how the world was settled in those centuries. People born into malarial regions face fierce Darwinian pressure: the disease affects everyone in the area. Those whose immune systems can’t handle the infection die off as infants or children; yet though many die each year, most people with malaria survive, and with an advantage. Adults who grow up in these areas carry enough immunity that they almost never again become ill with the disease, even when bitten again and again by infectious mosquitoes.
In contrast, those from northern climates have no such immunologic patrimony, nor any childhood winnowing of the immunologically inferior. Therefore, an adult from Northern Europe heading off to Africa a few centuries ago to explore, convert, and spread the Western way wasn’t likely to last very long. The marauding British (and French and Dutch and all the others itching for world dominance) typically came down with malaria tout de suite; missions numbering in the thousands rapidly dwindled to a desperate few. Malaria for a long while preserved the world order.
But then came quinine, perhaps the first effective antimicrobial of them all. It comes from the bark of the cinchona tree, which then grew exclusively in the Andes and had long been used locally to treat chills (and perhaps malaria, though this is less clear). Jesuits in 17th century Peru got hold of it and soon sent ground cinchona bark back to Mother Rome, where, by legend, it saved the lives of malarious popes. It is said to have cured England’s King Charles II as well. Soon it was a very hot property.
There was one practical problem: ground cinchona bark tasted awful, far too bitter for the gentle European palate. And so Europeans sweetened it, giving us so-called tonic water, the “tonic” being the antidote to malaria. (Tonic water made today still notes “contains quinine” on the label.) But the Brits in India took it one step further. They added gin, perhaps as a natural extension of the “Gin Craze,” or perhaps to further disguise the taste, and presto, a star was born. The gin and tonic was a drug one had to take to survive in the tropics, and it also tasted sweet, made a person drunk, and imparted a happy glow to those suffering the pangs of wretched homesickness. Equipped with their gin and tonics, the Brits soon overtook much of the 19th century world, surviving the disease that had killed so many Europeans arriving in the same areas just a generation earlier.
Along with their imperial glee, observant Europeans quickly became aware of the immunologic advantage enjoyed by those raised in the malarious countries the invaders had just subjugated. How easy, it seemed, to rig up a vaccine; after all, the smallpox vaccine using cowpox had been around since the start of the 1800s, and variolation, an inoculation using actual smallpox matter, had been used for centuries. It seemed easy enough...
And now, some 150 years later, a giant baby step toward meaningful control of malaria perhaps has been made. Time will tell whether this vaccine, or the next one, or the one after that is the real deal. No matter: the long tale of man versus malaria is a sobering reminder that, unlike a few of the great magical moments in medicine (Fleming’s moldy agar plates, Roentgen’s vacuum tubes), most progress in improving human health is measured not by the year or even the decade, but by the generation or, gasp, the century. The need for a quick fix leads many to regard scientific progress as some sort of corporate tally sheet to be reconciled each year against a dollar amount and a manpower estimate.
But as malaria, that ancient disease, has reminded us once again, progress, real progress, simply cannot be hurried. Investigators must take the long view, keep deep pockets, and summon infinite patience to try and try again. Which is why, even in a week when the annual Nobel Prizes in science are announced, a 46 percent solution, if verified, is the biggest science news of the week.