BRING IT ON
03.08.13 9:45 AM ET
Why I’m Not Worried About Dying From a Superbug, and You Shouldn’t Be, Either
Pity the poor public-health official: in the midst of an epidemic, he must adopt a soothing, avuncular tone of near-boredom, a "we've seen this, not to worry" sort of yawn to calm people who otherwise seem ready to run screaming into the streets. Yet at the same time, in this day of sequestered public-health funding, he has to raise a major ruckus about some other problem that might happen, swearing that the earth may end soon if we don't wake up now and face the music. The cavalcade of past get-ready-for-the-big-one hits includes drug-resistant TB, avian flu, swine flu, and drug-resistant gonorrhea, among others, each introduced with shrill press releases and snapshots of grim faces peering through microscopes.
It is no surprise, therefore, to see the CDC roll out the heavy artillery this week by proclaiming the dangers of the latest superbug. This one is ugly for sure: a resistant-to-almost-everything bacterium that preys on the hospitalized patient. Called carbapenem-resistant Enterobacteriaceae, or CRE, for the class of antibiotics (carbapenems) to which it is resistant and the group of organisms to which it belongs (Enterobacteriaceae, bacteria that reside in the gut), CRE is being seen increasingly in hospitals across the U.S. Unheard of before 2001, it has now appeared in 181 U.S. acute-care hospitals (4.6 percent of the total), affecting hundreds of patients. In August 2012, the NIH Clinical Center had a widely reported outbreak in which a CRE strain killed six of 18 infected patients, roughly the mortality rate seen in most case series.
The CDC and other public-health officials are particularly alarmed by this latest wrinkle because the carbapenem class was the last thoroughly modern group of antibiotics with predictable activity against gut bacteria. With the carbapenem hegemony now wobbling, the next (and last) line of defense is an oldie from the 1960s that was pulled from the market at the time over toxicity concerns but is now being used in many hospitals and ICUs to treat CRE infection. If and when CRE becomes resistant to this old-timer, the cupboard is truly bare.
This sort of progressive resistance is standard operating procedure for bacteria exposed to high doses of potent antibiotics over time; resistance can and must occur according to the most basic principle of evolution: survival of the fittest. If a billion bacteria are exposed to an antibiotic and just one bacterium, because of a chance mutation, is resistant while the other near-billion are not, that single organism will survive as the others die off. The resistant survivor then has the run of the place, with the nutrients that once supported its billion now-absent brethren, allowing the resistant clone to take root and position itself to spread.
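The back-of-the-envelope dynamics in that paragraph can be sketched as a toy simulation. Every parameter below (the kill rate per dose, the doubling per generation, the carrying capacity) is an illustrative assumption, not clinical data:

```python
# Toy model of antibiotic selection: one resistant mutant among a billion
# susceptible bacteria.  All numbers are illustrative assumptions.

CAPACITY = 1_000_000_000  # the niche's nutrients support about a billion cells

def simulate(generations: int) -> tuple[int, int]:
    """Return (sensitive, resistant) cell counts after repeated antibiotic doses."""
    sensitive, resistant = CAPACITY - 1, 1
    for _ in range(generations):
        sensitive //= 100                        # each dose kills ~99% of susceptible cells
        room = CAPACITY - sensitive - resistant  # nutrients freed up by the die-off
        resistant = min(resistant * 2,           # the resistant clone doubles...
                        resistant + room)        # ...until the niche is full again
    return sensitive, resistant

print(simulate(30))  # the lone mutant's descendants end up owning the whole niche
```

The point of the sketch is the asymmetry: the susceptible population collapses within a handful of doses, while the single resistant cell, compounding exponentially, fills the vacated niche in a few dozen generations.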
We have been here before of course: methicillin-resistant Staphylococcus aureus (MRSA) played through the hospitals and the headlines (and even the National Football League) last decade, alarming the public and spurring new regulations to contain it as well as the application of money, sort of, to develop new weapons. Perhaps because of all the hubbub, MRSA now seems almost quaint and surely not a headline-screaming scourge: mostly contained, a nuisance, a problem, but being dealt with at the right place by the right people. In other words, it has assumed its proper proportion in the world of threats and dangers.
The same likely will happen with CRE. More cases will occur, hospitals will make the necessary adjustments suggested by the CDC, specialists will learn their way around the disease, and eventually the threat and the excitement around it will flatten out. And then the next red-hot development on some other front will emerge, consigning the acronym to oblivion. The problem, though, is this: the mix of steady CDC concern about a real issue that requires attention, a world with infinite capacity for both news and "news," and a perverse public enjoyment of being frightened has succeeded in little other than scaring the crap out of people who might need medical care. Indeed, hospitals seem to occupy the same imagined place as the Overlook Hotel, the cavernous inn Jack Nicholson prowled in The Shining: the last place on earth a sane person would go. Health care in general and hospitals specifically are viewed these days by just about everyone as a veritable killing field, the place where the two inevitabilities, death and taxes, meet daily as people are fleeced and then killed.
Such is not the case. Honest. Yes, I know I am tainted goods because of my conflict of interest: I work in a hospital and I believe in medical care. But please remember that people in ICUs, where CRE and so many other deadly infections lurk, are not denizens of executive suites. They are already quite ill, usually with multiorgan failure from the heart attack or stroke or high-speed automobile crash that brought them to emergency medical care. They then are exposed to the high-tech ballet of life-sustaining futuristic machines, venous and urinary catheters, potent and often toxic medications, and all the rest. They also are exposed to the bacteria in their own intestines, mouth, and skin, as well as those in the environment, not to mention the imperfectly cleaned hands of hospital staff. Horrible, heartbreaking, and fully preventable things happen in ICUs, but many lives are saved there as well.
The demonization of health care has occurred simultaneously with our deepening fascination with the promise of tomorrow, an almost religious belief that medicine is just inches away from conquering just about everything. These two fantastic extremes pervert reality with equal force and fully obscure the truth about medical care in 2013: we are neither in a hell of ineptitude and willful neglect nor just inches from the next great golden age of health. And though hospitals are complicated, difficult places to spend time, the view that, to preserve health, it is safer to avoid care than to seek it is a dangerous and troubling delusion.