Fever isn’t an illness. It’s the body’s attempt to fight illness. So when we treat fever with antipyretics like acetaminophen (Tylenol) or ibuprofen, we handcuff an important part of our immune response. Although it might seem counterintuitive, several studies have now shown that antipyretics increase the severity of infections. The time has come to get over our fear of fever.
Much has been learned about the importance of fever from studies in animals, which can be divided into two groups: ectotherms and endotherms.
Ectotherms regulate their body temperature using the environment. For example, when lizards want to raise their temperature, they climb to the top of a rock and sun themselves. When they want to lower it, they crawl under the rock.
Mammals, on the other hand, are endotherms. To increase our body temperature, our immune system releases chemicals called cytokines (like interleukin-1, interleukin-2, interleukin-6, interleukin-8, tumor necrosis factor, and others) that travel to a part of the brain called the hypothalamus and reset the body’s thermostat to a higher level. To reach that higher set point, we shiver, crawl under the covers, wear warm clothing, and shunt blood flow away from our arms and legs and toward our core.
In the mid-1970s, Matthew Kluger, a scientist in the Department of Physiology at the University of Michigan, performed a groundbreaking experiment. He infected lizards with a bacterium called Aeromonas hydrophila, then put them in chambers at 38°C (normal lizard body temperature), 40°C (low fever), and 42°C (high fever). At normal temperature, 75 percent of the lizards died; at low fever, 33 percent; at high fever, none. These findings were later extended to goldfish infected with Aeromonas, mice infected with coxsackie B virus or Klebsiella, rabbits infected with Pasteurella, and dogs infected with herpes virus. In every case, animals prevented from developing fever were more likely to suffer or die. All of these studies confirmed, as Kluger had postulated, that fever is an adaptive, physiologic, and necessary part of the immune response.
The first evidence that fever was important in people came before Kluger performed his studies in animals. In the early 1900s, before antibiotics to treat bacterial infections were discovered, the Austrian psychiatrist Julius Wagner-Jauregg injected malaria parasites into the bloodstreams of people with syphilis. The parasites caused high fevers and shaking chills for several days, after which he treated the patients with quinine, an anti-parasitic drug that had been available since the mid-1800s. He found that the high fevers caused by malaria cured syphilis. For this achievement, Wagner-Jauregg won the Nobel Prize in 1927. His observations were later extended to include using fever to treat gonorrhea.
Wagner-Jauregg had shown that fever could be used to treat infections, raising the question of whether reducing fever worsens infections. Many studies have since been performed in children and adults to address this question. The results have been consistent:
• Antipyretics prolonged the excretion of salmonella bacteria in people suffering from this intestinal infection.
• Children with bloodstream infections (sepsis) or pneumonia were more likely to die if their temperatures were lower.
• Antipyretics prolonged symptoms in patients infected with influenza.
• Antipyretics prolonged viral shedding and worsened symptoms in volunteers experimentally infected with a common cold virus called rhinovirus.
• Antipyretics delayed the resolution of symptoms in children with chickenpox.
Consistent with these clinical observations, recent studies have shown why fever is so valuable. At higher temperatures, white blood cells (neutrophils), B cells, and T cells work better. Each of these components of the immune system is important in resolving infections. Neutrophils kill bacteria. B cells make antibodies that neutralize viruses and bacteria. And T cells kill virus-infected cells.
Given all of this information, why are we so intent on treating fever? Why are we so fever-phobic? One reason is that we equate fever with illness. We assume that if we lessen the fever, then we have lessened the illness, when the opposite appears to be true. Another reason is the fear that high fevers can cause brain damage, a concern that hasn’t held up to scientific scrutiny. Yet another is the notion that treating fever will prevent febrile seizures, a phenomenon that, while frightening, doesn’t cause permanent harm. As it turns out, antipyretics don’t prevent febrile seizures anyway.
Probably the most common reason for treating fever is that we feel more comfortable when our temperatures are normal. Fever increases the basal metabolic rate, causing us to breathe faster and our hearts to beat faster. When we have a fever, all we want to do is lie in bed and drink fluids, which is exactly what we should be doing instead of going to work or school and infecting others. Fever is a sign that we should isolate ourselves from the herd.
Pharmaceutical companies haven’t helped. With ads like “Let’s get that temperature down!” “Just what the doctor ordered,” and “When fever and aches have little Tyler corralled—Tylenol,” we are constantly bombarded with the notion that fever is bad and must be reduced or eliminated.
Of interest, Hippocrates, who lived around 400 B.C., had it right. He believed that disease was caused when one of the four humors (black bile, yellow bile, blood, and phlegm) was produced in excess. Fever, according to Hippocrates, cooked the raw humor, leading to healing. Then, in 1899, the German company Bayer introduced aspirin. Suddenly, it became important to treat fever, a notion popularized by the advice, “Take two aspirin and call me in the morning.”
In the final analysis, we should have listened to Hippocrates.
Paul A. Offit, MD, is a professor of pediatrics and director of the Vaccine Education Center at the Children’s Hospital of Philadelphia. His most recent book is Pandora’s Lab: Seven Stories of Science Gone Wrong (National Geographic Press, April 2017).