Very few, if any, areas in medicine—and its practice—are absolute. But plenty of sound data exists—if you can find it—to help you make the best decisions.
The problem is, many people choose the wrong options based on misinformation, or they continue to uphold erroneous beliefs and biased opinions, accepted as dogma, despite irrefutable evidence to the contrary.
I’ve noticed this in my own practice. At UCLA, I encounter misinformed patients whose ideas about what’s “good” for them—or a family member—are often wrong. They’ll ask questions about something they’ve heard in the media or read online, and they’re surprised by my answers, which often contradict conventional public wisdom.
Some examples: My child stopped eating dairy and he doesn’t get sick anymore… I only get the flu when I get a flu shot… Did you know that throat lozenges can cure strep throat? I read that in the newspaper.
The explosion of health-related information leads to misinterpretation. Americans have a tendency to trust and relate to people with high profiles, elaborate websites, thousands of likes and followers on social media, and heart-rending anecdotes, rather than to dry, impersonal medical data with no relatable faces attached to the graphs and text on a page.
But what happens when this tendency intersects with science? Every day the media delivers swarms of health-related information that can swiftly trigger fear or incite us to change our habits overnight. From headlines about the limitations of the influenza vaccine to the alleged ills of gluten, sugar, and genetically modified (GM) foods, and the chance of getting cancer from acidic diets, the onslaught of news can be downright overwhelming.
It’s also potentially harmful. One day coffee is good for you and protective against dementia, the next day it’s declared a potential carcinogen. Even more bizarrely, it’s recommended in enema form for improved liver and digestive health.
Claims that routinely circulate are frequently overblown (“diet cures cancer”), misleading (“coffee enemas detox the body”), based on substandard research (“fish oil supplements are good for you”), or completely false (“vaccines cause autism”). Some of the ideas are hocus pocus, created to prey on the vulnerable. Common offenses include exaggerating the benefits of vitamins, herbs, supplements, homeopathy, anti-aging schemes, cold remedies, and unconventional anticancer programs. Talented online scammers, the 21st-century version of the snake oil salesmen of the past, know how to lead the public into believing in causation where there is none, building hype for everything from arnica to zinc.
Unfortunately, many individuals don’t know where to turn for unbiased, trustworthy advice, and the ease with which misinformation proliferates on the Internet leaves people’s heads spinning.
In early March of this year, an alarming study published in Science came to some astonishing conclusions. False stories spread significantly farther and faster than true ones. Falsehoods were 70 percent more likely to be retweeted—even when controlling for the age of the original tweeter’s account and whether or not Twitter had verified the account as genuine. The old adage that a lie travels halfway around the world before the truth gets its boots on held true.
The MIT-based researchers attributed this phenomenon to two main factors. First, humans love to devour whatever grabs their attention, and false stories often come with outrageous, this-can’t-be-true headlines. In the pre-Google days, tabloid magazines caught our eye in the checkout line at grocery stores; catchy false headlines with shocking visuals are great fodder. But now the stakes are higher, and the Internet presents a vastly different situation. Financial incentives are the second explanation: more eyeballs on a website can mean more money for the site (and its shareholders).
So it’s no wonder the social media advertising market creates incentives for broadcasting false stories, as their wider diffusion makes them more profitable.
And then there’s social media, which has come under fire recently not only for disseminating false news but for misappropriating the data we share online. While many practitioners roll their eyes when they hear “I did my research” from a patient, a doctor like me has to admit that sometimes that research can be sound. If a patient with a rare disease presents articles about it or has conducted some smart crowd-sourcing on a social media platform, many of us doctors will be grateful to have been saved some extra work.
People also run into trouble when looking for information online based on preconceived notions. If you believe that mega-dosing on vitamin C prevents colds, for example, you will seek out (and easily find) sites promoting this notion. If you think that juice cleanses offer better health and well-being, you’ll land on websites selling just that. And if delaying vaccines is your conviction, online sources abound.
In a world where there are plenty of life-threatening risks to worry about, from traffic collisions to the flu to cancer, we don’t need the news to be hazardous to our health.
Nina Shapiro, M.D., is the author of HYPE: A Doctor’s Guide to Medical Myths, Exaggerated Claims and Bad Advice—How to Tell What’s Real and What’s Not, which St. Martin’s Press will publish on May 1, 2018.