Social media companies have pledged to crack down on anti-vaccination messages that have been blamed for this year’s historic measles outbreaks. But a test by The Daily Beast reveals just how easy it is to place an ad filled with blatant medical misinformation on some of the world’s biggest online platforms.
The Daily Beast submitted anti-vaccination ads via Google, Facebook, Instagram, YouTube, Twitter, and Snapchat. Google and Twitter approved ads that repeated widely debunked claims, advocated directly against vaccination, and linked to conspiracy theory websites.
Facebook and Instagram rejected the Beast’s ads, though both have recently allowed anti-vaccine groups to hawk misinformation while simultaneously removing ads from hospitals and government health departments. Snap and YouTube also rejected The Daily Beast’s ads.
Dr. Sean O’Leary, a pediatric infectious disease specialist and spokesman for the American Academy of Pediatrics, said companies that accept such ads are failing their users.
“By spreading this misinformation, they are actively harming children,” O’Leary said. “They shouldn’t accept advertisements from organizations that spread anti-vaccination misinformation.”
The anti-vaccine movement has used social media to turbocharge a dangerous message, one that thrives in conspiracy-prone corners of the internet—and experts say it’s been successful.
The World Health Organization designated vaccine hesitancy a Top 10 threat to public health in 2019. The United States is in danger of losing its official designation as a nation that has eliminated measles. The percentage of young people who don’t get any vaccines has skyrocketed over the past decade, even though scientists are nearly unanimous that vaccination is safe and effective.
The American Medical Association, the nation’s largest group of physicians, has pressured big tech companies in recent years to limit the spread of anti-vaccination content. And some have promised to do better.
Medical conspiracy theorists, like others looking for ways around content moderation policies, often cloak their ideas with innocuous framing to bypass algorithms trained to flag more overt language.
But The Daily Beast took no such measures when it submitted ads for review. And the approvals by Google and Twitter call into question how well the companies’ moderation methods weed out anti-vaccination content.
The Beast briefly deployed three different ads in its test. One ad read “Don’t get vaccinated | Know the truth first | Call now” and linked to Michigan for Vaccine Choice, a group that advocates for expanded vaccine exemption policies. Another said, “Read this before you vaccinate | Want to know the truth? | Vaccines aren’t safe” and linked to vaccineholocaust.org, an anti-vaccination conspiracy theory site. A third said, “Parents for vaccine choice | Medical freedom now” and linked to Children’s Health Defense, an anti-vaccine nonprofit chaired by one of the nation’s most prominent anti-vaxxers, Robert F. Kennedy Jr.
Google approved the two advertisements with blatant language—“Don’t get vaccinated” and “Vaccines aren’t safe”—and even sent multiple email prompts to optimize them. It rejected the ad with the coded language “Parents for vaccine choice | Medical freedom now,” calling Children’s Health Defense a “known culprit.” That might have suggested the group’s site had been blacklisted. But as Dr. O’Leary browsed search results for “vaccines” during a phone call with The Daily Beast, Google served up an ad for Children’s Health Defense atop a list of legitimate health-care sites.
Google allowed our ads to target people searching for conspiracy-minded terms like “vaccines cause autism,” “MMR vaccine autism,” “vaccines autism,” and “mmr autism” as well as the more benign “vaccination” and “influenza,” meaning some people searching for those phrases could see our ads among their results. According to Google’s analytics, the largest number of people clicked on the ads after searching just for “vaccination.”
Twitter approved promotion of three tweets: “Vaccines aren’t safe #vaccinescauseautism,” “Don’t get vaccinated” (featuring a screenshot of vaccineholocaust.org), and “I support Michigan for Vaccine Choice. Parents want medical freedom now!” The day after the promotion, Twitter appeared to have removed the account used to promote the tweets.
In a statement Friday, Google admitted it screwed up. “While our systems work correctly in the vast majority of cases, these ads shouldn’t have been allowed to run and they’ve been disapproved,” it said. “We prohibit ads that promote anti-vaccination content. We know there is always more work to do and we are continually working to improve our processes.” Twitter did not immediately respond to requests for comment.
Anti-vaxxer propaganda isn’t illegal (though some legal scholars have argued it should be), so the legal responsibilities of tech companies are less clear than they are with outright criminal content. But doctors maintain that these groups are promoting ideas that will harm people, especially children.
Dr. Dan Summers, a pediatrician in private practice in Boston and a leading pro-vaccine voice who has written for The Daily Beast, called Google and Twitter’s acceptance of the ads “egregious.”
“Those messages are false, misleading, and they contribute to an ongoing public health problem in the United States. It’s inexcusable to run those ads,” Summers said.
Facebook promised a crackdown on anti-vaccination misinformation in February. Instagram, owned by Facebook, says it downranks users who post anti-vaxx content. Google has said it is improving its search products to prioritize verified information from authoritative sources. But the hydra of misinformation always grows more heads: the Daily Mail reported Monday that Google promised to remove ads for sham homeopathic pills targeting people searching for the measles, mumps, and rubella (MMR) vaccine. YouTube, Google’s subsidiary, has said it will remove advertisements and restrict recommendations for some anti-vaxx videos while directing users to verified information. Pinterest blocked search terms related to vaccines altogether in February because of the continuing prevalence of anti-vaxx content in its search results.
But the results of The Daily Beast’s review point to confusion and unevenness in the moderation of such controversial material. Even with flagrant phrases such as “Don’t get vaccinated,” “Vaccines aren’t safe,” and “#vaccinescauseautism,” the ads passed through Google’s and Twitter’s content review processes.
It’s not the first time, either. Google and Facebook previously allowed ad buyers to target racist and anti-Semitic search phrases. Facebook agreed to a landmark settlement earlier this year that mandated the company bar buyers from targeting specific demographics when their advertisements fall into specific categories—namely credit, housing, and job opportunities—that are protected by anti-discrimination laws.
Experts say medical misinformation is so insidious because it undermines public confidence in vaccines—and if enough skeptics refuse to immunize their children, entire populations can become vulnerable to preventable diseases. Summers attributed the recent measles outbreak, the largest since 1992, in large part to false information found online.
“At a certain point you dip below the percentage [of vaccination] necessary for herd immunity, which allows for the re-emergence of these illnesses. It’s not something that can be prevented with anything other than maintaining a certain threshold of vaccination. Everything that is against that harms public health,” Summers said.
“They shouldn’t be taking money to promote that harm,” he said of Google and Twitter. “It’s not surprising, but it is disappointing. You’d think after [Twitter founder Jack Dorsey] talked so loudly about banning political ads he would be able to turn that discretion to this obvious issue with false information.”
O’Leary consults with other pediatricians on how to increase uptake of vaccines and combat the growing doubts about their safety.
“In some pediatric practices it’s a daily occurrence that people will come in skeptical about vaccines, and it’s often because of misinformation they’ve seen online,” he said. Pediatric patients sometimes die of preventable illnesses because their parents refuse to vaccinate them, he said.
O’Leary said Google’s search results have improved since he began monitoring them for misinformation a decade ago, and searching for vaccine-related terms today does return results from trusted scientific organizations like the U.S. Centers for Disease Control and Prevention. But anti-vaxx ads raise the possibility that an organization could convert a skeptical parent by paying to circumvent a ranking algorithm or a ban.
“It’s surprising to me that they’ll take money for misinformation,” he said. “The algorithm seems to be pushing folks towards more legitimate websites. It looks like they’re putting forth a reasonable effort to stop the spread of misinformation online, but the ad thing is troubling because it’s another way of spreading misinformation.”
Laura Bono, director of marketing at Children’s Health Defense, said Facebook is the anti-vaxx nonprofit’s platform of choice, though increasingly less so since Rep. Adam Schiff (D-CA) sent a letter in February demanding Google and Facebook curb the spread of medical misinformation.
“It’s getting more difficult to boost anything on Facebook,” Bono told The Daily Beast. “Our reach has been considerably turned down... Still, I’m thankful for Facebook, we reach a lot of people that way. Advertising doesn’t make or break us.”
Bono said Schiff’s letter was “asking for censorship” and “seemed like [a] First Amendment problem,” an assertion both O’Leary and Summers vehemently disagreed with.
“They can say anything they want, but Google or Twitter isn’t under any obligation to amplify it,” said Summers. “These are privately held companies. Nobody has a constitutional right to post anything on a privately owned platform.”
Summers thinks the minimum internet companies can do is refuse anti-vaxx ads. He hopes they’ll promote pro-vaccine content for free.
“I think the very least you can do is decide that you’re not going to actively participate in the promotion of scientific misinformation. That should be the lowest bar,” he said.