
Why We Fall for Fake Science

BLIND FAITH

The retraction of a Science paper claiming that a 20-minute talk with a gay canvasser can change views on gay marriage may not quash the theory. The reason? We trust bogus science.

Photo Illustration by The Daily Beast

Over the last few decades, research in the social, cognitive, and political sciences has shown again and again that earnest, lasting change on a contentious issue is an extremely rare thing.

The body of research certainly paints a pessimistic picture—on issues from climate change to vaccines to evolution to anything else, really, it seems impossible to change a person’s mind about anything with even the slightest personal or political charge. The New Yorker’s Maria Konnikova recently provided a beautiful survey of this literature, and it seems that the best you can hope for in your persuasive efforts is wasting your time; the worst is that you just make the other person more entrenched and sure of their beliefs.

Capturing real ideological change in the lab or the field has been something of a unicorn in social psychology, and it remained elusive until this past winter, when research by Michael LaCour and Donald Green appeared to show that a simple 20-minute conversation with a gay canvasser could have a lasting impact on someone’s support for gay marriage.


These findings, published in Science, received widespread media attention, and understandably so. The problem, though, is that this unicorn—like all others—isn’t real. Green, the study’s senior author, publicly shared his letter to Science asking for the study to be retracted; LaCour, the primary author and a graduate student slated to start a job at Princeton this July, had fabricated the results.

Looking back, the findings really were too good to be true. LaCour told The Washington Post that a 20-minute conversation with a gay person could take someone’s attitudes on gay marriage from a Nebraska to a Massachusetts over the course of a year.

“I truly did not expect to see this,” Green told The New York Times before the fabrication came to light. “I thought attitudes on issues like this were fundamentally stable over time, but my view has now changed.” On a recent episode of This American Life, Ira Glass discussed these results and said that Green and his colleagues read 900 papers and “they haven’t seen anything like this result.” Vox even went so far as to call these findings “miraculous.”

The LGBT canvassing operation really does exist, but it seems the follow-up surveys purporting to show a sustained change in voters’ views never happened. According to a report released yesterday, two graduate students, David Broockman at Stanford and Joshua Kalla at U.C. Berkeley, were intrigued by LaCour and Green’s results and wanted to build on them with their own research. As they began their pilot work, they noticed discrepancies between what they saw and what LaCour and Green had reported, and as they tugged at those discrepancies, LaCour’s story quickly unraveled.

Their report, prepared with the help of Peter Aronow, a professor at Yale, is a fascinating document that provides surprising insight into a clear instance of academic fraud. Ira Glass has since spoken with Green to get the full story, which is remarkable in and of itself.

There’s a further irony to all of this, though, and it’s really something beautiful.

LaCour and Green’s research built off of a body of work showing how difficult it is to correct misperceptions and change minds. Their findings, for the first time, suggested that maybe you really could change beliefs and correct misperceptions, and it was a rare ray of optimism. Now that we know their research was fabricated, though, the only studies we’re left with suggest we’re so insensitive to corrective information that we’ll just keep believing these findings are real, anyway.

One study looked at the pervasive belief that the Affordable Care Act would include “death panels.” Half the participants read only an article about Sarah Palin’s statements on death panels; the other half read about her statements along with information correcting them. The corrective information worked, but not for Palin supporters familiar with politics: those participants became more likely to believe in death panels and to oppose health-care reform.

Another study, looking at misperceptions about WMDs in Iraq, the impact of tax policy on government revenue, and federal policy on stem cell research, found that they were all similarly insensitive to corrections when there was an ideological basis for the belief. It also found similar “backfire effects,” where corrections actually made some misperceptions stronger.

Most recently, a study looking at correcting misperceptions about vaccines showed similarly abysmal results. Researchers tried four different ways to convince nearly 2,000 parents that vaccines weren’t harmful. They cited information from the Centers for Disease Control and Prevention showing that there was no evidence of a link between vaccines and autism, they explained the harms of the diseases vaccines prevent, they showed images of children suffering from those diseases, and they provided a heart-wrenching anecdote about an infant who almost died of measles. None of the interventions made parents more likely to vaccinate their child. The CDC information successfully convinced some parents that there wasn’t a link between vaccines and autism, but it somehow made them less likely to vaccinate, and some of the interventions actually increased the belief that vaccines were harmful.

Of course, what makes this all so perplexing is that we know beliefs change, we just don’t know how. Support for same-sex marriage has rapidly increased in recent years, but no one is quite sure why. Oftentimes, trying to figure out what’s responsible for an ideological shift seems as hopeless as trying to figure out which individual flame on a gas burner causes a kettle to boil.

I reached out to both LaCour and Green via email. LaCour told me what he’s been telling everyone: he’s gathering evidence to release a public statement. Green, however, was willing to answer a few questions. I asked him whether he still believed these findings, given that he had started as a skeptic about genuine ideological change and had just retracted the data he said had convinced him. “As you know, a fascinating literature has emerged in recent years suggesting that people are influenced by information even after it has been discredited,” he told me. “I’m trying not to fall prey to that same cognitive bias.”

Of the gay canvassers, he said, “I certainly think it’s plausible that they were persuasive, but I question whether the persuasive effects endure for more than a few days or weeks. The only way to know is to do this study again, which I fully intend to do.”
