It happens with disturbing regularity. Police shoot someone who is unarmed, all too often a black male. And as the officer recounts their version of what happened, they frequently repeat the same phrase: “I thought he was armed.”
It’s impossible to say exactly what the officer perceived, or what visual information their brain used to determine that a person was armed. Far from acting like reliable, high-definition cameras, our eyes are actually rather imperfect, transmitting only bits and pieces of the whole picture and leaving the rest for the brain to fill in. And in high-stress situations, says a new study in the Journal of Experimental Psychology, our brains prioritize the processing of coarse features over the fine details that would enable someone to tell the difference between a real gun and a cellphone, a can of soda, or even a toy gun.
“How stressed we are affects how we perceive,” said Karin Roelofs, a neuropsychologist at Radboud University in the Netherlands, and senior author of the new study.
Students in introductory psychology classes learn about the fight-or-flight response: when an animal perceives a threat, it prepares to take a stand or run away. There’s also a third option, in which the animal freezes in place. This is the deer-in-headlights phenomenon, and it protects the animal from predators that often hunt by detecting movement. In dangerous situations, humans freeze too, our nervous systems governed by the same hundreds of millions of years of evolution. Animals use the time spent “frozen” to take in information about their surroundings and decide whether to fight or run.
Scientists had generally believed that freezing heightens sensory perception, but no one had actually measured this in the lab. What Roelofs and her team wanted to know was how feeling threatened, and the freezing that follows, alters visual perception in humans. She recruited 34 healthy young adults to complete a task that asked them to judge whether a series of lines was horizontal or vertical. Some of the images contained a few large lines, simulating coarse visual information, whereas others contained many thinner lines to simulate fine detail.
But there was a catch. Roelofs also intermittently displayed a red or a green dot. Occasionally, the red dot was followed by a mild electric shock that was unpleasant but not painful or dangerous. The sight of the red dot elicited freezing behavior. When Roelofs and colleagues measured how well the subjects did, they found that the threat of shock improved performance on the coarse, low-detail images but hampered judgment on the high-detail ones.
“The brain is always making predictions about what we see. It’s generally more important to know if something’s there than what it is,” Roelofs said.
These and other studies help to underline the close links between emotion and perception.
The brain has certain templates that help us predict what to expect, says Aprajita Mohanty, a psychologist at Stony Brook University. If you’re driving on snow, you instinctively look out for icy patches. If you see a black person, years of growing up in a prejudiced culture may make you assume they are armed and dangerous. “Your brain is never really walking into a situation blind,” she said.
However, Roelofs cautions that her study took place under controlled lab conditions, which makes it difficult to say exactly how the results apply to the real world, where tense situations often demand split-second decisions. Other studies offer clues about how the brain makes rapid decisions under stress, such as when a cop pulls a gun on a civilian.
Racial bias is everywhere in America, and police are no more immune to it than anyone else. Social neuroscientist David Amodio of New York University has spent his career studying how thoughts and emotions, including stereotypes, affect perception and behavior.
One of his studies asked a racially diverse group of participants from countries around the world to play a computer game in which they acted as a police officer who had to decide whether the person on screen was armed and whether to shoot. Regardless of their own ethnicity, the American participants were far more likely to shoot African Americans, whether or not those figures were armed.
When Amodio and his team tracked the participants’ eye movements to see what they were looking at, he found that people always looked at the face of the person on screen before shifting their gaze to the object that person was carrying. The problem was that they decided whether or not to shoot before turning their attention to the object to determine whether it was a gun or something non-threatening. His other social neuroscience studies show that people often exhibit decreased neural processing of faces of other racial or ethnic groups, meaning that, in some ways, they see those faces as less human.
“If you’re under stress and need to act quickly, you tend to rely on mental shortcuts” such as prejudice and stereotypes, Amodio said. “If someone is amped up and afraid because there might be a shooter and they see a kid, like with what happened with Tamir Rice, they say, ‘When I drove up, I saw a man who looked to be armed.’ In the split second it took for the officer to drive up and shoot, it’s quite possible that all of these instincts led to that decision.”
This isn’t to say that the appropriate response to these shootings is a defense of “my brain made me do it.” Rather, Amodio says, the goal of his work is to counter these prejudices and to understand how people make these snap decisions, so that police officers can be better trained. Racial bias plays a key role because it’s the raw material from which the brain fills in our perceptual gaps.
“We need to raise awareness of how bias might affect what police officers see,” Amodio says.
To Mohanty, our perception is driven by our biases. “There is no such thing as true perception,” she said. “Modifying our beliefs can change how we perceive.”