Algorithm May Decide Who Is a ‘Contributing Member of Society,’ Civil Rights Groups Warn

ICE asked Silicon Valley for technology to do ‘extreme vetting’ of immigrants. But how do you define ‘positive contributions’ or an innocent person’s risk of becoming a terrorist?

Photo Illustration by Elizabeth Brockway/The Daily Beast

The Trump administration’s call for “extreme vetting” has immigration officials seeking an automated process that can scrape through immigrants’ personal information and digital histories. But civil rights groups say that means turning over the immigration process to an algorithm that would be a technical and ethical nightmare.

Immigration and Customs Enforcement is seeking tech firms to build an “overarching vetting contract that automates, centralizes and streamlines the current manual vetting process.” Specifically, ICE seeks to determine potential immigrants’ risk of terrorism and their potential to be a “contributing member of society.”

ICE says its overtures to tech companies are nothing unusual.

“The Department of Homeland Security is tasked with protecting national security by vetting visa applicants to prevent terrorists and criminals from entering the U.S. and ensuring nonimmigrant aliens comply with the terms of their admission to the U.S.,” an ICE spokesperson told The Daily Beast. “The request for information on this initiative was simply that – an opportunity to gather information from industry professionals and other government agencies on current technological capabilities to determine the best way forward.”

But such a program would be a discriminatory disaster, over 100 technology experts and civil liberties groups said in two letters to the Department of Homeland Security on Wednesday.

An algorithm-based vetting process poses real problems. So few immigrants have committed acts of terrorism that a computer program couldn’t even generate an accurate predictive model, says the coalition of tech experts from some of the U.S.’s top universities and research groups.

“There is a wealth of literature demonstrating that even the ‘best’ automated decision-making models generate an unacceptable number of errors when predicting rare events. On the scale of the American population and immigration rates, criminal acts are relatively rare, and terrorist acts are extremely rare,” their letter to DHS states. “As a result, even the most accurate possible model would generate a very large number of false positives.”
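The letter’s point about rare events is a base-rate problem, and a back-of-the-envelope calculation makes it concrete. The numbers below are purely hypothetical (the letter gives no figures): even a screening model that is 99 percent accurate, applied to a population where only a handful of people pose a real threat, flags far more innocent people than actual threats.

```python
# Hypothetical illustration of the base-rate problem the letter describes.
# Assumes (for illustration only) a model with a 1% false-positive rate and a
# 99% true-positive rate, screening 1,000,000 people of whom 10 are real threats.
population = 1_000_000
true_threats = 10
false_positive_rate = 0.01   # share of innocent people wrongly flagged
true_positive_rate = 0.99    # share of real threats correctly flagged

innocent = population - true_threats
false_positives = innocent * false_positive_rate    # innocent people flagged
true_positives = true_threats * true_positive_rate  # real threats flagged

# Of everyone the model flags, what fraction are actually threats?
precision = true_positives / (true_positives + false_positives)
print(round(false_positives), round(true_positives), round(precision, 4))
```

Under these made-up numbers the model flags roughly 10,000 innocent people to catch about 10 real threats, so fewer than one in a thousand flagged people is an actual threat, which is the “very large number of false positives” the experts warn about.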

And an error in the algorithm means misidentifying innocent people as possible criminals, said Rachel Levinson-Waldman, a senior counsel at the Brennan Center for Justice who helped organize the pair of letters.

“If you’re building an algorithm trying to find witches, you’re going to end up with the Salem Witch Trials,” Levinson-Waldman told The Daily Beast. “You may think you’re sweeping up witches, but you’re sweeping up a whole bunch of people who have cats and brooms.”

And ICE’s computer program wouldn’t even be the best-case scenario the group describes in its letters. ICE wants a program that will help identify “positively contributing members of society” who “contribute to national interests.”

“But when you talk about people making ‘positive contributions,’ there’s literally no definition of what that means,” Levinson-Waldman said. “American law doesn’t have a definition. DHS hasn’t set out a definition. What that means is that presumably the contractors who build this program, maybe in coordination with ICE, will be coming up with proxies for what that means.”

Those proxies might be income- or employment-based. But the language ICE used when discussing the program with tech companies in July was lifted directly from Trump’s first travel ban, since overturned, against immigrants from seven Muslim-majority countries. That has led to concerns that ICE’s program could be written with implicit biases against Muslims or people from the countries the first ban affected, Levinson-Waldman said.


Internet users could also be more likely to get caught up in ICE’s algorithmic crackdown. When meeting with tech companies in July, ICE expressed interest in a program that could scan social media and other digitally available information, documents obtained by The Intercept in August reveal.

That could mean new immigration woes for “anyone who might have been outspoken, might have criticized the administration, made jokes online,” Levinson-Waldman said.

Technology experts said an algorithm is even less qualified to scan social media than a human.

“Errors in human judgement about the real meaning of social media posts are common,” the groups note in their letter.

Run through translation software and then an algorithm, those misunderstandings can pile up, as in the case of a Palestinian man who was arrested late last month after posting “good morning” in Arabic on his Facebook page. Facebook’s artificial intelligence software mistranslated the greeting as “attack them” in Hebrew. Israeli police arrested the man without consulting any Arabic-speaking officers, who could have pointed out the mistranslation. (Facebook later apologized for the error.)

The groups also pointed to an August report in the Nation, which revealed that the DHS was building tools that scraped social media data and allowed the government to search those social media posts by “tone.”

In ICE’s hands, this kind of tool could have immigrants censoring their speech in an effort to meet ICE’s “vague, unmoored notion of what it means to make a positive contribution,” Levinson-Waldman said.