Investigators of the Paris attacks revealed recently that the terrorists used the encrypted apps WhatsApp and Telegram to communicate and coordinate beforehand. This follows reports from San Bernardino that authorities discovered two smashed cellphones at the scene of the rampage that killed 14 people, and they recovered a third one from the body of the female terrorist.
We don’t know what—if any—evidence was obtained from the devices in San Bernardino, but let’s hope they are not late-model smartphones. If they are, the FBI would have a much harder time learning details of the plot or warnings of where and when radical jihadists might strike again. Already, the FBI is concerned that it doesn’t know the contents of 109 messages that a terrorist exchanged with an ISIS operative in Syria before opening fire at a Prophet Muhammad cartooning conference in Garland, Texas.
These and other events have transformed the debate over end-to-end encryption, which until recently seemed the wave of the future. Last year, Apple (iPhone) and Google (Android) began encrypting all new smartphones by default and throwing away the key. Without the passcode, it is impossible to see what's on the phone. Talk to cops and prosecutors, and they will tell you they're flying blind nowadays, prevented from cracking cases where potentially critical evidence exists on smartphones. In the name of privacy, they say, Apple and Google are giving terrorists and criminals of all kinds a huge break.
Consider this transcript from New York’s Rikers Island, where taping the phone calls of inmates is standard:
INMATE: “I need you to open up your iPhone and go to your operating system. If it’s on operating system 8, an iOS 8, they can’t get into my phone.”
INMATE: “What happen is that Sept. 17, 2014, they opened up… It’s all in the papers… The DA Cyrus Vance who’s prosecuting me is beefing with Apple because they put these phones that can’t be [un]encrypted. If our phones is running on the iOS 8 software, they can’t open my phone. That might be another gift from God.”
Vance is indeed “beefing” with Apple, and the Manhattan DA, normally a placid sort, is on fire over how encryption is aiding criminals. “If the average criminal at Rikers knows it, the terrorist knows it, the sophisticated cyber-criminal knows it,” Vance told me. “It’s only a matter of time before there’s an incident where we say, ‘Who gave [Apple CEO] Tim Cook the right to decide whether a parent can find a lost child?’” Vance added later that it would be “no surprise” if encryption impaired the investigation of the Paris attacks, and he renewed his call for federal legislation to “restore the proper balance between public safety and privacy.” FBI Director James Comey testified before Congress that popular encrypted communications apps are becoming standard “terrorist tradecraft.”
That’s still conjecture, but Comey and Vance are right that law enforcement is being handcuffed by “full-disk” encryption. Last month Vance issued a stinging 42-page “Report on Smartphone Encryption and Public Safety” (PDF) that outlines a series of heinous crimes solved by penetrating earlier-model cellphones. Vance says that more than 120 Manhattan criminal cases have been hampered because search warrants could not be executed on the latest smartphones, though because the cases are under investigation, he wouldn’t explain exactly how.
Vance isn’t referring to “data in motion”—the target of bulk data collection and other controversial surveillance—but only “data at rest.” That’s information stored inside a device—photos, text messages, and other evidence that, contrary to industry claims, is not automatically available in the cloud; cloud backup works only if the user switches it on in the phone’s settings. Even then, the cloud often records only the time and numerically coded recipient of a communication, not its content.
Much of the tech community, bolstered by the Edward Snowden revelations, believes there’s no such thing as a safe “back door” just for the good guys—that once you weaken encryption, you lessen not just security but privacy. But law enforcement isn’t talking about a “Clipper Chip” (a means of government surveillance embedded in phones) or even a standard backdoor key—just a key reserved for executing warrants. “This is not a key that the government has; this is not any sort of master key,” argues Steve Gibson, a software engineer (and originator of the term “spyware”) who hosts the podcast “Security Now” and parts with his tech industry colleagues in siding with law enforcement on this issue. “This would not be an algorithm where, if it got loose, suddenly all Apple iPhones and iOS devices would then be subject to break-in.”
The best analogy is to a house or safe that may contain important evidence in a criminal case. If, after police obtain a search warrant, the suspect refuses to grant entrance, law enforcement has historically required landlords and banks to open the house or safe. They do so under court order and can’t legally allow anyone else in. Tech companies could take similar precautions with their keys. Apple’s having none of this. The company argues that if it reverts to older, less iron-clad encryption, foreign governments could use the key to harass dissidents and other opponents. In typical Apple style, the company offered up a smart, well-informed spokesman who wouldn’t speak on the record. That sounded odd to me, but it turns out no tech company will go on the record on this issue.
The Apple argument rests on the assumption that U.S. courts would issue search warrants to foreign governments without genuine cause (i.e., terrorism or other violent crime), which seems unlikely. Apple further claims that if it didn’t provide a key to foreign governments, those countries wouldn’t allow the sale of hugely profitable Apple products there. Fat chance. The Obama administration appears to have decided that it will push for more monitoring of social media, but not for changes in smartphone encryption, which would require it to take on the powerful tech industry. Congress, still besieged by pro-encryption email, held hearings but doesn’t seem to be moving toward new legislation.
The debate over encryption has been complicated by two interrelated trends, one political and one technological.
Post-9/11, the USA Patriot Act and other legislation shifted the law and government resources toward public safety and more authority for prosecutors, who already have too much power. (Consider how hard it is to get prosecutors to admit error or indict police officers who act criminally.) After every Paris or San Bernardino, someone, usually on the right, argues for taking away more of our digital freedoms. Meanwhile, revelations about the power of the NSA, corporate America, and random hackers to penetrate even the most encrypted communications have led many cyber-security experts to call this a “golden age of surveillance,” when it’s easier than ever to hack into communications (though not into a locked phone). Even hardliners like former NSA director Michael Hayden object to backdoor access as conventionally defined. No one with any sense wants to give the government a key to encrypted data.
But the search warrant problem remains. What’s required, as Hillary Clinton and others have vaguely argued, is for the tech industry to come up with a solution that allows warrants to be executed without opening the back door wide to government or bad guys.
In other words, we need a device-specific “front door” tech solution—a key that works only for that specific smartphone and can access only its contents. We’d need new laws preventing the collection or use of any information obtained from that phone that is not specifically covered in the warrant.
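To picture how a device-specific key could exist without any master key, consider simple two-party secret sharing: each phone generates its own random key at manufacture, and that key is split into two shares, held separately (say, by the vendor and a court-appointed custodian), so that both shares plus a warrant are needed to reconstruct it, and compromising the records for one phone reveals nothing about any other. This is a conceptual sketch only, not any vendor's actual design; the function names `split_key` and `recombine` are hypothetical:

```python
import secrets

def split_key(device_key: bytes) -> tuple[bytes, bytes]:
    """Split a per-device key into two XOR shares.

    Each share alone is statistically random and reveals
    nothing about the key (a 2-of-2 one-time-pad split).
    """
    share_a = secrets.token_bytes(len(device_key))
    share_b = bytes(a ^ k for a, k in zip(share_a, device_key))
    return share_a, share_b

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    """Reconstruct the device key; requires BOTH shares."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

# Each phone gets its own random key -- there is no master key
# whose leak would open every device at once.
device_key = secrets.token_bytes(32)
vendor_share, custodian_share = split_key(device_key)

assert recombine(vendor_share, custodian_share) == device_key
assert vendor_share != device_key  # one share alone is useless
```

The design choice this illustrates is exactly the point Gibson makes above: because every device's key and shares are independent random values, there is no single algorithm or secret whose exposure would "suddenly" subject all iPhones to break-in; an attacker would have to compromise both custodians for each individual phone.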
You’ll hear plenty of pro-encryption arguments for why this is impossible—why we cannot find fresh ways to balance public safety and privacy. But if it’s important enough—and it is—we can.