The Trump administration has revived the debate over “end-to-end encryption” — systems so secure that the tech companies themselves aren’t able to read the messages, even when police present them with a warrant.
“It is hard to overstate how perilous this is,” U.S. Attorney General William Barr said in a speech last fall. “By enabling dangerous criminals to cloak their communications and activities behind an essentially impenetrable digital shield, the deployment of warrant-proof encryption is already imposing huge costs on society.”
Barr has been concerned about this for years, but he has become more vocal recently as encryption goes mainstream. It’s now built into popular services such as WhatsApp and Skype, and Facebook has said it plans to encrypt its Messenger service as well.
Republican senator and Trump ally Lindsey Graham recently floated legislation that would strip tech companies of their liability protection under Section 230 of the Communications Decency Act unless they comply with as-yet-undefined “best practices.”
Riana Pfefferkorn, at Stanford University’s Center for Internet and Society, says it’s a safe bet the best practices would include a requirement that law enforcement get access to encrypted content.
“The bill as it’s drafted does a bizarre and alarming end run around normal legislative or even agency rule-making processes,” Pfefferkorn says, giving the attorney general “the keys for deciding what rules apply on the Internet.”
Recent headlines have focused on the FBI’s disputes with Apple, which has refused to help investigators break into iPhones recovered after high-profile terrorism attacks. But those situations are relatively rare. In practice, it’s local law enforcement that more often finds itself frustrated by encryption.
“It comes into play at least once or twice every single week,” says Capt. Clay Anderson, who supervises investigations at the Sheriff’s Office in Humphreys County, Tenn.
“Human trafficking and sexual-exploitation-of-minor cases — those are very frequent,” Anderson says, “and in those cases you run into dead ends because you can’t get past encryption.”
There’s often little point in getting a warrant for a sexual predator’s digital messages, he says, because the messaging company isn’t able to produce anything. If police could get those communications, he says, it would be easier to build cases for prosecutors.
“Who needs that kind of encryption, other than maybe the military?” he asks. “We don’t even — in law enforcement — use encryption like that.”
Moxie Marlinspike thinks regular people want it. He’s the software developer who co-created the encryption protocol used by WhatsApp and other services.
“People’s expectations when they send someone a message is that that message is viewable by themselves and the intended recipient,” he says. “And people are always very disappointed when that turns out to not be true.”
Marlinspike runs Signal, a nonprofit messaging app popular with security-conscious users such as journalists and government officials. He says Signal rarely gets served with warrants anymore because law enforcement agencies have learned there’s nothing to get.
“That’s the whole point,” he says. “Not even we, the creators of the software or the operators of the service, are capable of inspecting message content.”
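Marlinspike’s point — that the operator of an end-to-end encrypted service only ever handles ciphertext it cannot read — can be sketched with a toy example. This uses a one-time pad for simplicity; it is not Signal’s actual protocol (which uses authenticated encryption with continuously ratcheting keys), but the property it illustrates is the same: the key lives only on the two endpoints, so the relay server has nothing useful to hand over.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # One-time pad: XOR each plaintext byte with a key byte.
    # Toy illustration only -- real end-to-end systems use
    # authenticated encryption, not a raw XOR pad.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

# Sender and recipient share the key; the server never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))

ciphertext = encrypt(key, message)   # all the server ever relays or stores
assert decrypt(key, ciphertext) == message  # only the key holder can read it
```

A warrant served on the server in this sketch would yield only `ciphertext`, which is statistically indistinguishable from random bytes without the key.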
It’s an anti-government, anti-Big Brother attitude that’s common in the tech world, and it was challenged by Darrin Jones, assistant director of the FBI’s Information Technology Infrastructure Division, at a recent panel discussion with privacy experts.
“Most of these folks have never been a victim and have never worked a crime scene,” he said. “And if that’s not part of your experience, it’s easy to make this a philosophical debate.”
But the privacy activists also have a technological argument against making encrypted messages available to police: It would require companies to create a secret method for decrypting users’ messages — a “back door,” as some call it — which could end up being exploited by malicious hackers.
Marlinspike says systems such as Signal are secure because they’re “open source” software, meaning all of their programming code is public — there is no hidden code in which a back door could be concealed.
“I think a lot of people might not expect [transparency] to be a good thing for privacy,” he says. But because the software is open source, that “allows experts to review that and make sure there are no vulnerabilities.”
One of those experts is Matthew Green, associate professor at Johns Hopkins University’s Information Security Institute. “The crypto that’s in Signal is really the best stuff out there,” he says. “There’s no such thing as a perfect piece of software, but it’s really well written, and the authors of Signal have thought through a lot of the important security problems.”
If even part of the app’s software were closed, to conceal a back door for police warrants, independent experts like Green wouldn’t be able to vouch for the system’s security with the same confidence.
But law enforcement officials remain unconvinced that a secure back door for their warrant requests is impossible to build.
“To suggest that this is not possible, I just can’t buy that,” says the FBI’s Jones.
He says the security of everyday encryption has to be weighed against the cost to public safety.
“[It offers] some small incremental increase in security in my messaging or my Amazon shopping list, but I have to accept the premise that there are going to be people that are victims? No, I can’t go there.”
Copyright 2020 NPR. To see more, visit https://www.npr.org.