Rosenstein's ‘Responsible Encryption’ Proposal Is Nonsensical, Privacy Expert Says
Companies could build backdoor access into their products, but doing so would leave users vulnerable.
A top Justice Department official wants the tech industry and law enforcement to find common ground on “responsible encryption,” but privacy advocates don’t think his plan will prevent the government from abusing its power.
Deputy Attorney General Rod Rosenstein said federal agencies shouldn’t have the ability to directly break into encrypted personal devices, but tech companies should retain backdoor access to assist investigators in criminal cases. Striking a balance between “data security” and “the needs of law enforcement” would enable agents to pursue crimes with exclusively digital evidence, he said Monday at the 14th annual State of the Net conference.
In crimes where the evidence is usually digital rather than physical, such as child pornography, investigators can’t obtain incriminating material unless they can get into the encrypted devices where it’s often stored, Rosenstein said.
“You’re going to be allowing criminal activity to occur without law enforcement being able to intervene,” he said. “We favor encryption, but not encryption to the exclusion of legitimate law enforcement concerns.”
But even if tech companies act as middlemen between federal investigators and encrypted information, the fact that a backdoor exists in the first place reduces the inherent security of personal devices, according to Joseph Lorenzo Hall, chief technologist at the Center for Democracy and Technology. He called Rosenstein’s proposal “nonsensical.”
Even if companies never misuse their encryption keys, Hall said, building an intentional backdoor would leave everyone more vulnerable to cyberattacks. That is especially concerning because the vulnerability would be built not only into the personal devices of everyday citizens but also into those belonging to top government officials, he said.
Tech-savvy lawmaker Rep. Will Hurd, R-Texas, has called it “technically impossible” to have strong, secure encryption with any kind of backdoor, and Rosenstein himself acknowledged that citizens would sacrifice some security if companies retained backdoor access.
“Having a key creates more risk than having no key,” he said. “The question is can you engineer a system that is sufficiently secure [where] there is adequate assurance that they will not be wrongfully accessed.”
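To illustrate the risk Rosenstein concedes here, the sketch below shows, in Python, one hypothetical way a “responsible encryption” scheme could work: the device’s data key is wrapped once for the user and once for a vendor-held escrow key. The library, names and design are illustrative assumptions, not any company’s actual implementation; the point is simply that the escrow copy creates a second, independent path to the plaintext.

```python
# Hypothetical sketch of key escrow, using the "cryptography" package.
# The escrow copy of the data key is the "backdoor": a second way to decrypt.
from cryptography.fernet import Fernet

# One symmetric data key protects the device contents.
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"device contents")

# Ordinary design: only the user's key can unwrap the data key.
user_key = Fernet.generate_key()
wrapped_for_user = Fernet(user_key).encrypt(data_key)

# "Responsible encryption" design: the vendor also holds a key that can unwrap it.
vendor_escrow_key = Fernet.generate_key()
wrapped_for_vendor = Fernet(vendor_escrow_key).encrypt(data_key)

# Anyone who obtains the vendor's escrow key (a company answering a court order,
# an insider, or an attacker who breaches the vendor) can recover the data key
# and read the plaintext.
recovered_key = Fernet(vendor_escrow_key).decrypt(wrapped_for_vendor)
print(Fernet(recovered_key).decrypt(ciphertext))  # b"device contents"
```

Whoever holds that vendor key, whether a company responding to a court order, a rogue insider or an attacker who breaches the vendor, gains the same access, which is the added risk Hall describes.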
Rosenstein’s plan would also leave room for the government to abuse its power, Hall told Nextgov. Even if agencies didn’t hold the encryption keys themselves, they could acquire them with a single court order, he said.
The federal debate over encryption largely began with the high-profile fight between the FBI and Apple over unlocking the iPhone used by San Bernardino shooter Syed Farook, a device the bureau eventually paid an outside firm to open.
While law enforcement officials argue “responsible encryption” is necessary to keep the country safe, Hall said “trusting citizens with certain freedoms” is one of the central foundations of democracy.
“[Agencies] seem to think anything they have a lawful court order to get access to, they should get access to,” he said. “We think there should be clear places where the government should never be able to reach.”
Hall told Nextgov that agencies should face limits on how much direct access they have to citizens’ digital lives, and that investigators could instead take advantage of the “ubiquitous” security holes already present in personal devices to access the information stored on them.
He said government hacking operations and the kind of intensive police work used to capture Silk Road founder Ross Ulbricht offer investigators ways to get around encryption without compromising people’s right to privacy.