'We don't have a way to code good guy'
A "magical rainbow unicorn key" for encrypted devices may be what law enforcement wants, but privacy activists argue that any such access would be open to bad actors as well.
If U.S. law enforcement and intelligence agencies get a route around encryption, what’s to stop foreign nations from doing the same?
And is such a route even technologically feasible?
At a discussion last week sponsored by the Christian Science Monitor’s Passcode security blog, FBI Executive Assistant Director for Science and Technology Amy Hess made a circumspect case for access to encrypted personal data – while stressing that she wasn’t calling for any new laws just yet – as privacy advocates pushed back.
Front door, back door, same difference?
Echoing former NSA chief Gen. Keith Alexander, Hess tried to frame the discussion in positive terms.
Time and again, Hess refused to frame the dilemma – should authorities be able to bypass encryption on personal devices, or shouldn’t they? – in binary terms, calling instead for “balance [in] the discussion” and denying that federal authorities were demanding a “golden key” to devices.
“I will be the first person to tell you that we’ve done a really bad job of collecting empirical data,” she admitted when she was asked about the number of FBI investigations that have been stymied by encryption of personal devices.
She offered an anecdote instead, claiming that in one case, a child pornographer was caught only because of metadata on images retrieved from his smartphone – information that would not have been available to investigators if the criminal had strong encryption on his phone.
“We really need the companies to try to come up with the solutions,” she said, emphasizing that she doesn’t want law enforcement to be able to directly pull information off personal phones. Instead, she simply wants companies to be able to comply with court orders for users’ data, meaning they would have to retain both the data and the means to access it.
But if American companies have to comply with American court orders, what happens around the rest of the world?
It wouldn’t just be us
“We live in a multi-country world where there are not shared values across the countries,” noted Jon Callas, security expert and co-founder/CTO of Silent Circle. If companies are forced to keep users’ data and hand it over to U.S. authorities, other nations are sure to start clamoring for the same thing – and they might use the information to commit human rights violations or even kill citizens, he noted.
Laws mandating encryption backdoors in the U.S. would turn tech companies into “supra-governmental authorities” charged with making life-or-death decisions around the globe, Callas warned.
“We don’t have a way to code ‘intent,’” Callas noted. “We don’t have a way to code ‘good guy.’”
And even “good” countries spy on one another all the time, he added, meaning tech companies really have only one good option: say no to all the governments.
“We’re kind of like Odysseus having to lash ourselves to the mast,” he said. “If we yield to one, we have to yield to all. Our best decision is to make it so that nobody other than the end user can get in.”
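The two designs being argued over can be contrasted in a toy sketch. This is illustrative pseudocode-grade cryptography only – a real system would use a vetted cipher, not a hash-based XOR stream – and every name in it is hypothetical, but it shows why Callas treats an escrowed key as a single point of failure: whoever holds it can read every user’s data.

```python
# Toy sketch (NOT real cryptography): key escrow vs. end-user-only encryption.
import hashlib
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR data against a SHA-256-derived keystream (toy stand-in for a real cipher).
    Applying it twice with the same key returns the original data."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

# A user encrypts a message under a key only they hold.
user_key = secrets.token_bytes(32)
ciphertext = xor_cipher(user_key, b"private message")

# Escrow design: to be able to honor court orders, the company keeps a copy
# of the user key wrapped under its own master key. Anyone who obtains
# escrow_key -- the company, a court, or a thief -- can unwrap the user key
# and read the message. That is the single point of failure.
escrow_key = secrets.token_bytes(32)
wrapped_key = xor_cipher(escrow_key, user_key)
unwrapped = xor_cipher(escrow_key, wrapped_key)
assert xor_cipher(unwrapped, ciphertext) == b"private message"

# End-user-only design: no wrapped copy exists anywhere. Without user_key,
# the company (or anyone else) can only guess, and a wrong key yields noise.
wrong_guess = secrets.token_bytes(32)
assert xor_cipher(wrong_guess, ciphertext) != b"private message"
```

The asymmetry is the whole argument: once `wrapped_key` exists, the security of every user rests on `escrow_key` never leaking; delete it, and not even the company can comply with an order.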
Justice Department attorney Kiran Raj didn’t have an answer for the geopolitical issues, but he made the case that companies already keep and access user data, backing up Hess’ point that companies should be able to comply with government orders.
“I just don’t see a world where companies don’t have access to [ostensibly private user] information for their own business purposes,” Raj said.
“It doesn’t make sense for the government to mandate some kind of access regime,” he added, but said the general principle should be easy to apply. “The government goes to the company with a court order, and the company provides the information.”
“A warrant is not a right to that data,” Callas retorted. “It is a right to perform a search to get that data.”
Will it wreck security?
ACLU technologist Christopher Soghoian took to Twitter to press Hess on the FBI’s use of encryption, apparently looking to demonstrate some level of hypocrisy.
“We need our information obviously encrypted and protected,” Hess acknowledged. “We support strong encryption!”
She said, without explaining how it would work, that strong encryption and giving the FBI access to personal data could go hand in hand, creating a more secure digital landscape. And she noted the potential danger of systems that cannot be unlocked at all. “If we go to 100 percent secure systems that nobody can access, ever, are we comfortable with that?” she asked.
For Callas, the answer is yes. And he didn’t buy the lines about beefing up overall security.
“[Third-party access to data] ruins the security that we’re building into things,” Callas said. “We’re putting in the security precisely to stop crime, precisely to stop espionage. It is the companies that are putting in security that you’re getting upset about.”
Another thing he didn’t buy: Law enforcement semantics about front doors and golden keys.
“You’re not asking for the golden key, you’re asking for the magical rainbow unicorn key” that grants only “good guys” access to data while maintaining the security of encryption, Callas said.