Fortress Britain's Coming Crackdown
In the wake of the Manchester attack, the U.K. government is stationing troops in cities and fast-tracking new laws to access encrypted messages.
How do you tighten surveillance in a country that already has one of the highest levels of security monitoring in the world? Britain is about to find out.
In the wake of the horrific attack in Manchester on May 22, the U.K. government has raised the country’s threat level from severe to critical—a state of alert it adopts when a terrorist attack is believed to be imminent. Troops have also been deployed on Britain’s streets to provide extra security, a sight unseen since 2003, when armored vehicles were sent to Heathrow Airport in response to a terror threat.
It’s unusual, and alarming, to see soldiers at key sites such as London’s Downing Street. But their presence on Britain’s streets will probably not be the most striking aspect of the U.K. government’s post-Manchester security crackdown. What will likely have a far greater long-term effect is the acceleration of proposed laws giving British security forces unprecedented powers to monitor citizens’ internet use—above all, their use of encrypted messaging services such as WhatsApp. The Manchester attacker is believed to have sent a WhatsApp message shortly before the attack. On Tuesday, government sources told the Sun newspaper that, in the event of a Conservative Party victory in Britain’s June 8 election, they would push through powers granting security services the right to hack encrypted messages within weeks.
The attack has certainly stepped up calls to tighten digital surveillance, but Britain’s current government has had end-to-end encryption in its sights for a while. Last year, Theresa May, then Britain’s Home Secretary, introduced the controversial Investigatory Powers Act, which grants government agencies the right to amass large quantities of personal data, including medical and tax records, on a scale unknown elsewhere in Europe. As Edward Snowden’s 2013 revelations showed, British security services had in fact been collecting and retaining this kind of data illegally for some time. The Investigatory Powers Act made this data collection legal, a move that Britain’s Open Rights Group damned as “more suited to a dictatorship than a democracy” and that Snowden himself called “the most extreme surveillance in the history of western democracy.”
But these new powers have an Achilles heel: messaging services offering end-to-end encryption, which so far remain inaccessible. Aware that terrorists can communicate through shielded messaging services such as WhatsApp, the U.K. government has been pressuring tech companies to hand over encrypted messages from potential terror suspects—and lambasting their failure to cooperate. Talking to the BBC in March, current U.K. Home Secretary Amber Rudd laid blame at the doors of big tech companies who “understand the necessary hashtags to stop this stuff even being put up.”
There’s an obvious problem here. End-to-end encryption earns its name because messages are encrypted on the sender’s device and can be decrypted only on the recipient’s: the messaging service in between never holds a readable copy, and so has nothing it could hand over. Suggesting that simply knowing “the necessary hashtags” will allow authorities to locate messages by terrorists misunderstands the technology. (It also, by the by, misunderstands what a hashtag is.)
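To see why, here is a minimal sketch of the idea, using the open-source PyNaCl library purely for illustration. (WhatsApp itself uses the more elaborate Signal protocol, but the underlying property is the same: keys live only on users’ devices, so the company in the middle never holds anything it could decrypt.)

```python
# Illustrative sketch of end-to-end encryption with the PyNaCl library.
# Real messaging apps use the Signal protocol, which adds features such as
# forward secrecy, but the core property shown here is the same.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device.
# The private half never leaves the device; the service sees only public keys.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts a message for Bob with her private key and his public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"See you at eight.")

# This ciphertext is all the messaging service ever stores or relays.
# Without Bob's private key, there is nothing readable to hand over.

# Only Bob, holding his private key, can decrypt it.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"See you at eight."
```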
Making encrypted messages accessible to security forces would require an overhaul of the technology itself. Tech companies would have to build an as-yet-nonexistent “back door” into their systems. And that door could then be accessed by hackers, threatening the personal information and privacy of users.
Such potential intrusion seems less flagrant, of course, when you bear in mind the threat it is intended to combat: If hacking a terrorist’s communications could genuinely stop something as awful as children being killed at a concert, a few thousand compromised credit card numbers would seem a small price to pay.
There’s no clear evidence, however, that transforming Britain into a vast surveillance post would necessarily thwart further attacks. Terrorists are nothing if not adaptable. As numerous security experts have noted, they could easily switch to other communication systems, or revert to ones they have used in the past. The perpetrators of the 2015 Paris attacks communicated via unencrypted SMS messages. Security forces failed to prevent the assault not because the attackers’ communications were inaccessible, but because Belgian security officials, short of highly trained staff, failed to communicate with their colleagues.
It’s this failure that should give Britain’s authorities pause. With its policing budget cut by 22 percent between 2010 and 2015, Britain now has half the armed police officers it did fifteen years ago—a drop fueled by recruitment difficulties as well as by the cuts themselves. The purpose of the current army deployment is in part to relieve police of duties such as guarding public buildings, freeing up more officers to combat the current terror threat.
In light of this shortfall, it’s easier to understand why the government is keen to push encryption-busting: It’s a lot easier to demand a high-tech anti-terror tool that doesn’t currently exist than it is to put more trained officers on the streets. That doesn’t mean the proposed internet crackdown will work. Granting the government access to encrypted messages wouldn’t just further erode Britain’s already embattled private sphere. It might well do so without making us safer or freer.