Simon Hunt

WhatsApp is right to be angry about the UK’s encryption mess

The world’s biggest tech firms have lined up to lambast the latest incarnation of the Online Safety Bill and Investigatory Powers Act. Many, including Apple and Meta, are threatening to withdraw products and services from the UK if the proposed rules become law. The Home Office could become the ‘de facto global arbiter of what level of data security and encryption are permissible’, Apple says. They have a point.

The government wants to force companies to scan the content of their users’ encrypted messages for harmful material, and to give the Home Office advance notice of any security-related software updates so it can approve them. The aim – a noble one – is to ensure that bad actors cannot use encryption to share illegal content, in particular child sexual abuse material.

But all of this runs counter to the original purpose of encryption, namely, to guarantee the safety of our online lives. Almost everything we do on the web, from chatting with friends to sharing client data at work, relies on some form of encryption. It is a technology that armies of engineers have taken years to perfect.

The very principle of the technology, and the protections it affords that we all rely on, risk being unravelled in an attempt to make one area of the internet safer.

A recent open letter by academics on the danger of eroding encryption puts it best: ‘There is no technological solution to the contradiction inherent in both keeping information confidential from third parties and sharing that same information with third parties.’

That, in a nutshell, is the predicament the government faces: you either protect people from encryption or protect them with encryption, but you cannot do both.

Even if the government magically squares this circle, do we really trust the state to be the sole governor of our data and where it goes?

Only last week, the Electoral Commission admitted hackers had got access to the personal data of tens of millions of citizens by infiltrating its systems. That doesn’t exactly instil confidence.

There is also the minefield of deciphering when explicit content is legal and when it isn’t. Algorithms can be designed to do this with some accuracy, but none will be perfect. Police officers could end up viewing private imagery that breaks no laws – and who wants to give them permission to do that?

No one can doubt the importance of fighting against the spread of child sexual abuse material, and no one can doubt the good intentions of the proposed rules in this fight. Yet the tin-eared, slapdash approach to legislation by ministers and regulators is the latest example of the constant government overreach businesses complain about.

A key promise of the Leave campaign was that Brexit would unshackle the UK from burdensome regulations imposed by Brussels. So far, though, that unshackling has merely liberated regulators to crack down on firms they don’t like.

The Competition and Markets Authority has, embarrassingly, stood out like a sore thumb globally in its determination to block Microsoft’s acquisition of Activision (the EU gave it a green light), while founders of London’s biggest fintechs tell me they are fed up with an increasingly ham-fisted approach to regulation, and are looking overseas for somewhere to start their next business.

The government is in a constant muddle when it comes to big tech. One day it cheers its success in welcoming the latest businesses to our shores; the next, it cheers how successfully it is fending them all off. It needs to make up its mind.