WhatsApp says it would rather face a UK ban than weaken its security
Popular messaging app WhatsApp has said it would rather have its service banned in the UK than comply with the government’s proposed Online Safety Bill.
It believes the bill, if enforced, would weaken the privacy of its service for users by undermining its end-to-end encryption, which ensures that no one other than the sender and receiver of messages on the platform is able to view their contents.
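For readers unfamiliar with the mechanism, the short sketch below illustrates the end-to-end property using the PyNaCl library. This is purely an illustration, not WhatsApp’s actual implementation (which is built on the Signal protocol): a message encrypted for a recipient’s public key can only be decrypted with that recipient’s private key, so the service carrying the message never sees the plaintext.

```python
# Minimal sketch of the end-to-end encryption property using PyNaCl.
# Illustrative only -- WhatsApp's real implementation uses the Signal protocol.
from nacl.public import PrivateKey, Box
from nacl.exceptions import CryptoError

# Each party generates a key pair; only public keys are ever shared.
sender_key = PrivateKey.generate()
receiver_key = PrivateKey.generate()
middleman_key = PrivateKey.generate()  # e.g. the server relaying the message

# The sender encrypts with their private key and the receiver's public key.
ciphertext = Box(sender_key, receiver_key.public_key).encrypt(b"hello")

# The intended receiver can decrypt the message...
assert Box(receiver_key, sender_key.public_key).decrypt(ciphertext) == b"hello"

# ...but anyone else, including the service in the middle, cannot.
try:
    Box(middleman_key, sender_key.public_key).decrypt(ciphertext)
except CryptoError:
    print("ciphertext is unreadable without the receiver's private key")
```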
The controversial bill is aimed at tackling the growing spread of child-abuse material by allowing the nation’s communications regulator, Ofcom, to require encrypted messaging apps to use ‘accredited technology’ to identify and remove such material.
Undermining privacy
Head of WhatsApp Will Cathcart said that “98% of our users are outside the UK, they do not want us to lower the security of the product,” adding that “we’ve recently been blocked in Iran, for example. We’ve never seen a liberal democracy do [this].”
Signal, another popular privacy-focused messaging app, has already threatened to leave the UK should the bill become law. CEO Meredith Whittaker tweeted in support of WhatsApp, saying that she looked forward to working with Cathcart and others to “push back” against the bill.
Cathcart believes the UK is setting a bad example for other liberal democracies to follow, saying that “when a liberal democracy says, ‘Is it OK to scan everyone’s private communication for illegal content?’ that emboldens countries around the world that have very different definitions of illegal content to propose the same thing.”
He also raised concerns that other countries may define illegal content differently and ask messaging services to scan for it: “If companies… scan the content of [users’] communications against a list of illegal content, what happens when other countries show up and give a different list of illegal content?”
On the other side of the table, the UK government and the National Society for the Prevention of Cruelty to Children (NSPCC) argue that message encryption prevents child-abuse content from being identified and stopped from spreading online.
“It is important that technology companies make every effort to ensure that their platforms do not become a breeding ground for paedophiles,” the Home Office said.
Richard Collard, the Associate Head of Child Safety Online Policy at the NSPCC, added that the bill “will rightly make it a legal requirement for platforms to identify and disrupt child sexual abuse taking place on their sites and services.”
He also said that these companies could develop “technological solutions” that protect users’ privacy while still ensuring the safety of victims of child abuse.
He claimed that “experts have demonstrated that it’s possible to tackle child-abuse material and grooming in end-to-end encrypted environments”.
The UK government clarified that end-to-end encryption is not being banned, and that privacy and child safety are not mutually exclusive in an online context.
However, critics argue that the only way to check for illegal content is to scan messages on a user’s device with an additional service, meaning that the contents of those messages are no longer private.
Lawyer Graham Smith compared it to digging a hole to bypass a fence without breaking it, tweeting “once the hole has been dug, you might as well not have the fence.”
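To make the criticism concrete, here is a hypothetical sketch of what on-device (“client-side”) scanning could look like: each outgoing item is fingerprinted and checked against a list of known prohibited content before it is encrypted, which is precisely the step critics say undermines the privacy guarantee. Real proposals typically rely on perceptual hashing of images rather than the plain SHA-256 used here; every name in the sketch is an assumption for illustration.

```python
# Purely illustrative sketch of client-side scanning, the mechanism critics object to.
# Real systems would use perceptual hashes of media, not SHA-256 of raw bytes.
import hashlib

# Fingerprints of known prohibited content, supplied by an external authority.
BLOCKLIST = {
    hashlib.sha256(b"known illegal file bytes").hexdigest(),
}

def scan_before_encrypting(message: bytes) -> bool:
    """Return True if the message may be sent, False if it matches the blocklist.

    The crucial point: this check runs on the user's device *before* end-to-end
    encryption, so every message is inspected against a list the user cannot see.
    """
    return hashlib.sha256(message).hexdigest() not in BLOCKLIST

if __name__ == "__main__":
    print(scan_before_encrypting(b"hello"))                     # True: allowed
    print(scan_before_encrypting(b"known illegal file bytes"))  # False: flagged
```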
Dr Monica Horten of the Open Rights Group warned that the bill could turn WhatsApp into a “mass-surveillance tool”, as every user’s messages could potentially be scanned.
The Information Commissioner’s Office, which is reportedly working closely with Ofcom, told BBC News that “Where less intrusive measures are available, they should be used”, and said it supported “technological solutions that facilitate the detection of illegal content without undermining privacy protections for everyone”.