UK Online Safety Bill To Become Law – and Encryption-Busting Clause Is Still There

The UK Parliament has passed the Online Safety Bill, giving the government powers to introduce online child protection measures, and the legislation still includes clause 122, the infamous "spy clause," albeit with some caveats.

The most recent published version of the bill includes the contentious passage under which, if comms regulator Ofcom considers "that it is necessary and proportionate to do so, they may give a notice described in subsection (2), (3) or (4) relating to a regulated user-to-user service or a regulated search service to the provider of the service." In theory, this could also encompass private communications that make use of encryption technology provided by platforms.

In February, encrypted chat service Signal threatened to end UK operations if the British government did not reconsider its stance. Then in April, other end-to-end encrypted platforms, including Element, Session, Threema, Viber, WhatsApp, and Wire, urged UK lawmakers to rethink the bill.

Under the new legislation, UK media and communications regulator Ofcom could fine companies up to £18 million ($22.28 million) or 10 percent of their global annual revenue, whichever is greater. The sections covering media and messaging appear to have been watered down by the most recent amendments, after being pored over by the Lords and the House of Commons.
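To put that penalty cap in concrete terms, here is a minimal sketch of the "whichever is greater" calculation in Python; the revenue figure is illustrative and the function name is ours, not anything drawn from the bill or from Ofcom guidance.

# Sketch of the Online Safety Bill fine ceiling: the greater of a flat
# £18 million or 10 percent of a company's global annual revenue.
def max_osb_fine(global_annual_revenue_gbp: float) -> float:
    flat_cap = 18_000_000                              # £18 million
    revenue_cap = 0.10 * global_annual_revenue_gbp     # 10% of global turnover
    return max(flat_cap, revenue_cap)

# Illustrative example: a platform turning over £5 billion a year
print(f"£{max_osb_fine(5_000_000_000):,.0f}")          # £500,000,000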

In a statement, the government said the bill gives people powers to make sure illegal content is removed. It will also place a legal responsibility on social media platforms to enforce the promises they make to users when they sign up, through terms and conditions. Lastly, the legislation will give users the option to filter out harmful content, such as bullying, that they do not want to see online.

Backdoor blues

As we reported earlier this month, the government admitted that breaking encryption in this way would not be technically "feasible," meaning the clause could not be enforced in that manner.

During a Lords debate, Lord Parkinson of Whitley Bay, from the ruling Conservative party, said: "Ofcom can require the use of a technology by a private communication service only by issuing a notice to tackle child sexual exploitation and abuse content under Clause 122. A notice can be issued only where technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content."

Notices would also have to comply with the Human Rights Act 1998 and the European Convention on Human Rights, he said.

However, on the bill's passing yesterday, civil liberties non-profit the Electronic Frontier Foundation said: "Given the text of the law, neither the government's private statements to tech companies, nor its weak public assurances, are enough to protect the human rights of British people or internet users around the world."

Earlier this month, the EFF warned the bill could, in certain situations, give the government powers to make online platforms use government-approved software to search through all users' photos, files, and messages, scanning for illegal content.

It said such a backdoor scanning system could be exploited by bad actors and produce false positives, leading to false accusations. However, the EFF also conceded the government may not be able to enforce some aspects of the bill.

Meanwhile, in remarks that cut against Lord Parkinson's earlier comments, UK technology minister Michelle Donelan told newswire Reuters shortly afterwards that the government would, if necessary, require tech platforms to "work to develop technology to scan encrypted messages as a last resort."

We don't want to hurt privacy, claims gov.UK

In a statement, the government said it wanted tech giants to develop their own systems for detecting illegal content without compromising privacy.

"The government is not requiring companies to use 'off the shelf' products and where appropriate encourages firms to use their vast engineering and technical resources to develop solutions that work for their own platforms. Technical solutions should be designed with a particular focus on safeguarding people's right to privacy and data protection," it said.

Speaking to the BBC, Meredith Whittaker, president of Signal and an outspoken critic of the bill, said the company was "more optimistic than we were when we began engaging with the UK government."

She also said the government should commit to ensuring the "unchecked and unprecedented power" in the bill cannot be used to undermine private communications.

Ben Packer, partner at law firm Linklaters, said the Online Safety Bill was vast in its scope and application.

"When in force, it will impose a broad range of obligations on social media platforms, search engines and other services that facilitate contact between their users. These online platforms will be subject to new obligations concerning illegal content or harmful content that may be accessed by children," he said.

"The bill tackles many of the same underlying issues that the European Union's Digital Services Act does – but in a very different way. With the DSA having recently come into effect for the largest platforms, many online services are already thinking about how to adapt their compliance processes to meet the requirements of both the DSA and OSB – and, indeed, the myriad other content regulation regimes coming into effect globally."

Packer noted that Ofcom was already prepared to enforce the bill, given that the government had provided it with pre-legislative funding to pay for more than 300 staff drawn from industry, other regulators, and law enforcement bodies. "Ofcom have published a detailed roadmap for the Online Safety Bill taking effect and are promising to move quickly as soon as the bill completes its passage through Parliament," he said. ®
