European Commission Scolded For Concealing Identities Of Privacy-busting Content-scanning 'Experts'

Europe's government watchdog has found that the European Commission's refusal to disclose which experts it consulted on the proposal to scan encrypted communication for child sexual abuse material amounted to maladministration.

The decision by the European Ombudsman, made last month and published this week, stems from a complaint filed in December 2022 by the Irish Council for Civil Liberties (ICCL), an advocacy organization.

The European Commission has been trying to formulate rules to prevent the sharing of child sexual abuse material online, known as the Child Sexual Abuse (CSA) Regulation. But critics contend the proposed legislation would necessarily compromise private encrypted communication.

"The commission’s legislation would enable member states to compel online platforms, including those offering end-to-end encrypted messaging, to scan users' content and metadata for CSA images or 'grooming' conversations and behavior, and where appropriate report them to public authorities and delete them from their platforms," a coalition of advocacy groups, technology companies, and technical experts explained in an open letter last year.

"Such a requirement is fundamentally incompatible with end-to-end encrypted messaging because platforms that offer such service cannot access communications content."

Apple came to a similar conclusion when it abandoned its own plan to scan devices for CSAM. The UK, meanwhile, passed its Online Safety Act, which authorizes telecom regulator Ofcom to demand decryption – even as officials have acknowledged that doing so isn't technically feasible at the moment.

Yet the European Commission still appears to believe strong end-to-end encryption can somehow be bypassed while respecting privacy rights and maintaining operational security – mathematics notwithstanding. And it has evidently persisted in that belief on the strength of consultations with so-called experts.
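To make the critics' objection concrete, here is a minimal sketch – an illustration of the general principle, not the Commission's proposal or any real messenger's protocol – of why a relay server in an end-to-end encrypted system has no plaintext to scan. It uses the PyNaCl library's PrivateKey and Box primitives, and relay_server is a hypothetical stand-in for a platform's infrastructure.

```python
# Toy illustration only: with end-to-end encryption, the platform relays
# ciphertext it cannot read, so there is nothing intelligible to scan.
# Requires the PyNaCl package (pip install pynacl).
from nacl.public import PrivateKey, Box

# Each client generates its own key pair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

def relay_server(ciphertext: bytes) -> bytes:
    """Hypothetical platform server: it only ever handles opaque bytes."""
    assert b"meet at noon" not in ciphertext  # the plaintext is not visible in transit
    return ciphertext

# Alice encrypts to Bob using her private key and his public key.
sealed = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# The server forwards the message without being able to decrypt it.
delivered = relay_server(sealed)

# Only Bob, holding his private key, can recover the plaintext.
print(Box(bob_key, alice_key.public_key).decrypt(delivered))  # b'meet at noon'
```

Any design that restores the platform's access to message content – escrowed keys, or scanning on the device before encryption – is what the open letter's signatories argue amounts to breaking end-to-end encryption.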

But the commission refused to identify the experts who helped draft the text related to scanning encrypted communications – which prompted the ICCL complaint.

"Numerous experts have warned that it is not technically feasible," the ICCL said in response to the Ombudsman's decision.

"Public interest technologists and more than 450 academics have warned in public that 'technology to detect CSAM in encrypted content is currently not mature and will not be mature in the next two to five years.' This is in stark contrast to the views put forward by experts relied upon by the Commission, whose names the Commission is refusing to reveal."

The ICCL expressed concern about the commission's lack of transparency because of allegations about ties between the commission and commercial lobbyists.

In September, a report from Balkan Insight – an investigative journalism outlet – traced how the European CSAM-scanning proposal has been supported by organizations, such as Thorn, that stand to benefit from providing content-scanning software.

On Tuesday, the commission's coyness became less of an obstacle. Member of the European Parliament Patrick Breyer published the list of experts on Mastodon, and Berlin-based advocacy group Netzpolitik also did so.

The list of consultants includes five individuals from an organization providing CSAM scanning tools. It also features academics from the Stanford Internet Observatory; industry technologists from Google and Microsoft; representatives from the National Center for Missing and Exploited Children (NCMEC); and officials from the Australian Federal Police, the Spanish Civil Guard, the UK's National Cyber Security Centre (NCSC) and Government Communications Headquarters (GCHQ), and Europol.

Pointing to the findings of the Balkan Insight report, Ross Anderson, professor of security engineering at the UK's University of Cambridge, characterized the European Commission's refusal to publish the names of the consulted experts as part of a broad attempt to undermine encryption, driven by businesses that would benefit from content-scanning contracts and aided by government intelligence services that fear losing the ability to listen in on important conversations conducted through encrypted communications tools such as Signal.

"We now have crypto war 3.0 coming upon us," Anderson told The Register in an interview, "because His Majesty the King advanced yesterday in his speech from the throne that there's going to be an Investigatory Powers Bill amendments act which, among other things, will give his majesty's ministers the powers to demand that … if you want to sell your wares in Britain and you propose to include any see new security features, you've got to disclose them to His Majesty's government first." ®
