Tech companies and privacy hawks are pushing back in both the UK and the EU on mandatory rules to catch the distribution of child pornography over fears such systems could be used for broader surveillance online.
Concerns from privacy activists, and the fear that detection software is not ready for deployment, are hampering efforts on both sides of the Channel to put a stop to the dissemination of child pornography.
Rulemaking has even prompted big messaging services such as WhatsApp and Signal to threaten to exit the British market.
INHOPE, the global network of national hotlines to which people can report child pornography, processed nearly 2.5 million suspected illegal images in 2024, a 218% increase from 2023.
Despite the urgent need to stop the spread of such images on the internet, sometimes referred to as child sexual abuse material (CSAM), regulating the online space has been difficult as tech companies claim scanning encrypted messages would endanger everyone’s privacy.
In the EU, technical talks are continuing in the Council over efforts to regulate the online space, with the latest compromise text obtained by Euractiv. However, progress is slow, bogged down in debate over how far to push tech firms to police their platforms.
EU law still in the making
The Commission proposed legislation in 2022 to force tech platforms to scan their services for CSAM when ordered to do so by a court. Early detection would help law enforcement identify people disseminating such images.
Some online platforms and messaging services, along with privacy advocates, are, however, concerned that scanning encrypted messages for CSAM would create so-called ‘back doors’ in tech services and could lead to mass surveillance.
End-to-end encryption prevents messages from being accessed by anyone other than the sender or receiver. WhatsApp and Signal provide such services, which are widely used by public institutions to secure sensitive information, but they can also create a safe haven for criminals.
No technology currently exists that can scan for CSAM while preserving people’s privacy.
In 2023, the European Parliament made its own assessment of the Commission’s legislative proposal, which revealed it was unlikely that a technical solution would be found in the next two to five years.
This year, the Commission seemingly conceded this to still be the case, as it announced a roadmap to “identify and assess” technological solutions that would enable law enforcement to bypass encryption without undermining the privacy and cybersecurity of services.
Since the law was proposed, the legislative process has been slow, in particular in the Council. EU countries are deeply divided on what responsibility to put on tech companies to fight CSAM despite existing technical limitations.
The UK’s U-turn
In the UK, the Online Safety Act (OSA), which is already in place, mandates tech companies to detect CSAM, including when services are encrypted.
After WhatsApp and Signal threatened to ditch the national market over the rules, the government conceded that its law would not be enforceable for services where scanning is not “technically feasible.”
A spokesperson for the regulator Ofcom confirmed to Euractiv that this would concern “a small minority” of services, but did not comment on whether this referred to encrypted services specifically.
UK-based child-rights advocacy group the NSPCC told Euractiv that leaving some services out of scanning obligations was an “unacceptable loophole.”
Privacy advocates are equally unhappy, as they worry Ofcom could still use its enforcement powers to mandate companies to create back doors to their services to scan for CSAM.
Shifting the burden onto Big Tech
In the UK, tech companies must demonstrate to Ofcom that their services cannot be scanned. The government, however, hopes that they will also invest in new technologies to detect CSAM.
UK Safeguarding Minister Jess Phillips told Euractiv in an interview that the market would incentivise the tech sector to develop such technologies. “I think that the parent market is bigger than the paedophile one,” she said.
EU countries are also considering mandating tech companies to develop detection technology, instead of ordering them to scan encrypted services.
But tech companies and privacy groups continue to argue that this is not possible. They contend that refusing to create back doors to encryption is not only a technical issue, but also a matter of preventing governments from accessing their citizens’ private information.
James Baker, a manager at the Open Rights Group, told Euractiv that Ofcom’s powers are “more suited to an authoritarian regime not a democracy.”
[CP, MK, JP]