It’s now more than two years since the European Commission introduced its draft Child Sexual Abuse Regulation. It proposes sweeping measures to scan private and public communications for child sexual abuse material (“CSAM”), universally and without judicial warrant. Communications would be scanned for known and unknown CSAM, and for attempts by abusers to groom children. Messaging services would have to verify the age of their users, and app stores would have to block under-16s from installing messaging apps. The draft has gone back and forth between the Commission, the European Parliament and the European Council, and at the time of writing there is still no version that can form the basis for final negotiation between the three bodies.
The proposal has drawn heavy criticism from privacy activists and anti-child sexual abuse campaigners alike, and has been dubbed “chat control”; looked at in detail, its problems soon become clear. First of all, it wouldn’t catch most CSAM: this material is typically exchanged by sharing links and passwords for encrypted archives on dedicated message boards. Individual files aren’t usually sent directly, so there’s little point in scanning message attachments.
Moreover, what is proposed would be hugely impractical. Some automated scanning for CSAM already takes place, but about 80% of the reported content turns out to be irrelevant once checked by human beings. If that were scaled up to cover every message sent to or from the European Union, the torrent of false positives would be unmanageable. As Edward Snowden pointed out, mass surveillance doesn’t make the world a safer place: it only overwhelms the agencies tasked with sifting through the information.
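To get a feel for how the numbers scale, here is a rough back-of-the-envelope calculation. The daily message volume and the scanner’s flag rate are purely illustrative assumptions of mine; only the roughly 80% false-positive share comes from the figure cited above.

```python
# Back-of-the-envelope estimate of the human review burden created by
# universal message scanning. All inputs except the ~80% false-positive
# share are illustrative assumptions, not figures from the proposal.

messages_per_day = 10_000_000_000   # assumed EU-wide daily message volume
flag_rate = 0.001                   # assumed fraction of messages the scanner flags
false_positive_share = 0.80         # ~80% of flagged reports found irrelevant on review

flagged = messages_per_day * flag_rate
false_positives = flagged * false_positive_share

print(f"Messages flagged per day:        {flagged:,.0f}")
print(f"False positives needing review:  {false_positives:,.0f}")
```

Even with these modest assumed rates, that is millions of wrongly flagged messages every day, each one in principle requiring a human to look at someone’s private correspondence.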
It gets worse. Most messaging apps these days provide end-to-end encryption: content is encrypted on the sender’s device and can only be decrypted on the receiver’s, so the provider’s servers see nothing but encrypted gobbledygook. Scanning messages for CSAM would therefore require developers to build the scanning software into their apps, so that messages are scanned on the device before they are encrypted. That might work if the scanning code were simply comparing a file against a list of hashes of known CSAM files, but the legislation also requires detection of hitherto unknown material and of grooming activity. That would take some pretty sophisticated AI code, and I am sceptical that current smartphones would be up to the task. Without dedicated AI hardware in the phone, the only practical implementation would be to transmit the data to centralised servers for scanning, negating the very privacy that end-to-end encryption is meant to provide.
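As a minimal sketch of the simpler case, hash-list matching, consider the following. Real deployments use perceptual hashes (PhotoDNA-style fingerprints that survive resizing and re-encoding) rather than the plain SHA-256 comparison shown here, and the hash list is a placeholder; the point is only to illustrate the principle of checking files against a fixed list on the device.

```python
import hashlib

# Simplified illustration of on-device hash-list matching.
# Real systems use perceptual hashes (e.g. PhotoDNA); SHA-256 is used
# here only to show the matching principle. The list is a placeholder.

KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",  # placeholder entry
}

def matches_known_material(attachment: bytes) -> bool:
    """Return True if the attachment's hash appears on the known-content list."""
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in KNOWN_HASHES
```

A lookup like this can only ever flag files that are already on the list; detecting previously unknown material or grooming conversations, as the draft demands, would require classifiers far beyond a simple hash comparison.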
National governments would be responsible for enforcing this legislation, and it’s an iron law of civil liberties that once governments acquire legal powers for one purpose, they will extend them to cover others. The hash lists or AI models shipped in the messaging apps could easily be extended to search for other material, without oversight from either users or the apps’ developers. We already have the example of the Pegasus scandal, in which spyware supposedly intended for hunting terrorists was deployed against opposition politicians and journalists. The Hungarian government was implicated in that affair, so it’s no surprise that Hungary has used its presidency of the European Council to press the case for the Child Sexual Abuse Regulation vigorously. Nor would the threat come only from European governments: the latest news about how China exploited infrastructure intended for lawful U.S. government eavesdropping to infiltrate U.S. telecommunications networks shows how such “back doors” can be exploited by hostile powers.
The revelations about Pegasus caused an uproar three years ago; now the European Commission is proposing to install spyware on every phone in Europe. It is exploiting the emotive subject of child sexual abuse to attack our civil liberties. The European Union should be spending its time on genuinely effective and targeted measures against child sexual abuse.