
Europe’s hope to scan devices for illegal files criticized

While Apple has, temporarily at least, backed off from last year’s plan to run client-side scanning software (CSS) on customers’ iPhones to detect and report child sexual abuse material (CSAM) to authorities, EU officials in May proposed child-protection rules that involve the same much-criticized approach.

The European Commission has suggested several ways to deal with child abuse imagery, including scanning private online communications and breaking encryption. It did so undeterred by a paper last October in which 14 prominent IT and security experts condemned CSS as a source of serious security and privacy risks.

In response, a trio of academics aims to make it clear how ineffective and rights-violating CSS would be, for those who missed the memo the first time around. And the last time, and the time before.

In an arXiv paper titled “YASM (Yet Another Surveillance Mechanism)”, Kaspar Rosager Ludvigsen and Shishir Nagaraja, of the University of Strathclyde, and Angela Daly, of the Leverhulme Research Centre for Forensic Science and Dundee Law School, Scotland, revisit CSS as a means of ferreting out CSAM and conclude that the technology is both ineffective and unwarranted.

Client-side scanning in this context involves running software on people’s devices to identify illegal images – typically those related to child exploitation, but EU lawmakers have also considered using CSS for flagging content related to terrorism and organized crime.

Apple’s approach was to use its NeuralHash machine-learning model to compute an ID for each image being synced to iCloud and check it against a list of known CSAM IDs. It didn’t work out terribly well: security researchers discovered they could create hash collisions with non-CSAM images. European officials have not opted for a specific technical approach, but as far as the paper’s authors are concerned, CSS is not up to the task.
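To make the matching step concrete, here is a minimal, hypothetical sketch of how a blocklist-based scanner of this kind operates: hash each image queued for upload and flag any hash that appears on a list of known-bad IDs. NeuralHash itself is a proprietary neural model; the simple average hash and the blocklist below are illustrative stand-ins only, not Apple’s algorithm.

```python
# Hypothetical sketch of blocklist matching in a client-side scanner:
# hash each image queued for upload and flag hashes on a known-bad list.
# The "average hash" here is a toy stand-in for a real perceptual hash.

def average_hash(pixels: list[list[int]]) -> int:
    """Perceptual hash: one bit per pixel, set if the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical blocklist of IDs the scanner has been told to flag.
KNOWN_BAD_HASHES = {0b1010110010110101}

def scan_before_upload(pixels: list[list[int]]) -> bool:
    """Return True if the image's hash is on the blocklist and would be reported."""
    return average_hash(pixels) in KNOWN_BAD_HASHES

# Toy 4x4 greyscale "image".
image = [[12, 200, 37, 180],
         [90, 15, 220, 64],
         [33, 140, 77, 190],
         [25, 210, 58, 130]]

print(scan_before_upload(image))  # False: this hash isn't on the toy blocklist
```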

Ludvigsen, Nagaraja, and Daly argue that CSS can no more prevent the distribution of CSAM than virus scanning can prevent the distribution of malware.

Even if you assume, they argue, that a CSS system detects all the CSAM it encounters – an unrealistic assumption – there is no clear definition of CSAM. There is a legal definition, they say, but it cannot be translated into rules for a CSS system.

Thus, adversaries will respond to the CSAM scan by finding ways to create images that evade detection.
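The evasion point can be seen even in the toy example above (this snippet reuses average_hash and image from the earlier sketch): a small brightness tweak to a single pixel flips a bit of the perceptual hash, so the altered copy no longer matches a fixed blocklist entry.

```python
# Continuing the toy example above (reuses average_hash and image):
# a minor edit changes the hash, so the modified image evades the blocklist.

tweaked = [row[:] for row in image]
tweaked[1][0] += 20   # 90 -> 110: crosses the image's mean brightness

print(format(average_hash(image), "016b"))            # 0101001001010101
print(format(average_hash(tweaked), "016b"))           # 0101101001010101
print(average_hash(image) == average_hash(tweaked))    # False: hashes diverge
```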

“CSS contains in its very notion constant monitoring of the system and, unlike pure logging, attempts to monitor all events within a given frame,” the boffins explain. “That makes it very similar to software like antivirus, which we know cannot be ‘perfect’, because the definition of malware can never define all types that exist.”

Moreover, the researchers say the costs of trying to stamp out CSAM this way far outweigh the benefits, and that this will likely remain the case however the technology evolves. Presumably there is some benefit in finding CSAM images loaded onto phones by child exploiters unaware that their devices are now monitoring on behalf of the state, but that would be eclipsed by the constant violation of everyone else’s privacy rights and by denying everyone the benefits of encryption.

“Surveillance systems are well known for violating rights, but CSS presents systems that will routinely or constantly do so, which is why we find them dangerous and cannot justify [them] by the objectives they aim to serve”, argue the computer scientists.

They are, however, confident that EU lawmakers will press ahead with some sort of CSAM-scanning regime, so they have also attempted to spell out the legal problems they expect to follow.

“We find that CSS systems will violate several rights within the European Convention on Human Rights, but our analysis is not exhaustive,” the researchers say in their paper. “They will likely violate the right to a fair trial, in particular the right to silence and against self-incrimination, the right to privacy, and if implemented further than the current examples, the freedom of assembly and association as well.”

For example, a trial cannot be fair, the researchers say, if defendants cannot readily challenge evidence produced by an undisclosed algorithm. There is always the possibility that images were planted by the authorities, fabricated, or uploaded as a result of entrapment.

The authors go on to chastise the European Commission for the techno-solutionist belief that CSS is the only possible way to combat CSAM. The Commission, they say, “fails to consider and analyze the potential consequences that CSS or server-side analysis would have on cybersecurity and privacy, while justifying that the potential positive victim outcomes outweigh the negatives of all others”.

The researchers conclude that CSS is just too disruptive.

“If you want to dig for gold, you have to accurately predict where it is,” they say. “What you don’t usually do is dig into the whole crust of the earth’s surface. CSS systems and mass surveillance represent the latter.” ®
