
Apple’s CSAM move might invade your privacy, experts suggest


Apple’s plans to introduce Client-Side Scanning (CSS) on US iPhones to search for child sexual abuse images might just open the door to mass surveillance and leave millions open to exploits.

CSS enables on-device analysis of your data in the clear. In an ideal world, if something suspicious is found on a phone, its existence and potentially its source can be reported to the relevant agencies; otherwise, little to no information leaves the phone.

However, academics at the Harvard Kennedy School, the Massachusetts Institute of Technology (MIT) and the University of Cambridge, among others, argue otherwise, stating that CSS creates serious security and privacy risks while the assistance it provides is “problematic.”

In the News: HTC announces immersive Vive Flow glasses to “go with the flow”

Protection against abuse or abuse in the name of protection?

Supporters of CSS may argue that it resolves the age-old encryption-versus-public-safety debate, since data remains end-to-end encrypted while the scanning aids investigations of some very serious crimes. However, CSS is by design open to vulnerabilities.

The researchers mentioned above have released a 46-page paper against CSS detailing the technology’s risks. “CSS neither guarantees efficacious crime prevention nor prevents surveillance. Indeed, the effect is the opposite. CSS, by its nature, creates serious security and privacy risks for all society, while the assistance it can provide for law enforcement is at best problematic. There are multiple ways in which client-side scanning can fail, can be evaded, and can be abused.”

The report also states that the European Union planned to use this technology to scan for other images even before Apple revealed their system.

Apple planned to use a technique called “perceptual hashing” to compare photos on users’ iPhones with known images of child abuse when those photos were uploaded to the cloud. If 30 or more matching images are found, a manual review takes place before the case is passed to the relevant law enforcement agency.
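The matching pipeline described above can be sketched with a toy “average hash” standing in for Apple’s actual NeuralHash. Everything here (the function names, the bit-string hash, the review logic) is illustrative and assumed; only the threshold of 30 comes from the reporting.

```python
# Toy perceptual hash: threshold each pixel of a tiny grayscale image
# against the image's mean brightness, producing a bit string. Apple's
# real system uses a neural network (NeuralHash); this is only a sketch.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

MATCH_THRESHOLD = 30  # review threshold reported for Apple's system

def count_matches(uploaded_hashes, known_hashes):
    """Count uploaded photos whose hash appears in the known-image database."""
    known = set(known_hashes)
    return sum(1 for h in uploaded_hashes if h in known)

def needs_manual_review(uploaded_hashes, known_hashes):
    """True once the number of matches reaches the review threshold."""
    return count_matches(uploaded_hashes, known_hashes) >= MATCH_THRESHOLD
```

The key design point is that matching happens on hashes, not on the images themselves, which is why the system’s security hinges entirely on how hard the hash is to fool.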

Apple was forced to pause implementation following a backlash by privacy activists last month. However, as The Verge reported, researchers managed to construct entirely different images that produced the same fingerprint and would have appeared as false positives in Apple’s system, demonstrating just how fragile the system was.

Apple, of course, dismissed the finding as no threat to its system’s integrity. Not only this, The Guardian reports that several others managed the reverse: changing the mathematical output of an image without visibly altering the image itself, creating false negatives.
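To see why a fragile perceptual hash invites false negatives, here is a self-contained toy illustration using an assumed “average hash” (not Apple’s NeuralHash, which is far more robust but was still shown to be evadable): nudging a single pixel shifts the image’s mean brightness enough to flip bits of the fingerprint while the picture remains visually unchanged.

```python
# Toy false negative: one slightly nudged pixel changes the fingerprint,
# so the modified image would no longer match a known-image database.
# This is an illustrative sketch, not an attack on NeuralHash itself.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

original = [[100, 104], [104, 100]]
tweaked  = [[100, 104], [104, 112]]  # one pixel nudged by 12 out of 255

h1 = average_hash(original)
h2 = average_hash(tweaked)
# h1 != h2: the two visually near-identical images hash differently,
# so the tweaked copy evades a database lookup keyed on h1.
```

The same brittleness cuts both ways: small perturbations that flip bits enable evasion, while engineered collisions enable false accusations.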

These aren’t the only ways people can dodge the system. The report’s authors note that people could simply disable the scanners or avoid using phones with CSS altogether.

“The software provider, the infrastructure operator, and the targeting curator must all be trusted. If any of them – or their key employees – misbehave or are corrupted, hacked or coerced, the security of the system may fail.”

In the News: WhatsApp adds end-to-end encryption for chat backups

Yadullah Abidi


Yadullah is a Computer Science graduate who writes/edits/shoots/codes all things cybersecurity, gaming, and tech hardware. When he's not, he streams himself racing virtual cars. He's been writing and reporting on tech and cybersecurity with websites like Candid.Technology and MakeUseOf since 2018. You can contact him here: