Apple’s plan to fight child sexual abuse is sounding an alarm for privacy champions

In August, the Cupertino company shared plans to use “client-side scanning” (CSS) to detect child sexual abuse material by scanning images on users’ devices before they are uploaded to its iCloud Photos storage service. The system could have been installed on more than a billion Apple devices worldwide. In a 46-page study, a group of security researchers poked holes in the technology’s efficacy, arguing that it poses “serious security and privacy risks for all society while the assistance it can provide for law enforcement is at best problematic.”
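To make the mechanism concrete, here is a minimal, hypothetical sketch of the client-side scanning control flow in Swift. It is not Apple’s actual design: the real proposal relied on a perceptual hash (NeuralHash) and cryptographic matching protocols, whereas this sketch uses SHA-256 purely as a stand-in to show *where* the check happens, on the device before upload. The type and function names are invented for illustration.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of the client-side scanning (CSS) idea.
// Real systems use a perceptual hash so that resized or re-encoded
// copies of an image still match; SHA-256 here only illustrates the
// control flow, not the matching technique.
struct ClientSideScanner {
    // Fingerprints of known abuse material, shipped to the device.
    let knownHashes: Set<String>

    // Compute a fingerprint of the raw image bytes.
    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // The defining property of CSS: the check runs on the device,
    // *before* upload, so content is inspected even if the cloud
    // copy would otherwise be encrypted.
    func shouldFlagBeforeUpload(_ imageData: Data) -> Bool {
        knownHashes.contains(fingerprint(of: imageData))
    }
}

// Usage: scan a photo before handing it to the cloud sync pipeline.
let scanner = ClientSideScanner(knownHashes: ["<hash-from-database>"])
let photo = Data([0x89, 0x50, 0x4E, 0x47]) // placeholder image bytes
if scanner.shouldFlagBeforeUpload(photo) {
    print("Match against known-content database; withhold and report.")
} else {
    print("No match; proceed with upload.")
}
```

Even in this simplified form, the sketch shows why critics object: the scanning logic and the match database live on the user’s own device, so the same machinery could in principle be repointed at other categories of content.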