Apple’s plan to fight child sexual abuse is sounding an alarm for privacy champions

In August, Apple shared plans to use “client-side scanning” (CSS) to detect child sexual abuse material by scanning images on users’ devices before they’re uploaded to its iCloud Photos storage service. The system could have been deployed on more than a billion Apple devices worldwide. In a 46-page study, a group of security researchers poked holes in the technology’s efficacy, arguing that it poses “serious security and privacy risks for all society while the assistance it can provide for law enforcement is at best problematic.”
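The general idea behind client-side scanning is that the device itself computes a fingerprint (a perceptual hash) of each photo and compares it against a database of hashes of known abuse imagery before the photo ever leaves the device. Below is a minimal, illustrative sketch of that matching step. It is not Apple’s actual system, which used a NeuralHash function together with a blinded hash database and a private set intersection protocol; every name here (perceptual_hash, KNOWN_HASHES, MATCH_THRESHOLD, should_flag_before_upload) is a hypothetical stand-in for illustration.

```python
# Minimal sketch of client-side hash matching, assuming a toy 16-bit
# perceptual hash. NOT Apple's NeuralHash/PSI pipeline; all identifiers
# below are hypothetical.

KNOWN_HASHES = {0b1011_0110_1100_0011, 0b0110_1001_0011_1100}  # toy database
MATCH_THRESHOLD = 2  # max Hamming distance still counted as a match


def perceptual_hash(image_bytes: bytes) -> int:
    """Stand-in for a real perceptual hash function.

    Real perceptual hashes are designed so that visually similar images
    produce nearby hashes; here we just fold the bytes into 16 bits so
    the example runs.
    """
    h = 0
    for i, b in enumerate(image_bytes):
        h ^= b << (i % 9)
    return h & 0xFFFF


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def should_flag_before_upload(image_bytes: bytes) -> bool:
    """Client-side check run before the photo leaves the device:
    flag the image if its hash is close to any known-bad hash."""
    h = perceptual_hash(image_bytes)
    return any(hamming(h, known) <= MATCH_THRESHOLD for known in KNOWN_HASHES)


if __name__ == "__main__":
    print(should_flag_before_upload(b"example image payload"))
```

The fuzzy (threshold-based) matching is what worries the researchers: because matching happens on the device against an opaque database, the same mechanism could be repurposed to scan for other content, and near-match thresholds invite both false positives and adversarial collisions.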
