Apple sued by victims alleging it failed to stop spread of child sexual-abuse material
A proposed class-action lawsuit seeks $1.2 billion in damages from Apple, alleging the company failed to implement a feature that would have identified child sexual-abuse material (CSAM) in its customers’ iCloud accounts.
Apple announced the on-device CSAM detection system in 2021. It was designed to match images stored in iCloud against a library of digital signatures of known CSAM imagery, without sending photos to Apple.
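To illustrate the general idea of signature-based matching, here is a minimal sketch in which a plain cryptographic file hash stands in for the “digital signature.” This is not Apple’s actual design, which reportedly relied on perceptual hashing and cryptographic matching techniques; the type and function names below are hypothetical.

```swift
import Foundation
import CryptoKit

// Minimal illustration of signature-based matching, not Apple's announced design.
// A SHA-256 file hash stands in for the "digital signature"; the real proposal
// reportedly used perceptual hashes of known CSAM imagery supplied by
// child-safety organizations. All names here are hypothetical.
struct SignatureMatcherSketch {
    /// Signatures of known imagery, distributed to the device.
    let knownSignatures: Set<String>

    /// Compute a signature for a photo entirely on the device.
    func signature(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    /// Check a photo before upload; only the match result would leave the device.
    func matchesKnownImage(_ imageData: Data) -> Bool {
        knownSignatures.contains(signature(of: imageData))
    }
}
```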
But after significant pushback from privacy researchers and advocates, the company walked away from the plan out of fears the system could be abused by governments.
The complaint said:
“Not only did Apple fail to stop or limit the spread of known CSAM through its products, but it publicly announced that it affirmatively would not implement product design changes to stop or limit the spread of known CSAM through its products, thereby amplifying the already significant risk and harm to the Plaintiffs and Class members.”