Apple sued over abandoning CSAM detection for iCloud

Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM). According to The New York Times, the lawsuit argues that by not doing more to prevent the spread of this material, Apple is forcing victims to relive their trauma. The suit describes […]
