Apple Cancels Plans for CSAM Detection Tool for iCloud Photos

On Wednesday, Apple announced that it won't be moving forward with its previously proposed child sexual abuse material (CSAM) detection tool for iCloud Photos. The controversial tool would have allowed Apple to scan iPhones, iPads, and iCloud Photos for CSAM, but critics objected to the feature because of its serious implications for privacy. Apple Not [...]

This post Apple Cancels Plans for CSAM Detection Tool for iCloud Photos appeared first on The Mac Observer.
Author: Arnold Zafra