Apple has quietly removed all references to its child sexual abuse material (CSAM) scanning feature from its website, months after announcing that the new technology would be baked into iOS 15 and macOS Monterey.
Thousands of CSAM (child sexual abuse material) victims are now taking the fight to Apple after the company ultimately decided not to add tools that would help detect such material on its ...
In December, Apple said that it was killing an effort to design a privacy-preserving iCloud photo scanning tool for detecting child sexual abuse material (CSAM) on the platform. Originally announced ...
Apple has suddenly trapped itself in a security and privacy nightmare, just as iPhone 13 hits the streets. This now threatens to damage the next 12 months leading to iPhone 14 and is starting to look ...
Apple has quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following ...
Apple and Microsoft have provided details of their methods for detecting or preventing child sexual abuse material distribution, and an Australian regulator has found their efforts lacking. The ...
Two of the three safety features, which shipped earlier this week with iOS 15.2, are still present on the page, ...