Apple is delaying the child protection features it announced last month, following intense criticism of the plans. The changes are now scheduled to arrive later this year.
Apple said in a statement “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple’s original press release now carries a similar statement at the top of the page. The release details three major changes that were in the works: one would have updated Search and Siri to point users toward resources for preventing CSAM when they searched for related information; another would have alerted parents when their kids received or sent sexually explicit photos and blurred those images for the kids; and a third would have scanned images stored in a user’s iCloud Photos for CSAM and reported matches to Apple moderators.
In the press release, Apple described the iCloud Photos scanning system at length, arguing that it did not weaken user privacy. Many privacy and security experts nevertheless heavily criticized the proposal, arguing that it could have created an on-device surveillance system.
In an August statement, the Electronic Frontier Foundation said that however well intended the new system was, it would “break key promises of the messenger’s encryption itself and open the door to broader abuses.”