It’s been over a year since Apple announced plans for three new child safety features, including a system to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, an option to blur sexually explicit photos in the Messages app, and child safety resources for Siri. The latter two features are now available, but Apple remains silent about its plans for the CSAM detection feature.
Apple initially said that CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately delayed the feature based on “feedback from customers, advocacy groups, researchers, and others.”
In September 2021, Apple posted the following update to its Child Safety page:
We have previously announced plans for features aimed at protecting children from predators who use communication tools to recruit and exploit them, and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we’ve decided to take additional time in the coming months to gather input and make improvements before releasing these critically important child safety features.
In December 2021, Apple removed the above update and all references to its CSAM detection plans from the Child Safety page, but an Apple spokesperson told The Verge that Apple’s plans for the feature had not changed. To the best of our knowledge, Apple has not publicly commented on those plans since then.
We reached out to Apple to ask whether the feature is still planned, but the company did not immediately respond to a request for comment.
Apple did move forward with its child safety features for the Messages app and Siri with the release of iOS 15.2 and other software updates in December 2021, and it expanded the Messages app feature to Australia, Canada, New Zealand, and the United Kingdom with iOS 15.5 and other software releases in May 2022.
Apple said its CSAM detection system was “designed with user privacy in mind.” The system would perform “on-device matching using a database of known CSAM image hashes” provided by child safety organizations, which Apple would transform into “an unreadable set of hashes that is securely stored on users’ devices.”
Apple planned to report iCloud accounts containing known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a nonprofit organization that works in collaboration with US law enforcement. Apple said there would be a “threshold” ensuring “less than a one in one trillion chance per year” of an account being incorrectly flagged by the system, and that any flagged accounts would also be manually reviewed by a human.
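For readers curious about the general shape of such a system, here is a minimal, hypothetical Swift sketch of threshold-based on-device hash matching. It is not Apple’s implementation: the design Apple described relied on a perceptual hash (NeuralHash), a blinded hash database, and cryptographic threshold techniques, none of which appear here. The MatchScanner type, the SHA-256 stand-in hash, and the example threshold value are all illustrative assumptions.

```swift
import Foundation
import CryptoKit

// Hypothetical, highly simplified sketch of threshold-based on-device matching.
// A real system would use a perceptual hash robust to resizing/cropping and
// cryptographic protections so the device never learns the match count.
struct MatchScanner {
    /// Opaque hashes of known images, as they might be shipped to the device.
    let knownHashes: Set<Data>
    /// Number of matches required before an account would be flagged for human review.
    let reportingThreshold: Int

    /// Stand-in for a perceptual hash; SHA-256 is used here purely for illustration.
    func hash(of imageData: Data) -> Data {
        Data(SHA256.hash(data: imageData))
    }

    /// Counts matches across a photo library and reports whether the threshold is met.
    func scan(_ library: [Data]) -> (matches: Int, exceedsThreshold: Bool) {
        let matches = library.filter { knownHashes.contains(hash(of: $0)) }.count
        return (matches, matches >= reportingThreshold)
    }
}

// Example usage with placeholder data.
let knownImage = Data("known-image-bytes".utf8)
let scanner = MatchScanner(
    knownHashes: [Data(SHA256.hash(data: knownImage))],
    reportingThreshold: 30 // illustrative value only
)
let result = scanner.scan([knownImage, Data("unrelated-photo-bytes".utf8)])
print("Matches: \(result.matches), exceeds threshold: \(result.exceedsThreshold)")
```

Even in this toy form, the threshold is what keeps a single coincidental match from flagging an account; in Apple's described design, manual review would then act as a further safeguard before any report to NCMEC.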
Apple’s plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.
Some critics argued that Apple’s child safety features could create a “backdoor” into devices that governments or law enforcement could use to surveil users. Another concern was false positives, including the possibility that someone could intentionally add CSAM imagery to another person’s iCloud account to get that account flagged.
Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.