Apple has hinted it might not revive its controversial effort to scan for CSAM (child sexual abuse material) photos any time soon. MacRumors notes Apple has removed all mentions of the scanning feature from its Child Safety website. Visit now and you’ll only see iOS 15.2’s optional nude photo detection in Messages and intervention when people search for child exploitation terms.
It’s not certain why Apple has pulled the references. We’ve asked the company for comment. This doesn’t necessarily represent a full retreat from CSAM scanning, but it at least suggests a rollout isn’t imminent.
The CSAM detection feature drew flak from privacy advocates because it would scan photos on-device and flag matches that could ultimately be reported to law enforcement. While Apple stressed the existence of multiple safeguards, such as a high threshold for flags and its reliance on hashes from private organizations, there were concerns the company might still produce false positives or expand scanning under pressure from authoritarian governments.
Apple delayed the rollout indefinitely to “make improvements.” However, it’s now clear the company isn’t in a rush to complete those changes, and doesn’t want to set expectations to the contrary.
Brought to you by USA Today.