
Apple Delays Rollout of Snoopy Phone Scanning; Will It Drop It Entirely?

In response to an uproar against its plans to implement a system to scan users’ iPhones for child sexual abuse material (CSAM), Apple has announced that it will rethink its proposal:

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material [CSAM]. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Although the system was designed to detect content that is almost universally considered detestable, it presented enormous problems: it would likely be ineffective for its stated purpose, open to political expansion to other categories of prohibited content, and prone to abuse through deliberately generated false reports.

• If Apple relied on a list of hashes (not the images themselves) provided by security agencies, what would prevent those agencies from having phones scan for images representing political positions they oppose? The device would see only opaque digests, as the sketch below illustrates.
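
To see why the provenance of that hash list matters, here is a minimal, hypothetical sketch of on-device hash-list matching in Swift. The blocklist contents, the function name `shouldFlag`, and the use of a plain SHA-256 exact match are all assumptions made for illustration; Apple's announced design used a perceptual hash (NeuralHash) with cryptographic safety vouchers rather than a simple digest comparison. The point of the sketch is that the matching code treats the supplied hashes as opaque, so whoever controls the list controls what gets flagged.

```swift
import Foundation
import CryptoKit

// Hypothetical blocklist of image digests supplied by an outside authority.
// On the device these are just opaque strings; nothing in this code can
// tell whether they were derived from CSAM or from, say, protest imagery.
let providedHashes: Set<String> = [
    "hypothetical-digest-1",
    "hypothetical-digest-2",
]

// Minimal sketch: hash a photo and check it against the supplied list.
// (Apple's announced design used NeuralHash, a perceptual hash, plus
// threshold secret sharing, not a plain SHA-256 exact match.)
func shouldFlag(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return providedHashes.contains(hex)
}
```

Swapping in a different set of hashes would require no change to the matching code itself, which is exactly the concern the question above raises.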
