Apple's plans to identify child abuse images on iPhones stir praise and protests

Apple's plan to detect images of child sexual abuse on iPhones and to shield some underage users of Messages from receiving explicit images has touched off the latest round of a perennial debate over whether to prioritize law enforcement or user privacy.

Why it matters: There's increasing pressure on giant tech platforms to flag illegal behavior and remove harmful content. But smartphones are also powerful tools of surveillance that are increasingly employed by authoritarian governments and invasive marketers to target users around the world.


What's happening: Apple's new system, announced Thursday, will compare cryptographic hashes of photos being uploaded to iCloud against a database of known illegal images, letting Apple flag matches without directly scanning users' photo libraries.

  • It will also use on-device machine learning to flag sexually explicit photos sent to or by children via Apple's Messages service on family accounts. (A simplified sketch of both checks follows below.)
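
To make the two mechanisms concrete, here is a minimal Python sketch. It is illustrative only: it substitutes an exact SHA-256 digest for Apple's perceptual "NeuralHash," omits the private set intersection cryptography Apple described for the on-device match, and stubs out the Messages classifier with a placeholder score. Every name in it (KNOWN_IMAGE_HASHES, matches_known_csam, looks_explicit) is hypothetical, not an Apple API.

```python
import hashlib

# Hypothetical set of digests for known illegal images. In Apple's announced
# system the database holds perceptual "NeuralHash" values supplied by
# child-safety organizations, not SHA-256 digests; SHA-256 stands in here
# only to keep the sketch self-contained.
KNOWN_IMAGE_HASHES = {
    hashlib.sha256(b"example known image").hexdigest(),
}

EXPLICITNESS_THRESHOLD = 0.9  # hypothetical classifier cutoff


def image_hash(image_bytes: bytes) -> str:
    """Digest an image. Stand-in for a perceptual hash that survives resizing."""
    return hashlib.sha256(image_bytes).hexdigest()


def matches_known_csam(image_bytes: bytes) -> bool:
    """iCloud-upload check: does this photo's hash appear in the known set?

    Apple's design performs this match on-device against a blinded copy of
    the database, so non-matching photos are never revealed to Apple; that
    cryptography is omitted here.
    """
    return image_hash(image_bytes) in KNOWN_IMAGE_HASHES


def looks_explicit(image_bytes: bytes) -> bool:
    """Messages check: run an on-device classifier over the photo.

    The real feature uses a trained model; this stub only illustrates the
    shape of the decision (score vs. threshold).
    """
    score = 0.0  # placeholder for model inference on image_bytes
    return score >= EXPLICITNESS_THRESHOLD


if __name__ == "__main__":
    upload = b"example known image"
    print(matches_known_csam(upload))  # True: digest is in the known set
    print(looks_explicit(upload))      # False: stub classifier returns 0.0
```

The key property of Apple's actual design is that both checks run on the device itself, so neither requires Apple's servers to inspect photo contents directly.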

What they're saying: Child safety organizations applauded Apple's announcement.

  • "Apple's expanded protection for children is a game changer," John Clark, president and CEO of the National Center for Missing & Exploited Children, said in a statement. "With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material."
  • "The commitment from Apple to deploy technology solutions that balance the need for privacy with digital safety for children brings us a step closer to justice for survivors whose most traumatic moments are disseminated online," Julie Cordua, CEO of Thorn, an international anti-human trafficking nonprofit, said in a statement.

Yes, but: Organizations focused on online privacy, including the Center for Democracy & Technology and the Electronic Frontier Foundation, raised concerns that the technology Apple is deploying to flag child sexual abuse material (CSAM) could later be repurposed to detect other kinds of content.

  • Several thousand people signed an open letter posted on GitHub arguing that "Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products."
  • Will Cathcart, head of Facebook's encrypted messaging service WhatsApp, tweeted: "I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world. People have asked if we'll adopt this system for WhatsApp. The answer is no."

The bottom line: Daring Fireball's John Gruber praised the intent and design of Apple's system but acknowledged "completely legitimate concerns from trustworthy experts about how the features could be abused or misused in the future."

  • Alex Stamos of the Stanford Internet Observatory tweeted, "I find myself constantly torn between wanting everybody to have access to cryptographic privacy and the reality of the scale and depth of harm that has been enabled by modern comms technologies."
