Apple opens the encryption Pandora's box

Apple's plan to scan iPhones for child sexual abuse material (CSAM) provoked immediate criticism that it was opening a door to much broader efforts by governments seeking a way into citizens' devices.

Between the lines: That debate is important, but Apple is also laying out a technical approach that's worthy of the industry's attention.

  • Apple's scheme does some work in the cloud and other work on the device, and the two sides share information only under strictly defined circumstances. That could help preserve privacy by creating and sharing less user data (a simplified sketch follows).
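
To make that concrete, here's a minimal sketch of the split in Python. Nothing below is Apple's published protocol: the hash list, voucher format, and function names are hypothetical stand-ins, and the real design encrypts the match result so the server can't read individual vouchers. The point is the shape: matching happens on the device, and the cloud sees only opaque vouchers.

```python
# Sketch of the device/cloud split (hypothetical, not Apple's protocol):
# the device matches photos against a hash list locally and uploads only
# an opaque voucher; the cloud aggregates vouchers, never raw photos.
import hashlib
import os

# Stand-in for a database of hashes of known flagged material.
FLAGGED_HASHES = {hashlib.sha256(b"known-flagged-example").hexdigest()}

def device_build_voucher(photo_bytes: bytes) -> dict:
    """Runs on the device. In a real design the match result would be
    encrypted so the server can't read it below a threshold; it's a plain
    field here only to keep the sketch short."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return {
        "voucher_id": os.urandom(8).hex(),   # random id, no user data
        "match": digest in FLAGGED_HASHES,   # computed on-device only
    }

def cloud_count_matches(vouchers: list[dict]) -> int:
    """Runs in the cloud: sees vouchers, never photos."""
    return sum(1 for v in vouchers if v["match"])
```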

Driving the news: Apple last week announced its plan to begin scanning iPhones in the U.S. for images that match known child sexual abuse material catalogued by the National Center for Missing and Exploited Children. A separate change would allow parents to be notified if children under 13 are sent nude images.

  • Critics immediately slammed the moves, saying that, however well intended, such systems would inevitably be used toward other ends, such as authoritarian governments spying on their opponents.
  • In a New York Times op-ed on Thursday, cryptographer Matthew Green and security researcher Alex Stamos urged Apple to hold off on implementing the planned changes until researchers could study their risks.
  • Apple employees "have flooded an Apple internal Slack channel with more than 800 messages," many criticizing the plan, per Reuters.

The big picture: Much of the debate mirrors past encryption controversies, in which encryption proponents have argued that any exception or back door creates vulnerabilities that bad actors will inevitably exploit, leaving a weakened system little better than no encryption at all.

Indeed, critics of Apple's approach say that once the company starts scanning content on the client side, it won't really be offering end-to-end encryption at all.

  • "Once they’ve built this door, the policy choices that are designed to limit how it can be used are insufficient to provide the level of security that was previously provided," said Sharon Bradford Franklin, co-director of the security and surveillance project at the Center for Democracy and Technology.
  • CDT issued its own paper this week suggesting different tools that can co-exist with full end-to-end encryption, including user reporting of problematic content and analysis of metadata.
  • Will Cathcart, head of Facebook-owned messaging app WhatsApp, also blasted Apple's move.
  • "There is no way to make technology work for 'good reasons' only," Cathcart told Axios. "We're concerned that creating the power to scan people's private photos or documents on their devices to make reports to governments is going to lead to long term abuse. This is a surveillance system that many governments will want to control, including China."

My thought bubble: The immediate blowback suggests that Apple either didn't get the balance right in this instance, or did a bad job of communicating its system, or both.

  • However, Apple's plan does put forward a useful idea that bears consideration in future system designs.
  • With this system, Apple isn't just deploying a single broad tool for scanning devices. Instead, it's building multiple systems that produce shareable data only once a match threshold is crossed (see the sketch after this list). While still problematic, such an approach creates far less data from far fewer users than broader-brush approaches would.
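
The mechanism for enforcing "share only past a threshold" is well-trodden cryptography: Apple's technical summary describes a threshold secret sharing scheme, of which Shamir's is the classic example. A minimal sketch follows, with a placeholder field size and illustrative threshold rather than Apple's actual parameters.

```python
# Shamir secret sharing: a decryption secret splits into shares such that
# any t of them reconstruct it, while fewer than t reveal nothing at all.
# Attach one share per matching voucher, and the server can decrypt the
# vouchers' contents only after t matches have accumulated.
import random

PRIME = 2**127 - 1  # field modulus; real systems use larger, vetted parameters

def split_secret(secret: int, t: int, n: int) -> list[tuple[int, int]]:
    """Split `secret` into n points on a random degree-(t-1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]

    def poly(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME

    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    """Lagrange-interpolate the polynomial at x=0 to recover the secret."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 0xC0FFEE
shares = split_secret(key, t=5, n=30)   # e.g., one share per uploaded voucher
assert reconstruct(shares[:5]) == key   # 5 shares recover the key
assert reconstruct(shares[:4]) != key   # 4 shares yield garbage (w.h.p.)
```

Below the threshold the shares are information-theoretically useless, so the limit is enforced by math rather than by policy alone.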

Apple has explored this split in other areas as well, including the system it created with Google to notify users of potential COVID-19 exposure. A mix of information on the device and in the cloud ensured that only a narrow amount of new data about users' health and location was created, and even less was shared (a sketch follows the bullet below).

  • Apple's new CSAM tool is obviously different. The COVID-19 system was opt-in, while Apple will apply the new CSAM detection system to all customers who use iCloud Photos. (Users who don't store photos in iCloud won't have them scanned.)
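
That comparison is easy to make concrete. The cloud publishes only the diagnosis keys of people who report a positive test; each device re-derives the identifiers those keys would have broadcast and intersects them with what it heard over Bluetooth. The derivation below is a hypothetical stand-in (the real protocol derives identifiers with AES under HKDF keys), but the privacy property is the same: matching never leaves the device.

```python
# Sketch of on-device exposure matching (hypothetical derivation, not the
# real AES/HKDF scheme): the cloud never learns who downloaded the keys
# or whether any device found a match.
import hashlib
import hmac

def derive_identifiers(diagnosis_key: bytes, intervals: int = 144) -> set[bytes]:
    """Identifiers one key would have broadcast (144 ten-minute windows/day)."""
    return {
        hmac.new(diagnosis_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
        for i in range(intervals)
    }

def exposed(heard: set[bytes], published_keys: list[bytes]) -> bool:
    """Runs entirely on the device; only the user ever sees the result."""
    return any(heard & derive_identifiers(k) for k in published_keys)
```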

Even those who criticize Apple over its new CSAM detection feature acknowledge there is some benefit to Apple's approach.

  • "If the choice must be between a narrow backdoor with policy limits to minimize its reach and application, versus a complete abandonment of encryption, absolutely the former is preferable," Franklin said.
