Apple indefinitely delays introduction of photo scanning features after widespread outcry

(AFP via Getty Images)

Apple has indefinitely delayed the introduction of its new anti-child abuse features, following widespread outcry from privacy and security campaigners.

The company had said that the two new tools – which attempt to detect when children are being sent inappropriate photos, and when people have child sexual abuse material on their devices – were necessary to stop the grooming and exploitation of children.

But campaigners argued that they increased the privacy risks for other users of the phone. Critics said that the tools could be used to scan for other kinds of material, and that they undermined Apple’s public commitment to privacy as a human right.


Now Apple has said that it will indefinitely delay those features, with a view to improving them before they are released.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” Apple said.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple had never given any specific indication of when the new features would be introduced. While it said that they would arrive with a version of iOS 15 – expected to be pushed out to iPhone and iPad users this month – it suggested that they might be introduced at some point after that initial launch.

Likewise, it gave no indication of how long the new consultation process would take, or whether it expected substantial changes to the system before it is released.

Apple announced the changes – which are made up of three new features – in early August. It said that it would add new information to Siri and search if people looked for child sexual abuse material (CSAM); that it would use the phone’s artificial intelligence to look at pictures sent to children and warn their parents if they appeared to be receiving inappropriate images; and that it would compare images uploaded to iCloud Photos with a database of known CSAM images, and alert authorities if they were found.
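As a rough illustration of the third feature – matching uploads against a database of known images – the sketch below shows a generic hash-lookup check in Python. It is hypothetical: Apple's actual system used a perceptual hash (NeuralHash) combined with cryptographic blinding techniques rather than a plain digest comparison, and all names here are invented for illustration.

```python
# Hypothetical sketch of hash-based matching against a database of known
# images. NOT Apple's design: the real system used a perceptual hash
# (NeuralHash) plus cryptographic blinding, not a byte-exact SHA-256.
import hashlib
from pathlib import Path

# Stand-in for the database of known-image hashes supplied by
# child-safety organisations.
KNOWN_HASHES: set[str] = set()

def image_hash(path: Path) -> str:
    """Byte-exact digest as a placeholder; a perceptual hash would also
    match resized or re-encoded copies of the same picture."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_image(path: Path) -> bool:
    """Return True if the image's hash appears in the known-image database."""
    return image_hash(path) in KNOWN_HASHES
```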

Apple stressed that all of the changes were intended to preserve privacy. It said that the scanning of photos happened purely on the device in order to preserve the end-to-end encryption of iMessages, and so that its servers were not involved in actually looking at the images as they were uploaded to iCloud.

The features gained approval from safeguarding groups, including the National Center for Missing &amp; Exploited Children, which worked with Apple and was to provide the database of known abuse imagery against which photos would be checked. The Internet Watch Foundation called it a “vital step to make sure children are kept safe from predators and those who would exploit them online” and described Apple’s system as a “promising step” towards both protecting privacy and keeping children safe.

But those assurances were not enough to satisfy security and privacy advocates. Edward Snowden said that Apple was “rolling out mass surveillance to the entire world”, and the Electronic Frontier Foundation said the feature could easily be broadened to search for other kinds of material.

“It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses,” it said in a statement shortly after the features were announced.

“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine-learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”

In the wake of that outcry, Apple’s software chief Craig Federighi admitted that the announcement had been “jumbled pretty badly” but said that he and the company were still committed to the underlying technology.

Apple also sought to give more information about how exactly the feature worked, offering assurances that included a commitment to be transparent with security researchers and a pledge to set the threshold for flagging CSAM sufficiently high that it did not expect the system to produce false positives.
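To make the threshold point concrete, here is a minimal hypothetical sketch of threshold-gated flagging: no account is surfaced for review until its running count of matches crosses a preset value, which is how a high threshold suppresses isolated false positives. The names and the threshold value are illustrative assumptions, not Apple's actual protocol.

```python
# Hypothetical sketch of threshold-gated flagging; names and the
# threshold value are illustrative assumptions, not Apple's protocol.
from collections import defaultdict

MATCH_THRESHOLD = 30  # illustrative value, not a confirmed parameter

match_counts: dict[str, int] = defaultdict(int)

def record_match(account_id: str) -> bool:
    """Count a match event; return True only once the account's total
    crosses the threshold and it should be escalated for human review."""
    match_counts[account_id] += 1
    return match_counts[account_id] >= MATCH_THRESHOLD
```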

But the opposition persisted, and critics continued to call on Apple to drop the feature. In mid-August, a coalition of more than 90 activist groups wrote an open letter to Apple’s chief executive, Tim Cook, asking him to abandon what it called a plan to “build surveillance capabilities into iPhones, iPads and other Apple products”.

It warned that the feature in iMessages could put young people at risk by flagging images to their parents, noting especially that “LGBTQ+ youths with unsympathetic parents are particularly at risk”.

It also said that once the photo-scanning feature was built, “the company will face enormous pressure, and possibly legal requirements, from governments around the world to scan for all sorts of images that the governments find objectionable”.

Apple said it would resist any attempts by governments to broaden the use of the features, which were initially planned for use only in the US.