More than 90 policy groups call for Apple to abandon plan to scan images on iPhones

Apple has been urged to drop its plans to scan iPhones for child sexual abuse material, in a new open letter signed by more than 90 policy and rights groups.

Earlier this month, the company announced that it would add a feature to the iPhone that would scan images for known child sexual abuse material, or CSAM, as they were uploaded to its servers, and alert the company if any was found. It also said that phones would use artificial intelligence to spot when children were exchanging pictures that appeared to feature nudity, and alert their parents.

Apple has said that the feature is required in the face of the vast amount of abuse imagery that circulates online. It has said it is built with privacy in mind, including tools that ensure the analysis happens on users’ phones rather than in the cloud.

But privacy activists and security experts have raised concerns that the feature could be misused. It could be used to scan for other kinds of images, experts have suggested, alongside other potential problems.

Those issues were highlighted in the open letter, which asks Apple to stop its plans to implement the feature in an upcoming update to the operating system powering the iPhone and iPad.

“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the groups wrote in the letter, which was first reported by Reuters.

The letter, organized by the U.S.-based nonprofit Center for Democracy & Technology (CDT), is the largest campaign to date over an encryption issue at a single company.

Some overseas signatories in particular are worried about the impact of the changes in nations with different legal systems, including some already hosting heated fights over encryption and privacy.

“It’s so disappointing and upsetting that Apple is doing this, because they have been a staunch ally in defending encryption in the past,” said Sharon Bradford Franklin, co-director of CDT’s Security & Surveillance Project.

An Apple spokesman said the company had addressed privacy and security concerns in a document released on Friday outlining why the complex architecture of the scanning software should resist attempts to subvert it.

Those signing included multiple groups in Brazil, where courts have repeatedly blocked Facebook’s WhatsApp for failing to decrypt messages in criminal probes, and the senate has passed a bill that would require traceability of messages, which would require somehow marking their content. A similar law was passed in India this year.

“Our main concern is the consequence of this mechanism, how this could be extended to other situations and other companies,” said Flavio Wagner, president of the independent Brazil chapter of the Internet Society, which signed. “This represents a serious weakening of encryption.”

Other signers were in India, Mexico, Germany, Argentina, Ghana and Tanzania.

Surprised by the outcry that followed its announcement two weeks ago, Apple has offered a series of explanations and documents to argue that the risks of false detections are low.

Apple said it would refuse demands to expand the image-detection system beyond pictures of children flagged by clearinghouses in multiple jurisdictions, though it has not said it would pull out of a market rather than obey a court order.

Though most of the objections so far have been over device-scanning, the coalition’s letter also faults a change to iMessage in family accounts, which would try to identify and blur nudity in children’s messages, letting them view it only if parents are notified.

The signers said the step could endanger children in intolerant homes or those seeking educational material. More broadly, they said the change would break end-to-end encryption for iMessage, which Apple has staunchly defended in other contexts.

“Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit,” the letter says.

Other groups that signed include the American Civil Liberties Union, Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.

Additional reporting by Reuters
