Apple delays launch of child abuse detection tools

Apple is delaying the rollout (Edmond Terakopian/PA) (PA Wire)

Apple is to delay the launch of new tools designed to detect child sexual abuse material (CSAM), saying it wants to take more time to “make improvements” after privacy concerns were raised.

The iPhone maker had announced plans to introduce new systems which would detect child sexual abuse imagery when someone tried to upload it to iCloud, and report it to authorities.

Apple said the process would be done securely and would not regularly scan a user’s camera roll. However, privacy campaigners raised concerns over the plans, with some suggesting the technology could be hijacked by authoritarian governments to look for other types of imagery – something Apple said it would not allow.

But the tech giant has now confirmed it is delaying the rollout following feedback from a number of groups.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” the company said in a statement.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

The system works by looking for image matches based on a database of “hashes” – a type of digital fingerprint – of known CSAM images provided by child safety organisations.

This process takes place securely on a device when a user attempts to upload images to their iCloud photo library.
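The matching step described above can be sketched in simplified form. This is only an illustration, not Apple’s implementation: the real system uses a perceptual hash (NeuralHash) that also matches visually similar images, plus cryptographic techniques so that non-matching images reveal nothing, whereas this sketch uses a plain SHA-256 fingerprint that only matches exact byte-for-byte copies. The function names and sample data are hypothetical.

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    # A digital "fingerprint" of the image bytes. Apple's actual system uses
    # a perceptual hash (NeuralHash); SHA-256 here is a stand-in that only
    # matches identical files.
    return hashlib.sha256(image_bytes).hexdigest()


def check_against_database(image_bytes: bytes, known_hashes: set) -> bool:
    # Membership test against a database of known fingerprints. In Apple's
    # design this comparison runs on-device before upload, using cryptographic
    # protocols this sketch does not replicate.
    return image_hash(image_bytes) in known_hashes


# Hypothetical database seeded with one known fingerprint.
known = {image_hash(b"known-image-bytes")}
print(check_against_database(b"known-image-bytes", known))  # True: exact match
print(check_against_database(b"some-other-image", known))   # False: no match
```

Because only fingerprints are compared, the database itself never needs to contain the images, and an unmatched photo contributes nothing to the comparison.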

It was to be joined by another new feature in the Messages app, which warns children and their parents using linked family accounts when sexually explicit photos are sent or received, with images blocked from view and on-screen alerts; and new guidance in Siri and Search which will point users to helpful resources when they perform searches related to CSAM.

Apple said the two features are not the same and do not use the same technology, adding that it will “never” gain access to communications as a result of the improvements to Messages.

Andy Burrows, head of child safety online policy at children’s charity the NSPCC, said the delay was “incredibly disappointing”.

“Apple were on track to roll out really significant technological solutions that would undeniably make a big difference in keeping children safe from abuse online and could have set an industry standard,” he said.

“They sought to adopt a proportionate approach that scanned for child abuse images in a privacy-preserving way, and that balanced user safety and privacy.

“We hope Apple will consider standing their ground instead of delaying important child protection measures in the face of criticism.”
