Child abuse: Apple urged to roll out image-scanning tool swiftly

Child protection experts from across the world have called on Apple to implement new scanning technologies urgently to detect images of child abuse.

In August, Apple announced plans to use a tool called neuralMatch to scan photos being uploaded to iCloud online storage and compare them to a database of known images of child abuse.

However, the tech company has since said it is pausing the rollout after heavy lobbying from privacy campaigners, who raised concerns that governments could misuse neuralMatch to increase surveillance of private citizens.

Ross Anderson, a professor of security engineering at Cambridge University and Edinburgh University, wrote: “Child protection online is an urgent problem, but this proposal will do little to prevent these appalling crimes, while opening the floodgates to a significant expansion of the surveillance state.”

This week, child protection agencies, including the NSPCC, the National Center for Missing and Exploited Children (NCMEC) and the UN special rapporteur on the sale and sexual exploitation of children, released a joint statement endorsing neuralMatch and saying that “time is of the essence” to use new technology to help protect children from online exploitation and abuse.

“Concerns that such technology is a ‘slippery slope’ towards surveillance remain hypothetical and do not justify rejecting an opportunity for progress that would enable the many thousands of victims and survivors of sexual abuse who have their images circulated online to be protected from revictimisation and retraumatisation,” the groups said in the statement. “Instead, we should work together to ensure appropriate safeguards, checks and balances are in place.”

Recirculated images of abuse are one of the major challenges for law enforcement and child protection agencies globally. Police figures show that the UK database of known child abuse images contains 17m unique entries and is growing by 500,000 images every two months.

Scanning technologies aim to continually analyse and match these images – using a technique called hashing, which gives each known image a distinctive fingerprint – so that offenders can be detected and arrested when the images are shared online. Advocates argue that images of abuse can then be permanently removed from the internet, stopping an endless cycle of revictimisation.
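
As a rough sketch of the general idea – not Apple's neuralMatch system, whose internals are not described here – the Python below fingerprints image files with a cryptographic hash and checks them against a set of known digests. The "uploads" folder and the digest in the database are hypothetical placeholders, and real deployments use perceptual or neural hashes so that resized or re-encoded copies of an image still match, which an exact digest would miss.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints for known abuse imagery, of the kind
# maintained by bodies such as the NCMEC. The entry below is a placeholder.
KNOWN_HASHES = {
    "0" * 64,  # placeholder SHA-256 digest, not a real entry
}

def fingerprint(path: Path) -> str:
    """Return a hex digest identifying this particular file."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_known_image(path: Path) -> bool:
    """True if the file's fingerprint appears in the known-image database."""
    return fingerprint(path) in KNOWN_HASHES

if __name__ == "__main__":
    # "uploads" is a hypothetical folder of images awaiting upload.
    for upload in Path("uploads").glob("*.jpg"):
        if is_known_image(upload):
            print(f"Match flagged for manual review: {upload}")
```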

Apple said it would look only for known images. If the scanning technology flags a strong enough match, staff will manually review the reported images and, if child abuse is confirmed, the user’s account will be disabled and the NCMEC notified.

Apple’s software chief, Craig Federighi, told the Wall Street Journal that he believed the technology had been misunderstood. He stressed that the tools could only look to match known images of child abuse – not general images of children.

Iain Drennan, executive director at WeProtect Global Alliance, an organisation tackling child sexual exploitation and abuse online, said: “Balancing privacy and child protection is not simple, and so it was hugely encouraging to see Apple recognise and respond to this challenge.

“We owe it to victims and survivors to identify and remove the records of their sexual abuse as swiftly as we possibly can,” he said.