Apple Will Scan iCloud Photos for Child Sexual Abuse Images and Report Matches to Legal Authorities

Apple is adding a series of new child-safety features to its next big operating system updates for iPhone and iPad.

As part of iOS 15 and iPadOS 15 updates later this year, the tech giant will implement a feature to detect photos stored in iCloud Photos that depict sexually explicit activities involving children.

“This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC),” the company said in a notice on its website. NCMEC acts as a reporting center for child sexual abuse material (CSAM) and works in collaboration with law enforcement agencies across the U.S.

According to Apple, its method of detecting known CSAM is “designed with user privacy in mind.” The company says it is not directly accessing customers’ photos but instead is using a device-local, hash-based matching system to detect child abuse images. Apple says it can’t actually see user photos or the results of such scans unless there’s a hit.
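
To illustrate the general idea, here is a minimal Python sketch of matching image hashes against a set of known hashes. It is only an illustration of the concept: Apple's published summary describes a perceptual hash (NeuralHash) combined with cryptographic private set intersection, neither of which is shown here, and every name and value below is hypothetical.

    import hashlib

    # Hypothetical set of hashes of known CSAM images; in Apple's system the
    # database is supplied by NCMEC and other child-safety organizations.
    KNOWN_IMAGE_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def image_hash(image_bytes: bytes) -> str:
        # A plain cryptographic hash for simplicity; a real system uses a
        # perceptual hash so resized or re-encoded copies still match.
        return hashlib.sha256(image_bytes).hexdigest()

    def matches_known_database(image_bytes: bytes) -> bool:
        # Only the yes/no match result would ever leave the device, not the photo.
        return image_hash(image_bytes) in KNOWN_IMAGE_HASHES

    print(matches_known_database(b"example image bytes"))  # False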

If there’s a match between a user’s photos and the CSAM database, Apple manually reviews each report to confirm the presence of sexually explicit images of children, then disables the user’s account and sends a report to NCMEC. If a user feels their account has been mistakenly flagged, according to Apple, “they can file an appeal to have their account reinstated.” According to Apple, the system is accurate enough that the chance of incorrectly flagging a given account is less than one in 1 trillion per year.
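
The reporting flow Apple describes (match, manual review, account disablement, report, appeal) can be sketched roughly as follows; the threshold, names, and stubbed review and reporting steps are purely hypothetical placeholders, not Apple's implementation.

    from dataclasses import dataclass

    REVIEW_THRESHOLD = 1  # illustrative only; not a figure Apple publishes here

    @dataclass
    class Account:
        user_id: str
        match_count: int = 0
        disabled: bool = False

    def human_review_confirms(account: Account) -> bool:
        # Placeholder for Apple's manual review of each report.
        return True

    def report_to_ncmec(account: Account) -> None:
        # Placeholder for filing a report with NCMEC.
        print(f"Report filed for account {account.user_id}")

    def handle_match(account: Account) -> None:
        # Count the match; once the threshold is reached and a human reviewer
        # confirms it, disable the account and file a report with NCMEC.
        # A wrongly flagged user can appeal to have the account reinstated.
        account.match_count += 1
        if account.match_count >= REVIEW_THRESHOLD and human_review_confirms(account):
            account.disabled = True
            report_to_ncmec(account)

    handle_match(Account(user_id="example-user"))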

Apple posted a 12-page technical summary of its CSAM detection system on its website.

In addition, with Apple’s iOS 15 update, the iPhone’s Messages app will add new tools to warn children and their parents if they are receiving or sending sexually explicit photos.

“When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo,” Apple said. “As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos.”

Apple’s iOS 15 also will provide updates to Siri and Search to “provide parents and children expanded information and help if they encounter unsafe situations.” Siri and Search will intervene when users try to search for child sexual abuse material, displaying prompts that will “explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.”

The iOS 15 update is slated to arrive in the fall of 2021 and will be available on iPhone 6s and later models.
