Apple appeals against security research firm while touting researchers

FILE PHOTO: The Apple Inc. logo is seen hanging at the entrance to the Apple store on 5th Avenue in New York

By Joseph Menn

(Reuters) - Apple Inc on Tuesday appealed a copyright case it lost against security startup Corellium, which helps researchers examine programs like Apple's planned new method for detecting child sex abuse images (https://www.reuters.com/technology/after-criticism-apple-only-seek-abuse-images-flagged-multiple-nations-2021-08-13).

A federal judge last year (https://www.reuters.com/business/apple-loses-copyright-claims-lawsuit-against-us-security-bug-startup-2020-12-29) rejected Apple's copyright claims against Corellium, which makes a simulated iPhone that researchers use to examine how the tightly restricted devices function.

Security experts are among Corellium's core customers, and the flaws they uncovered have been reported to Apple for cash bounties and used elsewhere, including by the FBI in cracking the phone of a mass shooter who killed several people in San Bernardino, California.

Apple makes its software hard to examine, and the specialised research phones it offers to pre-selected experts come with a host of restrictions. The company declined to comment.

The appeal came as a surprise because Apple had just settled other claims with Corellium relating to the Digital Millennium Copyright Act, avoiding a trial.

Experts said they were also surprised that Apple revived a fight against a major research tool provider just after arguing that researchers would provide a check on its controversial plan to scan customer devices.

"Enough is enough," said Corellium Chief Executive Amanda Gorton. "Apple can't pretend to hold itself accountable to the security research community while simultaneously trying to make that research illegal."

Under Apple's plan announced earlier this month, software will automatically check photos slated for upload from phones or computers to iCloud online storage to see if they match digital identifiers of known child abuse images. If enough matches are found, Apple employees will look to make sure the images are illegal, then cancel the account and refer the user to law enforcement.
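The mechanism described above can be sketched in a few lines. This is a deliberately simplified illustration, not Apple's implementation: the real system reportedly uses a perceptual hash (NeuralHash) and cryptographic threshold schemes, whereas this sketch uses an ordinary cryptographic hash, and all names and the threshold value are invented for illustration.

```python
import hashlib

# Hypothetical threshold value; Apple has not published its actual figure here.
MATCH_THRESHOLD = 30

def photo_hash(data: bytes) -> str:
    # Stand-in for a perceptual image hash: exact-match SHA-256 digest.
    return hashlib.sha256(data).hexdigest()

def count_matches(photos, known_hashes):
    # Count how many photos slated for upload match known digital identifiers.
    return sum(1 for p in photos if photo_hash(p) in known_hashes)

def should_flag(photos, known_hashes, threshold=MATCH_THRESHOLD):
    # Only when enough matches accumulate is the account surfaced
    # for human review, per the scheme described in the article.
    return count_matches(photos, known_hashes) >= threshold
```

The threshold is the key privacy argument: a single accidental match never triggers review; only an accumulation of matches against the known-image list does.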

"'We'll prevent abuse of these child safety mechanisms by relying on people bypassing our copy protection mechanisms' is a pretty internally incoherent argument," tweeted David Thiel of the Stanford Internet Observatory.

Digital rights groups have objected to the plan because Apple has marketed itself as devoted to user privacy, and because other companies scan content only after it is stored online or shared.

One of their main arguments has been that governments theoretically could force Apple to scan for prohibited political material as well, or to target a single user.

In defending the program, Apple executives said researchers could verify the list of banned images and examine what data was sent to the company in order to keep it honest about what it was seeking and from whom.

One executive said that such reviews made the approach better for privacy overall than would have been possible had the scanning occurred in Apple's storage, where it keeps the code secret.

(Reporting by Joseph Menn and Stephen Nellis in San Francisco; Editing by Rosalba O'Brien and Sandra Maler)