UK cybersecurity chiefs back Apple’s controversial photo-scanning feature


The heads of GCHQ and the UK’s National Cyber Security Centre have said that technology giants should scan users’ phones for illegal images.

Ian Levy, the NCSC’s technical director, and Crispin Robinson, the director of cryptanalysis, claim that a controversial technology called “client-side scanning” could protect children and privacy.

Apple announced such a feature last year, designed to detect when people have child sexual abuse material on their devices, but the smartphone giant indefinitely delayed its rollout following pushback from privacy campaigners.
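Systems of this kind typically work by fingerprinting images on the device and checking the fingerprints against a database of known illegal material, so the platform never inspects the photo itself. The sketch below illustrates that idea only; it is not Apple’s actual design (which used a perceptual hash called NeuralHash plus cryptographic blinding), and all names and data here are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal images.
# In deployed systems this would be supplied by child protection
# bodies and distributed to devices in an unreadable (blinded) form.
KNOWN_HASHES = {
    # SHA-256 of the stand-in byte string b"test"
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Fingerprint image bytes. Real systems use perceptual hashes
    that survive resizing and recompression; SHA-256 is only a
    simple stand-in for illustration."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image matches the known-material database.
    In proposed designs, a match triggers review by a child
    protection organisation rather than exposing the photo to the
    platform itself."""
    return image_fingerprint(image_bytes) in KNOWN_HASHES
```

The key property critics dispute is whether the matching really stays limited: technically, the database contents determine what is flagged, which is why the EFF quote below warns the same mechanism could search for other kinds of material.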

Edward Snowden said that Apple was “rolling out mass surveillance to the entire world”, and the Electronic Frontier Foundation said the feature could easily be broadened to search for other kinds of material.


“It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses,” it said.

However, in a discussion paper published today, the heads of the UK’s security organisations said there is “no reason why client-side scanning techniques cannot be implemented safely in many of the situations one will encounter”.

"Child sexual abuse is a societal problem that was not created by the internet and combating it requires an all-of-society response”, they write.

"However, online activity uniquely allows offenders to scale their activities, but also enables entirely new online-only harms, the effects of which are just as catastrophic for the victims."

The pair argued that criticisms of the feature stemmed from flaws that could be fixed, for instance by requiring the involvement of multiple child protection organisations and by using encryption to ensure that the platform itself has no access to the photos, which would be seen only by child protection groups.

“Details matter when talking about this subject,” Mr Levy and Mr Robinson wrote. “Discussing the subject in generalities, using ambiguous language or hyperbole, will almost certainly lead to the wrong outcome.”

However, Alec Muffett, a cryptography expert who worked on Facebook’s efforts to encrypt its Messenger chatting app, told The Guardian that the paper “entirely ignores the risks of their proposals endangering the privacy of billions of people worldwide” and that it was “weird that they frame abuse as a ‘societal problem’ yet demand only technological solutions for it. Perhaps it would be more effective to use their funding to adopt harm-reduction approaches, hiring more social workers to implement them?”

Apple has already introduced message scanning on children’s iPhones in the UK to look for images that contain nudity. Apple refers to the tool as “expanded protections for children” on iOS, iPadOS, watchOS and macOS, but it is not turned on by default.