Watchdog cracks down on tech firms that fail to protect children

Photograph: Dan Peled/AAP

Technology companies will be required to assess their sites for sexual abuse risks, prevent self-harm and pro-suicide content, and block children from broadcasting their location, after the publication of new rules for “age-appropriate design” in the sector.

The UK Information Commissioner’s Office, which was tasked with creating regulations to protect children online, will enforce the new rules from autumn 2021, after a one-year transition period. After that, companies that break the law could face sanctions comparable to those under GDPR, including fines of up to £17m or 4% of global turnover.

Companies that make services likely to be accessed by a child will have to take account of 15 principles designed to ensure their services do not cause harm by default. Those include:

  • a requirement to default privacy settings to high, unless there is a compelling reason not to;

  • orders to switch off geolocation by default, and to turn off visible location tracking at the end of every session;

  • a block on using “nudge techniques to lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections”;

  • a requirement on sites to uphold their stated terms, policies and community standards.

Elizabeth Denham, the information commissioner, said: “Personal data often drives the content that our children are exposed to – what they like, what they search for, when they log on and off and even how they are feeling.

“In an age when children learn how to use an iPad before they ride a bike, it is right that organisations designing and developing online services do so with the best interests of children in mind. Children’s privacy must not be traded in the chase for profit.”

Andy Burrows, the NSPCC’s head of child safety online policy, said: “This transformative code will force high-risk social networks to finally take online harm seriously and they will suffer tough consequences if they fail to do so.

“For the first time, tech firms will be legally required to assess their sites for sexual abuse risks, and can no longer serve up harmful self-harm and pro-suicide content. It is now key that these measures are enforced in a proportionate and targeted way.”

The code is legally backed by a requirement in the Data Protection Act 2018 for the ICO to prepare a code of practice containing guidance “on standards of age-appropriate design of relevant information society services which are likely to be accessed by children”. Earlier drafts had faced criticism over the risk that the code could force the entire internet to be made child-safe, owing to ambiguity over whether any given site is likely to be accessed by children.

In the final version of the code, the ICO says it will take a “commonsense” approach to the question, but notes that “if your service is the kind of service that you would not want children to use in any case, then your focus should be on how you prevent access.

“If your service is not aimed at children but is not inappropriate for them to use either, then your focus should be on assessing how appealing your service will be to them.”

The initial focus of the code is likely to be large social media companies, including YouTube, TikTok and Snapchat, all of which have significant numbers of child users and, until now, few legal restrictions on how those users can be treated.

But the changes have not eased all fears. Dom Hallas, the co-founder of Coadec, the UK’s coalition of tech startups, called the rules a “textbook example of bad regulation that will entrench big companies”.

“The practical impact of the code will be that thousands of tech companies, from e-commerce to maps, have to build multiple versions of the same product with different sets of rules. Startups can’t afford to do this but big tech can.

“Many will say this code is a victory for kids but it will in fact restrict the startup services available to under-18s and create an internet for children designed by tech giants.”

In the US, under-13s must already be given special treatment under the Children’s Online Privacy Protection Act (Coppa), which led to trouble for YouTube after a Federal Trade Commission settlement last year.