
‘Instagram Kids’ app stopped amid body image and ‘addiction’ controversy

Instagram has defended shelved plans to launch a version aimed at ‘tween’ users (Associated Press)

Instagram has paused its controversial “Instagram Kids” app, which would have allowed children as young as 10 to join the platform.

“We’re pausing our project to build an Instagram experience for tweens,” Instagram head Adam Mosseri tweeted, saying the app would have been for children aged between 10 and 12, with parental oversight.

However, after “the project leaked way before we knew what it would be”, Mr Mosseri said, “people feared the worst, and we had few answers at that stage.”

In the United States, attorneys general from 44 states had asked the company to drop the plans, saying that “use of social media can be detrimental to the health and wellbeing of children, who are not equipped to navigate the challenges of having a social media account” and that there were “myriad other – and safer – ways for young children to connect with family and friends”.

Mr Mosseri added that Facebook, which owns Instagram, will continue to work on “why this project is valuable”. He continued that this does not “change the status quo” of children using smartphones, lying about their age, and creating accounts on Instagram and other apps such as YouTube and TikTok.

“I have to believe parents would prefer the option for their children to use an age-appropriate version of Instagram – that gives them oversight – than the alternative. But I’m not here to downplay their concerns, we have to get this right,” Mr Mosseri wrote.

The announcement follows a series of reports, based on internal Facebook research, that Instagram knew its app was making teenage girls feel worse about their bodies and that they often felt ‘addicted’. In 2019, the father of teenager Molly Russell, who took her own life, accused the app of “helping to kill” his daughter. One week later, Instagram said it would ban graphic images of self-harm from the app.

“Teens told us that they don’t like the amount of time they spend on the app but feel like they have to be present,” Facebook’s internal documents allegedly said regarding Instagram use. “They often feel ‘addicted’ and know that what they’re seeing is bad for their mental health but feel unable to stop themselves.” It was also found that selfies that have been filtered and shared in stories made users feel worse.

In a blog post about the findings, Instagram said: “Social media isn’t inherently good or bad for people. Many find it helpful one day, and problematic the next. What seems to matter most is how people use social media, and their state of mind when they use it.” In a blog post earlier on Monday, the company pushed back against media criticism, arguing that teens struggling with loneliness, anxiety, sadness and eating issues felt better when using Instagram.

Neither Facebook nor the Wall Street Journal, which originally reported the damning research, has released the data on which the claims are based, making it difficult for external academics and journalists to examine them.
