How TikTok bombards young men with misogynistic videos

An Observer investigation has revealed how TikTok is promoting misogynistic content to young people despite claiming to ban it.

Videos of the online personality Andrew Tate, who has been criticised by domestic abuse campaigners for normalising extreme and outdated views about women, are among those pushed by the algorithm to users via the curated For You homepage.

We conducted an experiment to get an insight into what young people are being shown on the platform, which allows users to join from the age of 13.

To ensure the findings wouldn’t be influenced by our previous search history, we set up a new TikTok account for an imaginary teenager, using a fake name and date of birth.

At first, the 18-year-old’s account was shown a mixture of material including comedy clips, dog videos and discussions about men’s mental health.

But after watching videos aimed at male users – including a clip from the Alpha Blokes podcast and a clip of a TikToker discussing how men “don’t talk about their feelings” – the algorithm began suggesting more content that appeared to be tailored for men.

Although we did not “like” or search for any content proactively, the suggestions included videos of Andrew Tate. One, from a copycat account using Tate’s name and picture and captioned the “harsh reality of men”, appeared to blame feminism for making men miserable, adding that the “majority of men have no money, no power, no sex from their wife” and that their lives “suck”.

After watching two of his videos, we were recommended more, including clips of him expressing misogynistic views. The next time the account was opened, the first four posts were by Tate, from four different accounts.

The algorithm also suggested videos from Dr Jordan Peterson, a Canadian psychologist known for his rightwing views, as well as men’s coaching programmes and videos from men’s rights activists.

But the Tate content was by far the most widespread. When we opened the app again a week later, the account was once more flooded with Tate content, with eight of the first 20 videos featuring him.

The clips included a video where he says most men’s lives suck because they have “no power” and “no sex from their wife”, and another where he describes his girlfriend as “very well trained”.

In another, he says people seeking mental health support are “useless”. He says: “If you’re the kind of person who feels like you need therapy, you need someone to talk to, do you know what you are? You’re useless. Because in the harshest realities of this cold world there are people in Syria whose entire families have been blown to fuck with a bomb from the sky.”

Another video recommended by the algorithm derided people for wearing masks during the pandemic, saying they were either “idiots or cowards”, while claiming that by choosing not to wear one, he showed “bravery and balls”.

Experts have raised concerns about the spread of content featuring Tate on the platform, where videos of him have been watched 11.6 billion times.

Callum Hood, head of research at the Center for Countering Digital Hate, said: “The dangerous thing is that it is very eye-catching content, and the TikTok algorithm in particular is so aggressive that you only need to pause for a few moments before it will begin to recommend similar content to you again and again.”

TikTok said: “Misogyny and other hateful ideologies and behaviours are not tolerated on TikTok, and we are working to review this content and take action against violations of our guidelines. We continually look to strengthen our policies and enforcement strategies, including adding more safeguards to our recommendation system.”