How YouTube led me down a wormhole of neo-Nazi conspiracy theories

A love of folk music quickly led to one YouTuber being offered videos of neo-Nazis. - Getty Images Contributor

How long does it take for supremacists to target you on YouTube? For one user, it was immediate. “I once clicked on some old German folk songs, and was then fed folk song videos of neo-Nazis,” the user explains.

“I did not regret having clicked on the politically correct folk songs... but rather that YouTube apparently thinks that those who listen to such songs are highly likely to be considered close to National Socialism.”

This is one of 28 crowdsourced stories from people who claim their use of YouTube backfired, gathered this week by Mozilla, the non-profit behind the Firefox browser.

All of them argue the same thing: YouTube's algorithm is still broken, and it leads users down disturbing wormholes.


The issues are wide-ranging. According to the report, a horse-riding enthusiast was bombarded with animal porn.

In a separate incident, an eight-year-old girl was shown rotting skin and disfiguration after watching inspirational videos.

Meanwhile, a documentary about cod liver oil morphed into videos promoting a government conspiracy that entailed controlling people’s minds through the medium of fluoridated water.

The technology that YouTube uses to decide what people want to see on its site has caused executives many headaches in the past. The Google-owned company has been accused of promoting far-Right conspiracy theorists, rewarding extremism and refusing to take responsibility for the toxic content that can appear on its pages.

Any user could stumble across offensive content

Recently, even one of Google’s former directors took aim at the company after his daughter was “recommended” an explicit video masquerading as a popular children’s fairy tale.

Yet still, users complain that their searches lead them down a winding path towards offensive and inappropriate videos.

“One time I was looking for history videos about the Trojan War. There is an event where Achilles gets shot with an arrow in his heel, and that is where the word ‘Achilles tendon’ comes from,” a YouTuber says.

“Ever since, I get recommendations about high heels, videos of girls showing how to walk in heels, or fetish videos of girls walking in high heels — even stripper heels! All that because I love ancient history! Not cool.”

Not even the youngest viewers are spared. YouTube's terms and conditions set its minimum age at 13 and it has a separate section, YouTube Kids, for children, but content aimed at very young children, such as cartoons, nursery rhymes and "unboxing" videos, is very popular on the main site. Parents are often unaware of the dangers of leaving their children's viewing unrestricted, sometimes with dire consequences.

"When my son was preschool age, he liked to watch 'Thomas the Tank Engine' videos on YouTube," says one parent. "One time when I checked on him, he was watching a video compilation that contained graphic depictions of train wrecks."

In fact, children using the main site have become targets for paedophiles, who use the videos children post of themselves to share inappropriate content and comments with each other.

Earlier this year a YouTuber exposed disturbing videos featuring young children, which were viewed millions of times. Comments posted below the videos included messages such as "I love you", "Wow, so beautiful girls", "You're pretty and your girlfriends are nice too. Let's be friends", and "I would love to see more".

Brands including Nestle, AT&T and Epic Games, the company behind the popular video game Fortnite, pulled their advertising from the site over concerns that their brands were appearing alongside the videos.

In response, YouTube blocked comments on tens of millions of videos featuring children that could be "at risk of attracting predatory behaviour".

UFOs and anti-LGBT hatred

Until now, YouTube has had a rather reactive approach to bad videos surfacing on its site. It has tried blocking comments on potentially harmful content, added ‘fact-checking’ boxes on topics prone to hoaxes and performed sweeping purges on specific content.

Yet other decisions it has made have been questioned. The site came under fire for refusing to remove conspiracy theory videos questioning whether the 9/11 terror attacks happened because they were not deemed “harmful”.

Some users claim that conspiracy theories on the site are being linked to serious content in confusing and inappropriate ways.

"I'm a teacher and I watched serious documentaries about Apollo 11. But YouTube's recommendations are now full of videos about conspiracy theories: about 9/11, Hitler's escape, alien seekers and anti-American propaganda," says one user.

Another user said that his father "thought aliens were already living among us in disguise", that he could eliminate his heating bill with a new free energy device, and that UFOs were all over the place. "He showed me YouTube videos that 'proved' it."

The video-streaming site has also been accused of curbing the visibility of LGBT content creators and their videos: labelling them "sensitive" or "mature", restricting them from appearing in search results or recommendations, and doing little to filter the harassment and hate speech in the comments section. A group of LGBT video-makers has filed a lawsuit against the company for discrimination.

One user claimed that searches for positive LGBT content result in "a barrage of homophobic, right-wing recommendations" that could be harmful to people still figuring out their identity.

"I used to occasionally watch a drag queen who did a lot of positive affirmation/confidence building videos and vlogs," says another YouTuber. "My recommendations and the sidebar were full of anti-LGBT and similar hateful content. It got to the point where I stopped watching their content and still regretted it, as the recommendations followed me for ages after."

But YouTube is not the only technology company afflicted by a worrying amount of inappropriate content. Earlier this year, The Telegraph uncovered a battle of conspiracies and scams on Instagram, which can link users from friends’ holiday photos to disturbing footage and fake information within seconds.

Charities have warned that children as young as eight are at risk of being targeted through the live comment functions of popular video-streaming sites such as TikTok.

Meanwhile Facebook's plans to encrypt messaging will put children and vulnerable people at risk, according to the NSPCC, because it will be virtually impossible to track inappropriate content being shared online.

YouTube is among the main companies targeted by a wide-reaching FTC investigation and follow-on legislation, which could trigger some major changes in the algorithms used to monitor, surface and recommend content. But for the sake of everyone using the site, new rules can't come fast enough.