
WSJ report uncovers 'dark' content on Instagram Reels

A new report by The Wall Street Journal reveals an unsettling trend on Instagram's Reels feature, with the app's algorithm pushing "salacious content" tied to preteen users alongside well-known brands' advertisements. Wall Street Journal reporter Jeff Horwitz, one of the journalists behind the story, joins Yahoo Finance Live to discuss the disturbing details of Reels and how its recommendation algorithm appears to work.

Horwitz discovered "dark" content that included known victims of exploitation being used in advertisements for other websites and apps. Horwitz says "the algorithm is adapting to the network," noting that it's "very easy" for Instagram to pick up on preferences from burner accounts and that it appears to adjust quickly. According to the report, safety staffers inside Instagram had raised concerns about this before the product launched.

For more expert insight and the latest market action, click here to watch this full episode of Yahoo Finance Live.

Video transcript

RACHELLE AKUFFO: The idea that these sorts of results came up after just a day of this content-- how does that speak to how the algorithm is moving and adapting to some of this harmful content, given that Meta is saying, look, we're taking some of these things down as fast as we can? But considering that this happened in just a day, how do you-- how do you marry the two?


JEFF HORWITZ: So the algorithm isn't really adapting to the content. The algorithm is adapting to the network, right? And one thing that research has demonstrated is that niche communities have a really strong algorithmic pull. So essentially, a lot of the users who were consuming this sort of stuff happened to be burner accounts, right? People don't follow child sexualization content from their personal email and Instagram account.

So it's actually very easy for the algorithm to determine that as soon as an account kind of touches this network, it probably wants the same thing that Instagram is serving to that network, like, right away. And so it's a very easy signal, unfortunately. In other words, there's a lot of reason to believe that this is going to be a stronger recommendation set than it would be if you were into cars or football.

AKIKO FUJITA: Jeff, your report points to the fact that this was known among some employees within the company, that it had been raised as well. How did the company respond internally? What did you hear?

JEFF HORWITZ: So this was, I think, a concern in particular about recommending videos of kids. A safety staffer who spoke with us-- actually, a couple of them did-- told us that this had been raised before the product's launch, and that the suggestion was maybe they shouldn't recommend videos of kids at all because they weren't going to be able to stop this. And they obviously did not follow that advice.

And so this has been-- I mean, I think it's kind of a known thing that recommender systems, algorithms like what Reels uses, will personalize content without regard to whether that content should be getting personalized. In the instance of people who want to consume videos of kids dancing in their underwear, arguably, you don't want to be providing that service. It's not even arguably.

But they have been, and clearly they haven't been able to stop it. We approached them in August about this to let them know it was happening. We approached advertisers in October. We're talking Disney, Walmart, Pizza Hut, a whole bunch of brand-name, blue-chip entities. And as per the Canadian Centre for Child Protection, it still has not stopped.