TikTok’s algorithm misreads creator’s pro-Black profile as a threat


TikTok blocked users of its Creator Marketplace from using the word “black” and phrases like “Black Lives Matter” in their bios, as the algorithm flagged them as “inappropriate content”.

Creator Ziggi Tyler discovered the issue while attempting to update his bio: the phrases “Black”, “Black Lives Matter”, “Black people”, “Black success”, “Pro-Black” and “I am a Black man” were all rejected, while “Pro-white” and “supporting white supremacy” were accepted by TikTok’s algorithms without issue.

TikTok’s Creator Marketplace, which aims to connect creators with brands for sponsorship deals, is currently in invite-only beta testing.


TikTok said the app mistakenly flagged the phrases because its hate speech detector associated the word “black” with “audience” – a word that contains the string “die”.

“Our TikTok Creator Marketplace protections, which flag phrases typically associated with hate speech, were erroneously set to flag phrases without respect to word order,” a TikTok spokesperson said in a statement.

“We recognize and apologize for how frustrating this was to experience, and our team has fixed this significant error. To be clear, Black Lives Matter does not violate our policies and currently has over 27B views on our platform.”
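To see how such a rule can misfire, the sketch below shows a hypothetical order-insensitive filter of the kind TikTok’s statement describes: it flags any text containing a pair of fragments regardless of word order or word boundaries, so “audience” trips a “die” fragment. The rule list, function name and examples are illustrative assumptions, not TikTok’s actual code.

# Illustrative sketch only: a naive filter that, like the behaviour TikTok
# describes, matches flagged word pairs without respect to word order or
# word boundaries. The rule list and names here are hypothetical.

FLAGGED_COMBINATIONS = [
    ("black", "die"),   # intended to catch explicit hate phrases
]

def is_flagged(bio: str) -> bool:
    """Return True if the bio contains every fragment of any flagged
    combination, ignoring word order and word boundaries."""
    text = bio.lower()
    return any(all(fragment in text for fragment in combo)
               for combo in FLAGGED_COMBINATIONS)

print(is_flagged("Black audience"))    # True  - false positive: "audience" contains "die"
print(is_flagged("general audience"))  # False - no "black", so the pair never matches

A filter that tokenised the bio into whole words and matched flagged phrases in order would not trip on “audience”, which appears to be the behaviour TikTok says it has now restored.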

The issue is the latest in a series of examples of automated systems working against minorities. The head of Instagram, Adam Mosseri, said in June 2020 that the company needed to better support the black community and was looking into how its “policies, tools, and processes impact black people”, including its own algorithmic bias.

Algorithmic moderation has also seen posts from Palestinians about violence in Gaza taken down on Facebook, Instagram and Twitter, drawing criticism over the black-box nature of these systems.

Outside social media, other algorithms – including facial recognition systems – routinely fail to properly identify the faces of people of colour, who are already disproportionately targeted by police. In February 2019, Nijeer Parks spent 10 days in jail and paid $5,000 (£3,627) to defend himself after being misidentified by facial recognition software and subsequently arrested by police.

“Regardless of what the algorithm is and how it picked up, somebody had to program that algorithm,” Tyler told Recode. “And if [the problem] is the algorithm, and the marketplace has been available since [2020], why wasn’t this a conversation you had with your team, knowing there have been racial controversies?”
