'Racist' passport photo system rejects image of a young black man despite meeting government standards

Joshua Bada's online passport photo application - PA
A facial detection system used by the Government rejected a passport photograph of a young black man because it mistook his lips for an open mouth.

Passport photographs require the subject to have a plain expression with their mouth closed.

Joshua Bada, 28, shared images online of the rejection of his passport photo by the government system.

"When I saw it, I was a bit annoyed but it didn't surprise me because it's a problem that I have faced on Snapchat with the filters, where it hasn't quite recognised my mouth, obviously because of my complexion and just the way my features are," he said.

Mr Bada was forced to explain to the gov.uk website that he still wished to submit the photograph.

He explained in a comment box on the website: "My mouth is closed, I just have big lips."

"After I posted it online, friends started getting in contact with me, saying it's funny but it shouldn't be happening," he said.

The Race Equality Foundation said it believes the system was not tested properly to see if it would actually work for black or ethnic minority people, calling it "technological or digital racism".

Samir Jeraj, the charity's policy and practice officer, said: "Presumably there was a process behind developing this type of technology which did not address issues of race and ethnicity, and as a result it disadvantages black and minority ethnic people."

This is not the first example of biased algorithms failing to properly recognise images of black and ethnic minority people.

Earlier this week, an online image database which uses 14m images as a training guide for artificial intelligence systems was found to have classified photographs of black people with labels such as "blackamoor", "negroid" or "black person", while results for Caucasian faces varied more widely, including "researcher", "scientist" or "singer".