Will AI make the hiring process fairer – or will it worsen discrimination?

AI can't tell you whether you will enjoy working with someone. Photo: Getty (Paperkites via Getty Images)

AI now plays a major role in recruitment. Some employers use AI tools to assess job candidates’ qualities via skill and personality assessments and match them to vacancies. Others use AI to screen job seekers and decide who reaches the interview stage – and who doesn’t.

And while these tools appear to make the hiring process quicker and smoother, relying too heavily on AI in recruitment could lead to big problems – like undermining fair hiring practices and exacerbating discrimination.

AI’s track record in hiring is patchy. In 2018, Amazon was forced to scrap an experimental recruitment tool that used AI to evaluate job candidates after it was found to favour men: the historical hiring data used to train it was dominated by male applicants, so the system learned to downgrade and reject women.


In 2020, HireVue, the maker of another popular AI hiring tool, removed visual analysis from its assessment models after a backlash from a prominent civil rights group in the US.

At the time, HireVue's chief data scientist, Lindsey Zuloaga, said in a statement that the assessment models were designed to correlate an applicant’s interview with how they would perform in a specific role, and that they were continuously validated and tested. That research, Zuloaga said, led the company to decide in 2020 to stop using visual analysis in its pre-hire algorithms. "We recommend and hope that this decision becomes an industry standard."

Read more: Why pushing employee resilience may do more harm than good

In a study by the US National Bureau of Economic Research, 83,000 fake applications were submitted to entry-level job openings at 108 Fortune 500 companies. Applications with distinctively Black names – such as Ebony, Darnell and Hakim – were less likely to receive a response from employers than those with white-sounding names such as Heather or Bradley.

Bad data, bad outcomes

Della Judd, a career coach and author of the book Get the job you really want in a post-pandemic world, says AI algorithms are only as good as the data they are given and the programming they receive. Therefore, it’s easy for algorithms to reproduce bias from the real world.

“If the data has flaws – or is already showing trends of bias – then the output data will likely perpetuate those biases you are looking to avoid,” she says. “Great care will be needed to ensure that the right questions are asked of the AI tool and that there are checks and measures in place to review the output.”
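
For illustration only – the article doesn’t describe a specific method – a minimal sketch of one such “check and measure” is below: comparing the rate at which a screening tool advances candidates from different groups, using the 80% “four-fifths” threshold common in adverse-impact analysis. The group labels and data are hypothetical.

```python
# Minimal sketch of an output check on an AI screening tool: compare
# per-group selection rates and flag groups falling below 80% of the
# best-performing group's rate (the "four-fifths rule"). Illustrative only.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: list of (group, advanced) pairs, advanced is True/False."""
    totals, advanced = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        if ok:
            advanced[group] += 1
    return {g: advanced[g] / totals[g] for g in totals}

def adverse_impact_flags(rates, threshold=0.8):
    """Flag groups whose selection rate is below threshold * best rate."""
    best = max(rates.values())
    return {g: r / best < threshold for g, r in rates.items()}

# Hypothetical screening outcomes
outcomes = [("group_a", True), ("group_a", True), ("group_a", False),
            ("group_b", True), ("group_b", False), ("group_b", False)]
rates = selection_rates(outcomes)
print(rates)                        # per-group pass rates
print(adverse_impact_flags(rates))  # True where a group may be disadvantaged
```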

There is also a need for human intervention in the recruitment process, Judd adds. “Some things cannot be measured by data alone, and some data on certain soft skills or attributes are not available to be tracked and measured,” she says.

Bias in the screening process

Studies show that AI does perpetuate biases relating to gender, age, race, disability, and dialect or regional differences in speech. Women often downplay their skills on CVs, while men tend to exaggerate theirs and include phrases tailored to the job opening. Men’s CVs may therefore stand out to an algorithm – regardless of their actual relevant skills and experience.

Charlotte Schaller, partner and head of Aon Assessment UK at the management consulting firm Aon, says AI gives people the opportunity to cheat the system. “Candidates may try to give the answers they believe to be correct,” she says.

Read more: Flexible Working Bill - What does it mean for workers and employees?

“Therefore, implementing systems to verify the accuracy and authenticity of candidate data is key. Don’t just rely on one assessment method, implement random further testing and also have human oversight. AI can assist in reducing bias and streamlining the recruitment process, but human judgement and expertise is crucial.”

Used with caution, AI can help make the hiring process more straightforward. It allows recruiters to sift through applicants quickly and identify their skills. However, it’s important to remember that AI can lead to unfair judgements and biased decisions – much like humans.

Keith Spencer, a career expert at FlexJobs, says businesses should consider the diversity of backgrounds and points of view of those who created and designed the AI tool. “Before you use any AI in recruitment, it’s essential to make sure your HR team and organisation have thoroughly audited what will be measured during recruiting practices,” he says. “As is the case with any AI tool, human oversight is heavily required.”

AI isn’t enough to fix the problem

Crucially, using AI alone isn’t enough to fix a problem that runs far deeper – and companies need to actively work on reducing bias when hiring. “Training of recruitment managers and hiring managers is essential,” says Judd. “Unconscious bias is something we all have and if we are aware of it then we can start to make amends for those traits. Good diversity and inclusion training is a great place to start.”

Introducing anonymous CVs or applications is another helpful approach, Judd says. “This was something that was successful where I used to work. Names, locations and education history were removed before the hiring manager saw the CV, so that the skills and experience of the person were the only thing to assess.”
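
As a rough illustration of the kind of redaction Judd describes – and assuming applications arrive as structured records – a minimal sketch might look like this; the field names are hypothetical.

```python
# Minimal sketch of CV anonymisation: strip identifying fields before a
# hiring manager reviews the application. Field names are hypothetical.
FIELDS_TO_REDACT = {"name", "location", "education_history", "email", "photo_url"}

def anonymise_application(application: dict) -> dict:
    """Return a copy of the application with identifying fields removed,
    leaving only skills- and experience-related data to assess."""
    return {k: v for k, v in application.items() if k not in FIELDS_TO_REDACT}

candidate = {
    "name": "Jane Doe",
    "location": "Leeds",
    "education_history": ["University X"],
    "skills": ["Python", "project management"],
    "experience_years": 6,
}
print(anonymise_application(candidate))
# {'skills': ['Python', 'project management'], 'experience_years': 6}
```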

Finally, if AI does play a part in the process, it’s essential to track diversity data. “You can insist on more diverse application pools and you can track how well this is working by looking at your stats for various minority areas,” she adds. “Once you track the data you can start to make adjustments. You might find that you still have a low level of women applying for certain areas of your business, so reviewing your job descriptions to ensure they have a wider appeal can then be done.”
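
A minimal sketch of that kind of tracking – assuming candidates self-report a demographic group and each hiring stage is logged – might look like this; the stage and group labels are purely illustrative.

```python
# Minimal sketch of diversity tracking across the hiring funnel: count
# candidates per (self-reported) group at each stage so the effect of
# changes, such as reworded job descriptions, can be measured over time.
from collections import Counter

def stage_breakdown(candidates, stage):
    """Count candidates per group who reached a given stage."""
    return Counter(c["group"] for c in candidates if stage in c["stages_reached"])

candidates = [
    {"group": "women", "stages_reached": ["applied", "interview"]},
    {"group": "women", "stages_reached": ["applied"]},
    {"group": "men",   "stages_reached": ["applied", "interview", "offer"]},
    {"group": "men",   "stages_reached": ["applied", "interview"]},
]

for stage in ["applied", "interview", "offer"]:
    print(stage, dict(stage_breakdown(candidates, stage)))
```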

Watch: UK study points to benefits of 4 day work week
