Workplace Wellbeing: Is AI falling down on the job?

Companies increasingly use artificial intelligence to screen job applicants and make initial selections quickly. However, experts warn that qualified people could be passed over because of inbuilt bias.

IF Steve Jobs were to apply for a position today, he would likely find it difficult to get an interview. The CV of the entrepreneur, businessman, and co-founder of Apple would probably be cast aside by the artificial intelligence (AI) tools now used to recruit and select employees.

So says Hilke Schellmann, an award-winning American journalist and author of The Algorithm, which explores how AI is increasingly being used to hire employees.

“Jobs didn’t complete college and sometimes had long breaks in his employment history,” says Schellmann.

“These factors could lead CV screeners to reject his job application instantly.”

A 2023 survey of 1,700 managers in Ireland, Britain, and Germany by the tech company Greenhouse Software found that 48% use CV screeners to search CVs for specific keywords to determine which candidates to call for an interview.
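To give a rough sense of how such keyword-based screening might work, here is a minimal illustrative sketch in Python. It assumes a simple keyword-matching approach; the commercial tools referred to in the survey are proprietary and considerably more sophisticated, and the keywords, candidate names and threshold below are hypothetical.

```python
# Minimal, illustrative keyword-based CV screener (hypothetical example;
# real commercial screening tools are proprietary and far more complex).

REQUIRED_KEYWORDS = {"python", "project management", "stakeholder"}  # example keywords


def score_cv(cv_text: str, keywords: set[str] = REQUIRED_KEYWORDS) -> float:
    """Return the fraction of required keywords found in the CV text."""
    text = cv_text.lower()
    hits = sum(1 for kw in keywords if kw in text)
    return hits / len(keywords)


def shortlist(cvs: dict[str, str], threshold: float = 0.5) -> list[str]:
    """Return candidates whose CVs match at least `threshold` of the keywords."""
    return [name for name, text in cvs.items() if score_cv(text) >= threshold]


# A candidate can be dropped simply because their wording differs from the
# expected keywords, which is how qualified people can be screened out.
cvs = {
    "Candidate A": "Python developer with project management and stakeholder experience.",
    "Candidate B": "Led software delivery and coordinated cross-functional teams.",
}
print(shortlist(cvs))  # ['Candidate A']
```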

Some 43% of managers then invite the selected candidates to one-way interviews. In this process, employers set questions, the candidates record their answers, and the software analyses their answers to decide which candidates proceed to the face-to-face interview stage.

Hilke Schellmann: “I have found gender, race and disability bias in many [AI] tools.” Pic: Jennifer S Altman

Schellmann sees the attraction of such AI tools.

“Online job platforms and digitisation have made it so easy for job seekers to apply for positions that large companies can get millions of job applications per year,” she says. “These tools allow them to process the applications efficiently while saving on labour costs. They also promise they are bias-free, which allows them to pick the most qualified candidates.”

Dr Na Fu, a professor of human resource management at Trinity College Dublin, reports that these tools are transforming the recruitment sector.

“Unilever used to recruit from 840 US colleges, but AI allows it to recruit from triple that number,” she says. “The Hilton Group has said that AI has made its job recruitment process 90% faster. If these tools mean you can recruit from a wider talent pool while saving time and money, it’s no wonder employers are so eager to use them.”

But Schellmann isn’t so sure that these tools deliver on their promise. She argues that some are based on poor science.

“For example, some measure candidates’ facial expressions during one-way interviews,” she says. “The idea that our innermost thoughts are revealed through our facial expressions was shown to be a pseudoscience decades ago. There is no evidence to show what facial expressions are causally related to being able to do one’s job successfully.”

Schellmann also believes the algorithms underpinning many of these tools are biased because the data they were built upon reflects the preconceptions of those who designed them.

“I have found gender, race and disability bias in many tools,” she says. “For example, one asked job seekers to hit the spacebar as fast as possible. Someone with a motor disability might be unable to fulfil this particular task, which doesn’t seem related to anything they would ever be asked to do in the workplace.”

Professor Na Fu, TCD

Battling in-built bias

Fu acknowledges that bias is a concern, citing Amazon’s failed attempt to use AI in its recruitment process. Its algorithm used CVs submitted to the company over a 10-year period to learn how to spot the best candidates. But it didn’t factor in the low proportion of women who were working for the company at that time.

“This meant the algorithm was fed with male-dominant recruitment words such as ‘competitive’ and ‘leader’, making it less likely to select women,” says Fu. “Amazon eventually had to abandon that AI recruitment tool.”

However, humans can be equally biased when left in charge of recruitment. In the Greenhouse Software survey, 56% of HR managers said they would be more likely to hire someone from the same background as them.

The difference is that there’s only so much impact a single HR manager can have on an organisation’s hiring patterns. An algorithm used across the organisation or multiple organisations is much more influential. It can affect hundreds of thousands of prospective candidates.

Fu is part of a team at Trinity College that is collaborating with teams at Maynooth University, University College Dublin and the Royal Melbourne Institute of Technology in Australia to reduce the risk of in-built bias in AI.

“We need stakeholder co-creation, where all the different stakeholders are involved in the design stage of algorithms, not just the tech designers,” she says. “If employers, managers and employees are involved in continuous dialogue, AI tools will become more ethical and reliable.”

Managing AI safely

In the meantime, Mary Rose Lyons, the founder of the AI Institute, suggests ways for employers to safely incorporate AI into their recruitment processes.

“Many tools can improve workflow by automating some tasks and getting better-quality output in others,” she says. “For example, recruiters can use ChatGPT to write job descriptions. If we learn how to use these tools constructively, they can provide us with shortcuts that save us time and labour.”

She defends the use of CV screeners.

“If you receive hundreds or even thousands of job applications, there’s no way a human will be able to assess every single one of them,” she says. “A CV screener can be formatted to do it for you. You just need to be careful in how you instruct it.”

It’s all about ensuring that efficiency doesn’t come at the expense of fairness, according to Schellmann, who advises taking a cautious approach to AI.

“Think long and hard about why you want to invest in particular tools,” she says. “Ask yourself what problem you’re trying to solve and if this tool will help. Then find out how the tool was designed and tested to ensure it doesn’t discriminate. I’d also urge HR managers to test the tools themselves to see how they perform.”

Fu recommends experimenting with AI.

“AI is there to stay, and we shouldn’t be afraid of it,” she says. “Instead, we should experiment to see what tools work best for us. Compare a job description written by ChatGPT with the one you wrote. If it’s just as good, perhaps you can save time by getting AI to do that task from now on.”

Lyons reassures those who worry that AI tools might work against them, just as they might have discriminated against Steve Jobs.

“You too can leverage this technology to your advantage,” she says. “ChatGPT can hone your CV so that it matches the job description. It can suggest commonly asked interview questions and possible responses. There are so many tools out there to help you — just take a look at www.theresanaiforthat.com.”

We’re in the early stages of incorporating AI into recruitment, and some problems need to be addressed.

“We don’t want to go back to using human hiring managers as they have a lot of unconscious bias, but nor do we want to build tools in which these human biases are reproduced so that they discriminate at scale,” says Schellmann. “We have to start building better, more transparent tools.”

While we wait for those to be developed, the consensus is that employers should proceed with care, experimenting with the available tools to make sure they work as intended.

As for prospective employees? They can pitch the AI tools available to them against the AI software used by recruiters, levelling the playing field in their favour. You can be sure that’s what Steve Jobs would have done.
