
HR’s fears over use of AI in hiring mount


How concerned should HR be about AI apps such as ChatGPT and its competitors? And should they adapt their policies accordingly?

Billionaire Elon Musk describes it as “one of the biggest risks to the future of civilisation,” capable of unleashing “profound risks to society and humanity.”

For HR directors, ChatGPT – the latest generative AI chatbot, and the fastest-growing app ever – is slightly less apocalyptic, but no less worrying. Capable as it is of crafting appealing cover letters in seconds, or assisting job seekers doing remote assessments, it is having a scything impact, and HRDs are naturally worried.

“I can totally see how ChatGPT and other AI applications can interfere with online testing, and assessing for things like written English,” says Nicola Jackson, co-founder and HRD of tech start-up DemoTime.

“For specific roles where knowing what someone’s actual skills are is important, we would absolutely have to alter our policies and revert back to more traditional things like face-to-face interviews and invigilated, on-site assessments.”


More on AI in HR:

HR and AI: How can HR use AI effectively and ethically?

AI in the workplace guaranteed to go wrong, says MP

Just Eat couriers fired by AI, report finds


With recruitment already criticised for being too slow, the notion that HRDs will now have to add extra checks and processes to accommodate AI must be a wearying one.

“HRDs are having to rethink how they can mitigate against bad hiring by falling back on live interviews or maybe reinstituting trial periods so people can actually prove they can do what they say they can do,” says Clare Walsh, director of education for the Institute of Analytics and AI Society.

“The problem HRDs face is that AI algorithms know how to present the very best version of a person – so validating people is going to be much more problematic.”

As Neil McGough, chief product officer at assessment tool provider Learnosity, puts it: “There’s real fear among HRDs about how they’re keeping up with what candidates have on tap. There’s a chunk of fear about how we get ahead of it before things become a problem.”

So what might these problems be? There is anxiety that a brilliant-looking application may not be from who it seems.

“While cover letters don’t form part of our application process, I can see how it may worry HRDs, and we’re pre-empting this by getting people to talk about how they would apply their skills,” says Charlotte Gregson, country head of Malt, which matches freelancers to jobs for clients. She now insists candidates take a 30-question workplace psychology test, which assesses what people are like to work with.

To demonstrate this fear, Chris Oglethorpe, HRD at law firm Gowling WLG, devised a test specifically for HR magazine. He asked ChatGPT to write an application letter for a trainee lawyer position with a special interest in intellectual property.

“Within seconds we had a letter covering their capability, recent cases, and their interest,” he says. “At first glance it was extremely credible. Our talent team may not have the time to minutely validate everything that is being said.

“If we relied on process filtering alone, you would absolutely run the risk of recruiters thinking they have people with skills that they may not.”

McGough says HRDs will likely have to investigate running assessment tests with online proctoring – whereby someone physically watches candidates take a test via a webcam – or using locked browsers, where an assessment is run on a browser the applicant has to download.

This prevents candidates from running any other applications alongside the test. He even suggests changing tests entirely: asking candidates to prove the process for doing something, rather than demonstrate knowledge that can be looked up.

“One process HRDs can change is adding recency,” says Walsh. “By asking someone to write about something from the last few weeks, you can more or less guarantee ChatGPT hasn’t caught up with this yet.”

Other innovations are also starting to come through. “There are AI-detection sites, like ZeroGPT, that can detect the use of AI systems,” says Rupert Deering, co-founder of recruiter Timberseed. “Software like this will become an important step in the recruitment process, to ensure the authenticity of candidate profiles and applications.”

Oglethorpe adds: “I can see ourselves having to invest in technology like this.”

Sophie Bryan, founder and chief workplace culture consultant at Ordinarily Different says: “One way HR processes will need to adapt to accommodate the uptick in AI is by upskilling HR staff to be able to understand and interpret the data that it generates to make informed decisions and truly get the most out of it.

“As HR leaders, we must ensure we continue to value and prioritise the human element in recruitment.”

With AI ripping up what was thought to be possible, it’s quite likely job applications can be written by AI and then sorted by AI at the other end – the bit that was supposed to make HRDs’ lives easier. For some this is a mind-boggling proposition.

Of course, AI certainly offers potential – everything from chatbots that answer common HR queries, to drafting generic HR policies, to revolutionising online training. Firms such as Synthesia now create realistic AI-generated trainers.

All it needs is a script and its avatars will read it. No more expensive studio and shooting time.

“The question we all have to ask ourselves in all of this, especially in recruitment, is whether it’s riskier for technology to do the job than a human,” says Oglethorpe. “Humans have biases too, but at least HR people are trained to recognise they may have biases.”

But there is perhaps one, final food for thought in all this. Maybe HR process does not need to be changed as drastically as some may think. For while Jackson says she has reservations about AI, she’s pragmatic too.

“We’re a software development company. We want people who can solve problems, and people using ChatGPT or other AI applications is a form of problem solving,” she says.

“So, if they’ve applied using ChatGPT, we see it as a sign of using tools to do things better.

“I’m dyslexic, and I use it for drafting emails. Is that so wrong? We want people for what they would do in the real world, and maybe this is the real world now.”

This article first appeared in the May/June 2023 print issue. Subscribe today to have all our latest articles delivered right to your desk.

