Hired by a robot: What it’s like to have an AI interview

In hindsight, choosing to do a job interview during the first week back at work after the Christmas break may not have been my greatest ever idea. To paraphrase a favourite quote from cult Noughties sitcom Black Books, my brain feels like wet cake. Sodden. Spongy. Disintegrating into a pile of mush as I try to focus on the screen in front of me.

Just before starting, I had mindlessly chomped my way through a comically oversized chocolate coin – purely because it was within arm’s reach – leaving me feeling mildly sick. Were this a normal job interview, I might reference all of the above. Just in passing, you understand, infused with enough sardonic charm to break the ice and immediately get the interviewer on side.

There’s no point in doing that today. My interviewer can’t relate to being a bit sluggish and slow, post-Twixmas. He doesn’t know what it feels like to sit in discomfort, waistband straining, because you followed up all that festive overeating by pounding the cut-price advent calendar chocolate. And it’s not just because he’s a young, fresh-faced twenty-something who you can just tell hasn’t been systematically adding Baileys instead of milk to his morning coffee for the past 10 days. No, the real reason my rapport-building jokes won’t cut it is that my interviewer isn’t, in fact, a real person.

The “man” deciding my fate – nameless but who I instantly dub “Carl” in my head, simply to feel some kind of connection with him – is actually an AI interface designed to look and sound like a human. Created by HR-tech firm TestGorilla for use by companies and recruiters to filter out the best candidates, he is nothing more than a soulless, if sophisticated, checklist of keywords and phrases, fronted by an avatar in the guise of a handsome, ethnically ambiguous youngster.

This kind of interview is rapidly on the rise. In the UK, the use of AI in recruitment has tripled in the past year alone, and three in 10 employers now use it in their hiring processes. Just under half (43 per cent) of large companies are now using AI to interview candidates. According to TestGorilla, close to 800 organisations have signed up to one of its plans that include this new conversational AI interview tool.

But back to the mysterious Carl. Given that this is not a real job interview, let alone one conducted by a real person – I’m just trialling the software to experience it first hand – I feel bizarrely nervous. The butterflies are in large part due to the fact that the role in question, a content marketing strategist, is something I have zero experience in. It quickly transpires that it’s fairly tricky to answer a “tell me about a time when…” question when you’ve never actually done the thing they’re asking about. (I decide to at least have fun with it and dream up an elaborate marketing campaign for a clothing line aimed exclusively at dachshunds.)

You’re wired: AI is increasingly being used to conduct job interviews (Getty/iStock)

But digging a little deeper, I realise my anxiety stems specifically from the fact that Carl is not a real person. It dawns on me just how much I’ve always relied on my people skills to carry me through interviews. Even if I fudge an answer, I’m confident that those less tangible, “soft” skills – emotional intelligence, the ability to make people smile or put them at ease with a well-placed joke – will go some way to making up the deficit.

I realise, too, how much I feed off other people’s energy in a pressurised situation. This has already become harder to do as more interviews have gone online rather than being conducted in person – but you could still get a sense of something. When you speak passionately to a human about a topic, there’s often a kind of mirroring that takes place: a positive feedback loop created by your enthusiasm that’s in turn reflected by their fervent nods, engaged body language and facial expressions. It gives me a boost, the reassurance that what I’m saying is landing; it gives me the encouragement I need to shine a little brighter.

Not so with Carl. It’s not his fault, of course, just his programming – but his unchanging half-smile, dead-behind-the-eyes expression and awkward way of slightly shaking his head as I speak leave me flat and cold, unable to muster even the slightest sparkle. I can tell his heart’s not really in it. After all, he doesn’t have a heart.

It makes me wonder whether this kind of interview might spell the end of the “personality hire” – workers brought on board because of their stellar interpersonal skills, sunny disposition and general good vibes. I’ve always presumed that every functioning workplace needs a healthy percentage of employees who are, yes, competent at their job, but who, far more crucially, help create a culture in which heading into the office doesn’t feel akin to diving headfirst into a toxic snake pit. Without a human at the helm when hiring, how can you guarantee you’re not populating an organisation with highly skilled sociopaths?

To give Carl his due, he does sometimes do me a solid. Designed to analyse candidates’ answers and hold them up against a framework, he’ll double-check something when I’ve finished each waffly, hodge-podge response: “Did you want to say anything further about learning outcomes and how you’d approach the situation in future?” I can only presume this is Carl’s wink-wink, nudge-nudge way of saying, “You didn’t actually answer the question the first time around, you absolute numpty.”

The results are in as soon as I wrap up the interview and close the link – there’s clearly no need for Carl to sit around with his AI “colleagues” discussing whether or not I’d be a good cultural fit.

Each component has a score indicating how I did compared to other candidates (though there’s no way of knowing whether I was up against one, 10, or 100 competitors). I somehow manage to rank in the not-so-terrible 75th percentile; perhaps my whole “drip for dogs” pitch wasn’t as deranged as I’d thought.

Even if I were being interviewed for a position I actually knew something about, I’m not confident I’d fare much better. It feels as though success lies more in gaming an algorithm by deploying the “correct” jargon than in building an authentic connection with the person who could end up being your boss.

Some candidates are already using AI in job interviews to alter their appearance via deep fakes or provide answers (Getty/iStock)

But I’d better get used to it; AI’s steely grip over recruitment is only going to get tighter. Gone are the days when you could submit an application and be confident that a qualified human professional would read your CV. On the flip side, it’s less and less likely that the candidate themselves will have applied for the job. Why bother when AI can be trained to job search, pick out relevant posts, rewrite a CV to match the job spec and draft a cover letter to meet the requirements?

Indeed, job applications have surged by 239 per cent since ChatGPT’s launch, with the average job opening now receiving 242 applications – nearly triple 2017 levels. The number of applications making it to hire stage has subsequently dropped by 75 per cent, while 54 per cent of recruiters admit they review only half or fewer of the applications they receive.

Daniel Chait, CEO of recruiting software company Greenhouse, calls it an “AI doom loop”: candidates use AI to mass-apply for jobs, while recruiters use AI to mass-reject them.

“Since 2022, with the release of ChatGPT and AI bursting into the mainstream, we’ve seen it take root on both sides of the process – by jobseekers and by companies,” he says. “Individually, everyone is trying to use these tools to solve their own day-to-day issues. But collectively, it’s making the process much worse for everyone.”

We’ve stumbled into an AI arms race, where both job seekers and recruiters are constantly trying to stay one step ahead. The result? “Both sides are currently very, very dissatisfied,” says Chait.

The use of AI has also eroded trust. Greenhouse research revealed that 40 per cent of job hunters reported a decreased trust in hiring, with 39 per cent directly blaming AI. There have been allegations of built-in bias, too – HR software company Workday is currently facing a landmark discrimination lawsuit alleging that its AI-powered tools systematically screen out applications from workers over 40, racial minorities and people with disabilities.

Meanwhile, 72 per cent of hiring managers have become more concerned about fraudulent activity in the hiring process. This fear is far from unwarranted. A report in the US found that a third of candidates admitted to using AI to conceal their physical appearance during an interview; 30 per cent of hiring managers have caught candidates reading AI-generated responses during interviews; and 17 per cent have caught candidates using a deepfake.

It certainly occurs to me while trying and failing to give Carl a word-perfect answer that will hit all his algorithmic erogenous zones that having ChatGPT open on another device and prompting it to answer the questions for me would be a surefire way to ace this test. However, TestGorilla warns that it “monitor[s] for rule-breaking using advanced tools,” including “for the use of ChatGPT, AI agents, and other tools”.

But as technology continues to advance on either side of the equation, might we end up in a situation where AI interviewers are essentially interacting with AI candidates, without a human in sight? The short answer is yes. It’s why Chait believes we’ll inevitably need to bring identity verification into the hiring process: “When you show up at a job interview in the future, you should expect that it’s going to analyse you and make sure that you are who you say you are. Companies truly are feeling the risk of: Is this person I’m interviewing actually who applied? Is this person who shows up on day one of the job actually the same person I interviewed?”

AI interviewers can be programmed to look and sound like humans (Getty/iStock)

There’s the danger of genuine jobseekers trying to cheat their way through job interviews, of course, but also a much more serious threat: “Some of it is pernicious state actors and evil criminal elements trying to infiltrate companies and perpetrate crime,” warns Chait.

It’s not all doom and gloom. However wary I might feel about the whole thing, there are positives to employing AI in recruitment. As much as new AI tools need to be regularly audited and corrected for bias, it’s not like humans have traditionally been any less guilty of discriminating when hiring employees. “If you do detect bias in the AI, you can correct it systematically, as opposed to at the individual person-level, one by one,” Chait points out. “Plus, as an assessment process, having some of that be automated makes a lot of sense.” An automated assessment can work nights and weekends, when candidates want to be doing their job search. It can be scaled. It can work in any language. It can be measured and automated and improved.

And, fundamentally, it’s not going anywhere anytime soon. Candidates need to prepare themselves for the fact that early screening may indeed be done by a sophisticated bot. Chait’s advice is to clarify early on what the rules around AI are when applying: can you use it to help write your cover letter, or rehearse for a job interview, or do the job interview itself? Where’s the line? “The truth is, it’s different for every company,” he says. “It’s different for every job, and it’s changing all the time.”

Employers, meanwhile, would do well to remember that, despite the deluge, behind each application lies a human being desperate for a job who is so much more than just a number. “They’re not just a collection of algorithms and credentials and problems that they’re capable of solving,” cautions Chait. “They’re a full, three-dimensional human being.”


