ChatGPT and AI generally are changing the HR and recruitment landscape. But is automation really the solution to everything in such a human-focused industry? With industry expert Matt Burney on board, we explore some of the challenges recruiters are facing in adopting new AI technology, as well as what the future holds.
What ChatGPT is (and what it’s not)
ChatGPT is a language model that builds human-sounding sentences by predicting which words are most likely to follow one another, based on the text it was trained on. Recruiters can now generate ‘human’-sounding copy with it in seconds. But ChatGPT is a language model, not a knowledge model, so anyone who uses it still needs to check its output for accuracy. That’s especially important if you’re using it to write content about your employer brand.
ChatGPT has strengths and weaknesses for recruiters and candidates alike. Natural language processing tools like ChatGPT are still relatively new in recruitment, but harnessed in the right way, they can help rather than hinder your processes.
The recruitment landscape and GPT language models
There are plenty of ways in which a language model such as ChatGPT can be used to streamline recruitment processes. Take, for example, using a prompt to write a job description, advertisement, or interview questions.
Recruiters can input a prompt like ‘create 10 questions for a competency-based interview,’ and the tool will draft a list of questions in seconds. They can also use it to generate popular keywords for a job description or advertisement, potentially making them more targeted at relevant candidates.
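For teams that want to build this into their own tooling rather than copy and paste into the chat interface, the same prompt can be sent programmatically. Below is a minimal sketch using OpenAI’s Python SDK; the model name and prompt wording are illustrative assumptions rather than recommendations, and the output still needs human review before it reaches a candidate.

```python
# Minimal sketch: generating competency-based interview questions through the
# OpenAI Python SDK. The model name and prompt below are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: use whichever model your team has approved
    messages=[
        {"role": "system", "content": "You are an experienced recruiter."},
        {
            "role": "user",
            "content": "Create 10 questions for a competency-based interview "
                       "for a customer support team lead role.",
        },
    ],
)

# Print the drafted questions for a human to review and edit.
print(response.choices[0].message.content)
```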
With businesses competing more intensely in an increasingly mobile marketplace, recruiters are having to turn to new tools to fine-tune their processes. We've found that building a network of talent is one great way to hire within a smaller-than-expected talent pool, and AI looks set to help recruiters become more competitive in that space.
ChatGPT, among other tools, is ideal for recruiters who want to automate processes, saving them around 35% of the time they spend on recruitment each week, says Matt Burney, Senior Strategic Advisor at Indeed and AI expert. He’s already finding that automation in general is saving organizations from being overwhelmed by growing volumes of HR data.
So outside of employer brand and job advertisement writing, where else can a natural language processing model become useful to recruiters? Burney says that ChatGPT:
“... can be used to handle a lot of unstructured data… such as interviews. […] If you go through an interview, everyone does it differently. There are very few uniform ways to interview people.
“But if you use a language model to look at those interviews, to take notes and structure those notes and understand what worked and what didn’t work, you can really start to understand quickly who is good at interviewing, who is bad at interviewing, what different people look like, what different behaviours look like.”
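To make that workflow concrete, here is a rough sketch of how anonymised interview notes could be turned into a structured summary with the same kind of API call. The notes, section headings, and model name are all invented for illustration, and, as the next section explains, no personal or confidential data should be pasted into a third-party model.

```python
# Rough sketch: asking a language model to structure anonymised interview notes.
# The notes are invented and the requested headings are illustrative assumptions.
from openai import OpenAI

client = OpenAI()

raw_notes = """
Candidate walked through a CRM migration project. Strong on stakeholder
communication; vague on testing strategy. Interviewer asked no follow-up
questions on the gaps and spent half the slot describing the company.
"""

prompt = (
    "Restructure these interview notes under the headings: competencies probed, "
    "evidence given, gaps, and interviewer behaviours worth coaching.\n\n"
    + raw_notes
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```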
However, handing unstructured data to a third-party application comes with potential consequences. The main issue is privacy: once you paste a company spreadsheet into a third-party language model, you don’t know where that information goes, or how it might be used in the future.
Potential compliance issues and ChatGPT
You could open yourself up to data leaks and data privacy compliance issues. Burney advises:
"whenever you’re using a GPT model, you shouldn’t be putting any personal information into it. You shouldn’t be putting any company information (sic) into there. I’ve seen people say: “Oh, I've put a massive spreadsheet into one of these tools”. Now who owns that data?
"That’s data on your business, your need, your demand plan, your finances, all sorts of stuff. The risk is pretty obvious. And I think GDPR is probably the most obvious thing that you’re going to run into a compliance issue with."
One of the clearest priorities, therefore, for companies whose staff use language models is making sure those staff are well trained in compliance, data protection, and PIPEDA. Canada’s lawmakers have been keeping a watchful eye on AI developments. For instance, the Privacy Commissioner of Canada is investigating ChatGPT following a complaint alleging that it unlawfully collected, used, and disclosed personal information without consent.
One possible solution is for companies to develop their own AI tools to manage unstructured data. This creates a ‘walled garden’: all the data stays within the company's own databases, with no third-party access, making it far less susceptible to leaks.
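As a rough illustration of that walled-garden idea, an open-weight model can be run on the company's own hardware so that notes never leave internal infrastructure. The sketch below uses the Hugging Face transformers library; the specific model is an assumption, and any locally hosted instruction-tuned model could stand in.

```python
# Rough sketch of the 'walled garden' approach: run an open-weight model locally
# so unstructured notes never leave company infrastructure. The model name is
# illustrative: substitute whatever your team actually hosts.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",
)

notes = "Anonymised interview notes go here; the text stays on internal hardware."

result = generator(
    "Summarise the following interview notes as structured bullet points:\n" + notes,
    max_new_tokens=300,
)

print(result[0]["generated_text"])
```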
Other tips include never using information from personnel reviews, client or customer data, or any other confidential company information as a ChatGPT prompt, and always keeping a human in the loop.
Candidates are using ChatGPT to create job applications: What are the implications?
According to HuffPost UK, it’s already becoming difficult to tell the difference between an application written by a person and one written by ChatGPT. But when candidates reach the interview stage, an AI-written application may not truly represent them or their communication skills.
While he’s sympathetic to candidates who have struggled to secure interviews and to write their applications, Burney suggests it may be more effective to see AI language models as a ‘copilot’ rather than a tool for crafting an entire application.
He advises: "... it’s not replacing anything that you’re doing. It’s merely a tool to help you and spot problems that you might not have seen… It can help you [with rewording], [adding in keywords], etc."
From a recruiter’s perspective, it may also be wise to consider whether to filter or screen out candidates who have used a tool like ChatGPT to write their application. Alternatively, recruiters might strike a compromise: accepting applications that have been edited, but not written entirely, with the help of ChatGPT.
Employer brand and ChatGPT
Another big ethical consideration for companies is that ChatGPT can paint a picture of your employer brand that isn’t actually representative of your company. That becomes a problem during the recruitment process if interested candidates aren’t getting an accurate view of your brand.
When using AI copy, Burney suggests considering whether the copy says anything about who you are as a business, or whether you’re ‘just putting together buzzwords that will resonate with people to try and get them over the line.'
The future of AI in recruiting
So what do recruiters need to take on board if they choose to automate their recruitment processes? They’ll have to consider the consequences. Burney describes it as a ‘garbage in, garbage out’ problem, and emphasizes that:
"If you think about it really. We can feed it the wrong thing and it’ll do a bad job, if we feed it the right thing, it will do a good job. But we also need to consider: when we’re doing it, why we’re doing it, and how we’re doing it.
Because there is a very human element in recruitment. We need to avoid dialling out the human element, and make sure that when we create new tools or adopt new tools, that we’re doing that to go and automate the manual processes, to free up time and to allow people to go off and do the human piece of work."
ChatGPT: It’s what recruiters make of it
While ChatGPT offers solutions to many of the problems recruiters face, the quality of the information we feed it matters. Training staff to use it correctly, and in a way that complies with data regulations, is key to getting the most out of the tools available.