More and more companies are using AI to recruit and hire new employees, and AI can influence almost any stage of the hiring process. Covid-19 has spurred new demand for these technologies: Curious Thing and HireVue, two companies specializing in AI-driven interviews, both reported an increase in business during the pandemic.
Most job searches, however, start with a simple search. Job seekers turn to platforms like LinkedIn, Monster, or ZipRecruiter, where they can upload their résumés, browse job postings, and apply for openings.
The goal of these websites is to match qualified candidates with open positions. To organize all these openings and candidates, many platforms use AI-powered recommendation algorithms. These algorithms, sometimes referred to as matching engines, process information from both the job seeker and the employer to compile a list of recommendations for each.
“You usually hear the anecdote that a recruiter spends six seconds looking at your résumé, right?” says Derek Kan, vice president of product management at Monster. “When we look at the recommendation engine we’ve built, you can reduce that time down to milliseconds.”
Most matching engines are optimized to generate applications, says John Jersin, former vice president of product management at LinkedIn. These systems base their recommendations on three categories of data: information the user provides directly to the platform; data assigned to the user based on others with similar skill sets, experiences, and interests; and behavioral data, such as how often a user responds to messages or interacts with job postings.
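The exact scoring models these platforms use are proprietary, but the three data categories described above could, in principle, be blended into a single ranking score. The sketch below is a hypothetical illustration only: the signal names and weights are invented, not drawn from any platform's actual system.

```python
from dataclasses import dataclass

@dataclass
class CandidateSignals:
    profile_match: float    # info the user provided directly (skills vs. the posting)
    peer_similarity: float  # inferred from users with similar skills and experience
    engagement: float       # behavioral data, e.g. message responses, posting clicks

def relevance_score(s: CandidateSignals,
                    w_profile: float = 0.5,
                    w_peers: float = 0.3,
                    w_behavior: float = 0.2) -> float:
    """Blend the three signal categories into one ranking score (weights invented)."""
    return (w_profile * s.profile_match
            + w_peers * s.peer_similarity
            + w_behavior * s.engagement)

# Rank two hypothetical candidates for one posting, highest score first.
candidates = {
    "a": CandidateSignals(profile_match=0.9, peer_similarity=0.4, engagement=0.2),
    "b": CandidateSignals(profile_match=0.6, peer_similarity=0.8, engagement=0.9),
}
ranking = sorted(candidates, key=lambda c: relevance_score(candidates[c]), reverse=True)
print(ranking)  # → ['b', 'a']
```

Note that under this kind of blend, behavioral signals can outweigh equal qualifications, which is exactly the mechanism the next paragraphs describe.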
In the case of LinkedIn, these algorithms exclude a person’s name, age, gender, and race, because including those characteristics can contribute to bias in automated processes. But Jersin’s team found that even so, the service’s algorithms can still detect behavioral patterns exhibited by groups with a particular gender identity.
For example, while men are more likely to apply for jobs that require experience beyond their qualifications, women tend to apply only for jobs whose requirements they already meet. The algorithm picks up on this variation in behavior and adjusts its recommendations in a way that inadvertently puts women at a disadvantage.
“You might recommend, for example, more senior jobs to one group of people than another, even if they’re qualified at the same level,” says Jersin. “Those people may not get exposed to the same opportunities. And that’s really the impact we’re talking about here.”
Men also list more skills on their résumés, including ones at a lower level of proficiency, than women do, and they often engage more aggressively with recruiters on the platform.
To address such issues, Jersin and his team at LinkedIn built a new AI designed to produce more representative results, and deployed it in 2018. It is essentially a separate algorithm designed to counteract recommendations skewed toward a particular group. The new AI ensures that the recommendation system includes an even distribution of users by gender before handing off the matches curated by the original engine.
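LinkedIn has not published the details of this system, but one common post-processing approach that matches the description above is to re-rank the original engine's output so that groups are evenly interleaved, preserving the engine's within-group ordering. The following is a minimal sketch of that general technique, with made-up candidate IDs and group labels, not LinkedIn's actual implementation.

```python
from itertools import zip_longest

def rebalance(ranked, group_of):
    """Interleave candidates from each group evenly, preserving the
    original engine's ordering within each group."""
    # Split the engine's ranked list into per-group buckets.
    buckets = {}
    for cand in ranked:
        buckets.setdefault(group_of[cand], []).append(cand)
    # Take one candidate per group per round until all buckets are empty.
    result = []
    for round_picks in zip_longest(*buckets.values()):
        result.extend(c for c in round_picks if c is not None)
    return result

# A skewed ranking from a hypothetical engine: group A dominates the top.
ranked = ["a1", "a2", "a3", "b1", "a4", "b2"]
groups = {"a1": "A", "a2": "A", "a3": "A", "a4": "A", "b1": "B", "b2": "B"}
print(rebalance(ranked, groups))  # → ['a1', 'b1', 'a2', 'b2', 'a3', 'a4']
```

The design choice here is deliberate: the fairness layer never re-scores anyone, it only reorders the existing matches, which mirrors the article's description of a second algorithm sitting in front of the original engine.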
Kan says Monster, which lists 5 to 6 million jobs at any given time, also incorporates behavioral data into its recommendations, but doesn’t correct for bias the way LinkedIn does. Instead, its marketing team focuses on drawing users from diverse backgrounds to the service, and the company then relies on employers to report back on whether Monster forwarded a representative slate of candidates.
Irina Novoselsky, CEO of CareerBuilder, says she is focused on using the data the service collects to teach employers how to eliminate bias from their job postings. For example, “When a candidate reads a job description with the word ‘rockstar,’ a materially lower percentage of women apply,” she says.