For the past seven years, George has used a handful of recruiting websites regularly. After the pandemic spurred more flexibility on where people could work, George’s inbox flooded with job recommendations, many of which were irrelevant, he said. When he did find an interesting opening, he often didn’t hear back.
Job seekers like George say they’re increasingly struggling to find positions and cut through the hundreds of applications targeting a single job on popular sites like Indeed, ZipRecruiter and LinkedIn. Many are convinced that job-matching features powered by artificial intelligence on these sites don’t always work in their favor. In some instances, AI recommendation systems match or rank candidates using not just their qualifications but also historical hiring information: whom employers have typically messaged, liked and interviewed, their past searches, and the profiles they’ve clicked on.
Experts who study AI say the way candidates are matched may get more complicated as more-advanced AI is added. That could mean some applicants won’t get recommended as the best match for a job solely based on qualifications.
“It’s an area we’re very concerned with. If you’re denied access to jobs … it makes a big impact on” people’s livelihoods, said Adriano Koshiyama, co-chief executive of Holistic AI, a software company focused on AI governance, risk and compliance, referring to job sites in general.
Kristin Randle, an IT project specialist in Sarasota, Fla., said she often felt she was getting buried in job sites’ systems and was discouraged by the huge number of candidates she’d see applying to one job.
“It became so frustrating, I finally gave up,” she said, adding that she ultimately took a full-time job with an employer she’d previously contracted with.
LinkedIn, Indeed and ZipRecruiter had the most traffic among job sites in the United States in the first six months of this year, collectively attracting nearly 790 million visits, according to web analytics firm Similarweb. Their goal is to quickly match candidates with relevant job postings they’re qualified for or may not have previously considered. Similarly, they hope to reduce the time employers spend searching for and hiring workers.
The job sites say their AI matching systems aren’t always perfect. Although candidates won’t be disqualified or rejected based on an employer’s previous activity on the site, that activity can still factor into the AI’s recommendations.
“Someone might feel qualified, but based on past hiring trends, that person could not bubble up to be first,” said Scott Dobroski, Indeed’s career trends expert. “But that could also happen with the old model [of hiring] with just a human.”
ZipRecruiter’s AI bases its decisions on the billions of data points it has gathered from employers and job candidates over its 13 years in operation. It mimics what employers of specific sizes and types have typically done on its site.
“The machine learning is based on human behavior,” said Jen Ringel, senior vice president of product at ZipRecruiter. “So it’s on us to do as much as we can to educate employers. But it is complicated.”
The sites’ algorithms factor in a candidate’s skills and experiences and, in some cases, jobs they’ve searched for, clicked on or applied for. Some sites’ algorithms identify best matches for employers based on previous searches, the type of candidates they usually seek more information from, and who previously did well. It’s unclear how much weight algorithms give to skills and qualifications compared with behavior on the site.
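None of the sites disclose how those signals are combined, but the general shape of such a ranker can be sketched in a few lines of code. The example below is purely illustrative: the candidate fields, weights and scoring formula are invented for the sketch and aren’t drawn from any site’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    skills: set
    clicked_similar_jobs: int = 0      # how often the candidate clicked on postings like this one
    employer_engagement: float = 0.0   # past messages/likes from similar employers, normalized to 0-1

def match_score(candidate: Candidate, required_skills: set,
                w_skills: float = 0.6, w_clicks: float = 0.2, w_engagement: float = 0.2) -> float:
    """Toy ranking score: a weighted blend of qualifications and behavioral signals.

    The weights here are arbitrary. Real systems learn them from data, and how much
    weight qualifications get versus on-site behavior is exactly what remains unclear.
    """
    skill_overlap = len(candidate.skills & required_skills) / max(len(required_skills), 1)
    click_signal = min(candidate.clicked_similar_jobs / 10, 1.0)  # cap the click signal at 1.0
    return (w_skills * skill_overlap
            + w_clicks * click_signal
            + w_engagement * candidate.employer_engagement)

required = {"python", "sql", "project management"}
a = Candidate(skills=required, clicked_similar_jobs=8, employer_engagement=0.9)
b = Candidate(skills=required, clicked_similar_jobs=0, employer_engagement=0.1)
print(match_score(a, required), match_score(b, required))  # a outranks b despite identical skills
```

The takeaway of the sketch is only that two candidates with identical qualifications can end up ranked differently once behavioral signals enter the score.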
“We’re trying to do as much as possible to eliminate as many of the steps of a pretty broken hiring process and get job seekers and employers talking quickly,” Ringel said.
But the potential for bias based on employers’ previous recruiting behavior is very real and sometimes hard to identify because it could be entangled with statistical correlations, said Manish Raghavan, assistant professor of information technology and computer science at the MIT Sloan School of Management. For example, AI could appear to be biased in matching mostly Harvard graduates to some jobs when those graduates may just have a higher likelihood to match certain requirements. Humans already struggle with implicit biases, often favoring people like themselves, and that could get replicated through AI.
“It’s very difficult to prevent that from happening,” Raghavan said, adding that most services are aware of the potential problems. “But there isn’t a universal solution.”
AI also has the potential to create rules based on historical patterns without anyone knowing what those rules are, making it difficult to fix, he added.
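Raghavan’s point about entangled correlations can be made concrete with a toy calculation. The sketch below is hypothetical: a “model” that simply memorizes hire rates from historical records will favor one school over another even for equally qualified candidates, and the numbers alone can’t say whether that gap reflects genuine requirements or past human bias.

```python
from collections import defaultdict

# Hypothetical historical hiring records: (school, was_hired)
history = [
    ("Harvard", True), ("Harvard", True), ("Harvard", True), ("Harvard", False),
    ("State U", True), ("State U", False), ("State U", False), ("State U", False),
]

# A naive "model" that just memorizes the hire rate per school.
counts = defaultdict(lambda: [0, 0])  # school -> [hires, total]
for school, hired in history:
    counts[school][0] += int(hired)
    counts[school][1] += 1

def predicted_fit(school: str) -> float:
    hires, total = counts[school]
    return hires / total if total else 0.0

# Two candidates with identical skills but different schools:
print(predicted_fit("Harvard"))  # 0.75
print(predicted_fit("State U"))  # 0.25
# Whether that gap reflects real job requirements or past human bias is the
# part that, as Raghavan notes, is hard to untangle after the fact.
```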
Sameer Maskey, adjunct associate professor of machine learning and AI at Columbia University, points to Amazon as an example of how AI could perpetuate bias in hiring. In 2018, the tech giant junked its experimental AI recruiting tool because it favored men for technical jobs, as most previous technical hires were men, according to Reuters. It takes a concerted effort to ensure that training data isn’t already biased, Maskey said. At the time, Amazon said its recruiters never used the tool to evaluate actual candidates.
(Amazon founder Jeff Bezos owns The Washington Post, and interim CEO Patty Stonesifer sits on Amazon’s board.)
“If you’re just blindly using historical data and not carefully looking into if bias has crept into your hiring system, then it just becomes a biased AI system,” Maskey said of hiring software. “Candidates are right to be worried.”
The platforms say they take steps to fight algorithmic bias. ZipRecruiter regularly retrains its systems with new data, giving more weight to current data, and scrubs for personally identifiable information like full names and gender, Ringel said. About 300 people work on AI and machine learning for Indeed’s matching, and its director of AI ethics focuses on this issue full time, it said.
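Those two mitigations, recency-weighted retraining and scrubbing identifiable fields, correspond to standard machine-learning practices. The sketch below is a generic illustration under that assumption, not ZipRecruiter’s or Indeed’s actual pipeline; the field names and half-life value are invented.

```python
from datetime import date

def recency_weight(example_date: date, today: date, half_life_days: float = 180.0) -> float:
    """Exponential-decay weight: an example half_life_days old counts half as much as a new one."""
    age_days = (today - example_date).days
    return 0.5 ** (age_days / half_life_days)

PII_FIELDS = {"full_name", "gender", "email", "phone"}  # hypothetical field names

def scrub(record: dict) -> dict:
    """Drop personally identifiable fields before the record reaches the training set."""
    return {k: v for k, v in record.items() if k not in PII_FIELDS}

record = {"full_name": "Pat Example", "gender": "F",
          "skills": ["IT", "project management"], "applied_on": date(2023, 1, 15)}
weight = recency_weight(record["applied_on"], date(2023, 8, 1))
training_example = (scrub(record), weight)
print(training_example)  # PII removed; older examples carry less weight in retraining
```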
Hired.com, a site heavily focused on sales and tech jobs, uses AI matching to help employers find candidates but not to match openings to job seekers. Hired said it demotes qualified candidates only if an employer’s activity suggests the employer isn’t interested in them. Its system allows employers to hide candidates’ names and profile pictures, and it is audited annually by Holistic AI, which conducts a technical assessment of whether a company has mechanisms to prevent, detect and correct algorithmic bias and how effective they are.
“Every employer is benefiting from the overall training of our models,” said Dave Walters, chief technology officer at Hired. “Even if there were a small subset of our employer population that had some underlying bias in their searches, the overall model is getting trained across everybody’s data, which is preventing any significant shift.”
LinkedIn says it builds tools to promote equitable outcomes. In 2018, the company tweaked its recruiter search tool to better reflect the gender mix of candidates on each results page. Then last year, it debuted a feature that nudges recruiters to broaden their filters if either men or women make up less than 45 percent of the results.
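LinkedIn hasn’t published the mechanics behind that nudge, but the 45 percent figure suggests a straightforward check on each page of results. The function below is a hypothetical reconstruction of that kind of logic, not LinkedIn’s code; the gender labels and the way the threshold is applied are assumptions made for illustration.

```python
def needs_broader_filters(results: list[str], threshold: float = 0.45) -> bool:
    """Flag a results page whose gender mix is lopsided.

    `results` is a list of gender labels for the candidates shown (a simplification;
    real systems would infer or collect this differently). Returns True when either
    group falls below the threshold share, which would trigger a nudge asking the
    recruiter to broaden their filters.
    """
    if not results:
        return False
    share_women = results.count("woman") / len(results)
    share_men = results.count("man") / len(results)
    return share_women < threshold or share_men < threshold

print(needs_broader_filters(["man"] * 8 + ["woman"] * 2))  # True: women are 20 percent of the page
print(needs_broader_filters(["man"] * 5 + ["woman"] * 5))  # False: the page is balanced
```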
Job sites may soon add more-advanced technologies including generative AI, which could make the potential for bias more complicated to find and fix.
Indeed says that job matching is the future of hiring and that generative AI could help. The company partnered with Google to explore using the generative AI tool Bard for better matching and to help candidates understand what to prioritize on their profiles, Dobroski said. Ringel, of ZipRecruiter, sees a future where generative AI could be used to explain why the service recommends certain jobs, for interview prep or even to help label training data for its AI models. Both see it as a way to make writing résumés and job descriptions easier.
But advanced AI could also make it harder to understand why systems make the decisions they do.
“Many HR tech clients … use decently explainable algorithms,” said Koshiyama, of Holistic AI. “Three years from now, I’d be concerned about transparency.”
Job seekers should prepare to see AI in more areas of the hiring process, said Columbia’s Maskey, who is also CEO of AI company Fusemachines. And good training data and checks and balances for bias will become even more important.
“It will get to a point where if there were a thousand applicants, [the AI] might interview all 1,000 over Zoom,” he said. “Be ready for it.”
As for George, he says he’s had much more luck contacting people directly rather than relying on these job sites.
“There’s good and bad in them,” he said. “I won’t quit using them, but I won’t put all my chips on them either.”