Evolving tech and talent shortages complicate candidate screens, but recruiters can still widen the talent pool and bring a human perspective to assessments.
Screening is more than a phone call that advances a candidate to “the next round.” Weeding out poor-fit candidates, or, as is the tendency these days, screening in candidates with potential, is part of almost every stage of hiring. And the proliferation of automated hiring tech and low unemployment have further complicated the process.
“I’d say over the past 25 years or so, there’s been an increased use in third party software to assess applicants and review their materials before they get seen by a human hiring manager,” said Daniel Greene, assistant professor at the University of Maryland College of Information Studies. “We’ve also seen increasing automation of not just these hard facts — did someone actually work at this job at a specific time? — but of so-called soft skills, whether someone’s personality might make them a good fit for an organization’s culture, whether they’re friendly with customers, that sort of thing.”
These shifts have forced talent pros to evolve their approach, sometimes to confront barriers that screen out qualified candidates. It’s a tricky situation for both new recruiters and experienced talent managers, so we asked experts for their thoughts for this installment of the Talent Textbook.
When applications work against you
If an application is built thoughtfully, it can screen out candidates who are wrong for the job up front and engage as many qualified candidates as possible, said Micah Rowland, COO of Fountain.
“For instance, if for a given job we know that a person has to have a commercial driver’s license, in our software you can put that question first so that applicants who don’t have one don’t spend a lot of time,” said Rowland, whose software caters to retail, food service, hospitality and other higher-churn industries.
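A minimal sketch of that knock-out pattern might look like the following. The question schema here — the `knockout` and `required_answer` fields — is hypothetical, not Fountain's actual API; it just shows the idea of asking disqualifying questions first so applicants don't waste time.

```python
# Hypothetical knock-out screening flow; field names are illustrative,
# not drawn from any real hiring platform's API.

def order_questions(questions):
    """Put knock-out questions first so unqualified applicants exit early."""
    return sorted(questions, key=lambda q: not q.get("knockout", False))

def screen(applicant_answers, questions):
    """Return (advances, reason), stopping at the first failed knock-out."""
    for q in order_questions(questions):
        answer = applicant_answers.get(q["id"])
        if q.get("knockout") and answer != q["required_answer"]:
            return False, f"Missing requirement: {q['label']}"
    return True, "Advance to next stage"

questions = [
    {"id": "availability", "label": "Weekend availability", "knockout": False},
    {"id": "cdl", "label": "Commercial driver's license",
     "knockout": True, "required_answer": "yes"},
]

print(screen({"cdl": "no"}, questions))   # (False, 'Missing requirement: ...')
print(screen({"cdl": "yes"}, questions))  # (True, 'Advance to next stage')
```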
Using a third-party platform to automate follow-up messages can keep qualified applicants from falling through the administrative cracks and save time, too, Rowland said. In certain contexts, however, hiring platforms can inadvertently screen out certain groups, like workers with disabilities, according to Greene. When hiring protocols — like which ATS to use — are established at an organization’s HQ and then carried out uniformly across franchises or locations, local hirers are less able to adjust job descriptions.
“When I was a social worker working with folks with mental illness to get housing and jobs in the community, a problem that we had is we’d often talk to local managers and say, ‘The person I’m representing is not super mobile, but they’re super good at this and this. The job you have listed on your site is not a good fit for them, but they would be really good if we mixed and matched duties from these multiple job descriptions,’” Greene said. “That’s a little harder when you can’t talk to the local manager when you’re submitting your application.”
Though this situation differs from intentionally screening out candidates from protected classes, or using implicitly biased language in a job description, the result is the same: the rejection of diverse candidates.
Hundreds of resumes, three folders
Talent pros need to be aware of biases tech can introduce in screening, especially when hiring platforms enlist algorithms to sort applications and resumes. This tech typically slots candidates into a “green-light,” “red-light” or “yellow-light” folder for human hirers to review, Greene said.
“It’s not usually hiring being automated, it’s — to my mind — always still a person making that final decision, it’s really the rejection of people who don’t fit that’s being automated,” he said, referring to the red-light folder.
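As a rough illustration of the triage Greene describes, the sketch below buckets candidates into the three folders by a model score. The thresholds and the toy keyword score are placeholders, not any vendor's actual logic:

```python
# Illustrative three-folder triage; thresholds and the scoring function
# are placeholders, not any real vendor's logic.

GREEN, YELLOW, RED = "green-light", "yellow-light", "red-light"

def triage(candidates, score, advance=0.75, hold=0.40):
    """Sort candidates into folders by a model score in [0, 1].

    A human reviewer typically sees the green and yellow folders;
    the red folder is the automated rejection Greene describes.
    """
    folders = {GREEN: [], YELLOW: [], RED: []}
    for c in candidates:
        s = score(c)
        folders[GREEN if s >= advance else YELLOW if s >= hold else RED].append(c)
    return folders

# Toy score: fraction of required keywords found in the resume text.
def keyword_score(candidate, required=("python", "sql")):
    text = candidate["resume"].lower()
    return sum(k in text for k in required) / len(required)

folders = triage(
    [{"name": "A", "resume": "Python and SQL"}, {"name": "B", "resume": "Java"}],
    keyword_score,
)
print({k: [c["name"] for c in v] for k, v in folders.items()})
# {'green-light': ['A'], 'yellow-light': [], 'red-light': ['B']}
```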
Trusting an algorithm’s logic unquestioningly can lead to discrimination. An Amazon recruiting algorithm ran into this issue because it was trained on past hiring data that reflected a gender bias within the company, Reuters reported last year. As a result, the tool often screened out female candidates.
“You could guess that a machine wouldn’t have the same racism or sexism of an individual person, but if all the machine is doing is learning from the hiring decisions of an old manager, or thousands of old managers, then they’re just going to make those same decisions, but faster,” Greene said.
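To make that point concrete, here is a deliberately simplified sketch with invented data: a “model” that merely fits historical hire rates reproduces whatever pattern those decisions encoded, including penalizing a proxy feature like the resume phrase Reuters reported Amazon’s tool downgraded.

```python
# Deliberately simplified sketch with invented data: a "model" that only
# fits historical decisions inherits whatever bias those decisions encoded.
from collections import defaultdict

# Past decisions where a proxy feature (e.g., "women's chess club" on a
# resume, per the Reuters reporting) correlated with rejection.
history = [
    {"proxy": True,  "hired": False},
    {"proxy": True,  "hired": False},
    {"proxy": True,  "hired": True},
    {"proxy": False, "hired": True},
    {"proxy": False, "hired": True},
    {"proxy": False, "hired": False},
]

def fit_hire_rate(rows):
    """Learn P(hired | proxy) directly from historical decisions."""
    counts = defaultdict(lambda: [0, 0])  # proxy -> [hired, total]
    for r in rows:
        counts[r["proxy"]][0] += r["hired"]
        counts[r["proxy"]][1] += 1
    return {k: hired / total for k, (hired, total) in counts.items()}

model = fit_hire_rate(history)
print(model)  # {True: 0.33..., False: 0.66...} -- the old bias, now automated
```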
With Amazon’s case in mind, some vendors continue to iterate on algorithms that sort candidates — based on an employer’s hiring criteria — to save recruiters time and lessen the administrative burden. VCV, a recruiting software company, is doing just that with an algorithm that analyzes candidates’ facial and vocal data during automated screening calls. CEO Arik Akverdian said its AI can analyze candidates’ responses and facial expressions to predict traits like truthfulness or outgoingness.
“We as a technology, we don’t make the decisions, we just filter the candidate, and then the company makes the decision,” said Akverdian. “I don’t think there is anyone in the world right now who can say that AI is as smart as humans,” he continued. “We have a human brain for now to check it as a second step.”
For talent pros considering screening with AI, Greene recommends asking the developer: “What is your algorithm’s training corpus, and how are you cleaning it and teaching it to eliminate bias over time?”
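Beyond quizzing the vendor, a buyer can audit a tool’s own output. One common check — not something Greene specifically prescribed — is the adverse-impact ratio behind the “four-fifths” rule of thumb in the federal Uniform Guidelines on Employee Selection Procedures, sketched here with made-up numbers:

```python
# A common audit of a screening tool's output: compare selection rates
# across groups (the "four-fifths" rule of thumb). Numbers are made up.

def adverse_impact_ratio(group_a, group_b):
    """Ratio of the lower selection rate to the higher one.

    Each group is (selected, applied). Under the four-fifths rule of
    thumb, a ratio below 0.8 is a flag worth investigating, not proof
    of discrimination on its own.
    """
    rate_a = group_a[0] / group_a[1]
    rate_b = group_b[0] / group_b[1]
    lower, higher = sorted([rate_a, rate_b])
    return lower / higher

ratio = adverse_impact_ratio(group_a=(18, 100), group_b=(30, 100))
print(f"{ratio:.2f}")  # 0.60 -> below 0.8, flag the tool for review
```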
Checking human bias
When it comes time for humans to peek inside the folders, HireStrategy SVP of HR Jen Wright suggested adopting a screening-in approach to keep the candidate pool large and diverse.
“I don’t think we need to be spending a whole long amount of time analyzing that resume. You can kind of glance through and see if you’re looking at education if it’s a basic fit, but you need to keep an open mind and a broad perspective, especially if you want to cast a wider net right now when talent is hard to find,” she said.
Though tech can apply bias at a larger scale, human hirers have their flaws, too. Talent professionals are educating themselves on unconscious bias and moving away from judging candidates on politeness or other superficial factors.
“It could be a weak handshake or even a school you don’t like because your ex-boyfriend went there or something silly like that,” Wright said. “I think the first step is just being aware. In this market we cannot screen off of biases. We need to know our key performance indicators and come up with questions that will be relative to the job based on them.”
If those KPIs come merely from personality tests or background checks, that could be problematic. Data scientist Cathy O’Neil, in her book Weapons of Math Destruction, discussed how algorithm-driven personality tests have excluded candidates with mental illness. Similarly, Wright warned against disqualifying formerly incarcerated candidates over a criminal record that is not germane to the job.
Skills tests for specialized jobs, on the other hand, might cause candidates to screen themselves out if they come up too early or too often, Wright said.
“There are 15 others of the same type of position where you don’t have to do that,” she said of pre-hire assessments. “[Expecting candidates to do] 14 different assessments and have a clean background and a credit check? … Good luck.”
Workforce trends and tech du jour aside, talent pros who look ahead may be best equipped to balance efficiency and inclusivity when screening in the future.