I often get caught up in high-level conversations about heady topics such as recruitment marketing strategy and the consumerization of enterprise technology — all from the relative safety of my analyst bubble. From time to time, however, I come across something that brings me back down to earth, and reminds me that the smallest things we do in talent acquisition can have serious repercussions.
In this instance, I’m talking about candidate experience.
Last week, someone forwarded me an article on degree inflation. The article examines a list, compiled by the folks at Burning Glass, of the occupations that have shown the most “up-credentialing” over the last five years, and tries to determine what’s driving the trend.
Most interesting to me were three sentences my friend had highlighted:
“… it seems as if more employers are using bachelor’s degrees as a signal of drive or talent, regardless of the relevance of the skills actually learned in college.”
“There’s also still an oversupply of workers, so employers know they can afford to be picky.”
“… it’s not clear why a college-level education would suddenly become more important — except maybe as a sorting device for narrowing down the deluge of résumés to the most qualified (or overqualified) applicants.”
Within the analyst bubble, we talk about how important candidate-friendly workflows, easy one-click applies, and mobile access to career portals are for creating a positive candidate experience. And while a multitude of factors drive candidate experience, articles like this one (and the attitudes it reveals) tell me we’re missing the mark on some very basic levels.
Allow me to explain.
Too Many Hoops, Not Enough Hires
Recent studies show the time required to fill open jobs is on the rise—from 15 days in 2009 to 23 days today. Some attribute this trend to a widening skills gap, and others say economic uncertainty has instilled a fear of wasting company resources on a bad hire. While those are certainly contributing factors, there are more basic issues.
The notion of an endless supply of talent is feeding a lot of bad behavior in talent acquisition. Don’t get me wrong — there are plenty of people out of work in the U.S. But the gauntlets we’ve devised, the hoops we’re asking candidates to jump through, are getting a little out of hand, and we don’t have much to show for it other than disqualified (and disgruntled) candidates.
Though it’s important to assess competency, personality, and cultural fit, stretching an interview process out past three weeks is asking a lot of candidates. Remember that websites like Glassdoor gather data on your interview process; candidates don’t hesitate to share the good, the bad, and the ugly.
A College Degree Isn’t a Screening Tool
No one likes sifting through never-ending piles of resumes and applications to find the proverbial needle in the haystack. I get it—I’ve been there. That’s why many recruiters work with hiring managers to identify the minimum qualifications required for any applicant to be considered for the role. In theory, this reduces the number of potential candidates, thereby reducing the time spent screening and interviewing.
In practice, however, minimum qualifications have gotten a bit out of control. Too many employers are using degree requirements as a catchall for even the most menial jobs, and sending a clear message to those for whom college wasn’t an option: “We don’t want you.”
And for candidates with $100,000 in student loan debt who are desperate for a steady paycheck, a college degree is quickly reduced to the most expensive (and useless) item on their list of qualifications.
For high-skill or high-impact jobs, assessing a candidate’s education along with his or her work experience makes sense. For the majority of entry- to mid-level jobs, however, employers have far more important things to evaluate in potential candidates than a degree.