May 14, 2024

5 Common Mistakes Using AI in Talent Acquisition (Plus Actionable Solutions)

Rebecca Noori

Artificial intelligence (AI) is recruitment's new kid on the block, introduced to speed up everything from writing job descriptions and ads to sourcing, screening, and shortlisting potential candidates.

We’ve already explored how recruitment teams can use AI for better results, but now we’ll address some of the technology's pitfalls. This article explores five common mistakes associated with using or overusing artificial intelligence in the hiring process. Each comes with its own actionable solution.

Mistake 1: Not vetting the AI platform 

AI-based recruitment platforms haven’t always had a great reputation. In 2014, Amazon began developing its own AI hiring tool to reduce the time it took to identify and hire the most qualified candidates. However, the company soon discovered that its platform was biased against women.

The problem? Amazon's platform had been trained to screen applicants based on resume patterns from a ten-year period. Most of those resumes belonged to male applicants, so Amazon's algorithm inferred that hirers preferred men over women. The technology penalized resumes that included the word "women's" or that mentioned all-women's colleges.

While Amazon retired its recruitment platform in 2018, its experience taught the recruitment industry the importance of thoroughly vetting an AI platform before using it. Bias can creep in for two main reasons:

  • Training data bias: As in Amazon’s case, if the algorithm has been trained predominantly on data from white male candidates, we can expect it to discriminate against women and people of color.

  • Programming errors: Developers can inadvertently introduce their own bias into the platform. A logic error can result in incorrect behavior or output, such as the algorithm prioritizing potential employees unfairly or overlooking qualified applicants. 

Solution

Before you slide a recruitment platform into your workflow, ensure you understand its background and how the technology has been trained. 

Avoid algorithm bias by using a trusted AI platform like Juicebox, trained on a comprehensive dataset of 30 separate sources. Personal data and personally identifiable information aren't relevant to the algorithm, which prioritizes skills, work experience, and education to return the most accurate candidate matches. 

Mistake 2: Failing compliance 

As AI-based hiring platforms become increasingly popular, legal and ethical concerns have been raised. One question is whether AI usage in the recruitment process complies with local, state, and federal regulations, including anti-discrimination laws.

One notable AI recruitment lawsuit saw the Equal Employment Opportunity Commission (EEOC) go head to head with iTutorGroup, a company accused of violating the Age Discrimination in Employment Act. Its AI hiring platform automatically rejected female applicants aged 55+ and male applicants aged 60+. To settle the case, iTutorGroup: 

  • Paid $365,000 to the rejected candidates

  • Adopted anti-discrimination policies 

  • Conducted training to align with equal employment opportunity laws

Solution 

This is one costly example of what can happen when recruiters don't meet regulatory requirements. Recruiters and employers must understand the specific laws that apply to them, including: 

  • Federal laws such as Title VII of the Civil Rights Act of 1964, which prohibits employment discrimination based on race, color, religion, sex (including pregnancy and sexual orientation), and national origin. 

  • State and local laws, such as New York City's Automated Employment Decision Tools (AEDT) law, which prohibits employers from using such a tool for hiring or promotion decisions unless it has been independently audited for bias. 

Mistake 3: Not protecting candidate data 

Any tool that collects, stores, or processes candidate data also falls under the compliance umbrella. Depending on your location and your job applicants' location, you may be subject to stringent data protection regulations such as the EU's General Data Protection Regulation (GDPR) or the California Consumer Privacy Act (CCPA).

Solution 

Hiring teams can do three things to stay on the right side of data protection laws: 

  • Data minimization: Not all data is relevant for hiring decisions, so don't ask for it. For example, you don't need a candidate's bank account information at this stage in the recruitment funnel. 

  • Data consent: Obtain explicit consent from candidates to collect their data and explain precisely how you plan to use it. Give them an opt-out if they're not comfortable with your AI tool. 

  • Data encryption: Keep data safe against theft or unauthorized access using the latest encryption and security technologies.

Mistake 4: Eliminating human connection 

Artificial intelligence and machine learning promise to speed up your hiring process by eliminating repetitive manual tasks that consume too much time. Some typical solutions for laborious work include:

  • Chatbots: These generate responses to candidate questions 24/7, so applicants don’t need to wait for a human recruiter to come online.  

  • Interview scheduling: Candidates choose and book a relevant time slot based on scheduling information provided by the hiring panel. This approach eliminates endless email chains and phone calls to find a time that works for everyone. 

  • Video interviews: Often used in early interview rounds, these tools ask candidates to record themselves answering questions. The AI then analyzes and scores the recording based on facial expressions and voice responses. 

While technology enhances many aspects of the hiring process, it's critical to maintain the human interaction between candidates and hiring teams. A survey by Resume Builder reported that 4 in 10 companies would let AI talk to candidates, and 1 in 7 would let the technology make the final decision about candidates.

Workplace’s Lesley Couper asserts that this is a problem: 

“Recruiting is fundamentally about people connecting with people. Technology can enhance our capabilities but cannot replace the instinctive ability to recognize potential, inspire candidates, or build meaningful relationships. It’s important to find the right blend of humanity and innovation in our recruitment strategies for a future that values both efficiency and empathy.” 

Solution 

Strike the right balance between technology and the human touch. Aiming for AI to replace every part of your recruiting process will likely damage the candidate experience and repel high-caliber talent from joining your ranks. Learn when to deploy technology to gain efficiencies, such as during candidate sourcing or when providing assessment feedback, but weave plenty of opportunities for interaction with a human recruiter into your hiring strategy.

Boutique recruitment agency consultant Sarah Dack shares, 

“I make a deliberate effort to meet face-to-face with each and every one of my candidates. In my experience, this hands-on approach is not just about ticking a box—it's about truly getting to know people, building trust and recognizing their strengths beyond what's on paper. These meetings lay the foundation for authentic, long-lasting relationships that benefit both the candidate and the hiring process.”

Mistake 5: Ghosting candidates 

Candidates invest time, energy, and travel costs in applying for jobs. In many cases, it can feel like they're edging toward the finish line, especially if they've attended multiple interviews and met with several key people.

But suddenly, everything goes quiet. The recruiter becomes vague, fails to provide the feedback they’d promised, and soon stops responding to communication altogether.

This type of candidate ghosting is on the rise, with 80% of hiring managers admitting to ghosting applicants at some point during the recruitment process. While ghosting is a recruitment trend not confined to AI, the technology can make it easier to lose sight of the human side of hiring and how candidates should be treated.

Solution

Candidates are more than data punched into a machine; they're individuals with feelings and aspirations. Being ghosted can damage a candidate’s confidence and negatively impact your employer brand.

If you’re using AI for candidate communication, introduce transparent processes to provide feedback, even if it’s automated. When you must reject a candidate, do so with empathy. Provide constructive feedback that supports their future job search and ensures the candidate's experience remains positive even when unsuccessful. 

Choose Juicebox: An ethical AI recruitment platform 

Ready to build a smooth, efficient, and ethical AI recruitment workflow? Juicebox is the perfect addition to your tech stack. 

Driven by People-GPT, our platform allows recruiters to “speak” to Juicebox to source candidates for their open positions. Using natural language, you’ll describe the ideal person you’re looking for in terms of skills, experience, or companies they’ve worked for, and Juicebox will return a list of potential matches. 

Get started with a free Juicebox trial, or take a product tour to see our AI sourcing tool in action.