Underrepresented groups are more likely to predict AI will positively impact DEIB goals than the average respondent—they are also more likely to have concerns.

Key takeaways

  • Indeed research finds that HR/TA leaders and job seekers believe AI will have more of a positive impact on diversity, equity, inclusion, and belonging (DEIB) goals than a negative one.
  • HR/TA leaders and job seekers from underrepresented groups are more likely to anticipate both positive and negative impacts of AI.
  • Experts say AI should be a companion for HR/TA leaders, not a decision-maker.

In a recent Indeed-commissioned study to determine what HR/TA leaders and job seekers think about AI and how it’s impacting hiring, a surprising finding emerged. The Indeed Global AI Survey found that respondents from some underrepresented groups are, in many cases, more likely to predict that AI will positively impact diversity, equity, inclusion, and belonging (DEIB) goals than the average respondent. 

For example, globally, job seekers living with disabilities are more likely to say that AI will have a positive impact on people living with disabilities rather than a negative impact (35% vs. 20%), and they’re more likely to anticipate that positive impact than job seekers overall (35% vs. 27%). The survey found the same dynamic among HR/TA leaders living with disabilities. 

The same trend played out among HR/TA leaders and job seekers who identify as gay, lesbian, or bisexual, as well as job seekers who are Black or African American: they, too, are more likely to anticipate that AI will help rather than harm DEIB goals in the workplace.

And yet, the opposite is also true. Even though job seekers who identify as gay or lesbian are more likely to anticipate that AI will have a positive rather than a negative impact on LGBTQ+ issues at work, they're also nearly twice as likely as the average respondent to worry that AI's impact on those issues will be negative (22% vs. 12%).

At a glance, these findings from the survey, which included more than 7,000 HR/TA leaders and job seekers from seven countries around the world, may appear in conflict. But according to Jessica Hardeman, Indeed's Global Director, Employee Lifecycle, it makes sense. Historically, marginalized groups have been more attuned to both the potential and the risks of AI in the workplace.

At its best, Hardeman says, AI can unlock new accessibility tools for people living with disabilities and help companies train workers in different languages and cultural contexts more quickly. But at its worst, AI can entrench existing biases against those same underrepresented groups. “Are you using AI for good or not?” Hardeman says. “I think that’s what it comes down to.”

Here’s a look at how AI can help you advance your company’s DEIB goals—and how you can avoid common traps that risk setting them back.

How AI Can Further Your DEIB Goals…  

When implemented correctly, AI tools can be pivotal partners for DEIB leaders. Hardeman says Indeed uses AI to help review job descriptions to ensure that the language is inclusive. Her team also uses generative AI to develop animations and voiceovers for training content tailored to different countries. 

AI tools can also help unearth problems related to discrimination that are entrenched within a company. Diversio, a company that helps businesses track, measure, and improve their DEIB goals, uses natural language processing to sift through open responses to employee surveys and identify issues, including those that are concentrated among certain demographics. 

For example, if the analysis finds that concerns about flexibility are concentrated among female employees, Diversio might propose implementing “core hours,” which means not scheduling meetings during times when parents take kids to and from school.

Laura McGee, CEO and founder of Diversio, says AI also has the potential to minimize the impact of interpersonal relationships that sometimes weigh too heavily into promotion decisions. “So often, advancement in companies is based on relationships and not work product,” she says. With AI assessments, workers can be judged based on what they produce, “instead of who they went to the baseball game with,” McGee says.

These tools don’t just provide more equitable opportunities for employees; they can also help executive teams that “don’t always have enough information” about their staff, says Jenn Tardy, founder and CEO of the DEI training and consulting firm Jennifer Tardy Consulting. “Using AI to identify people within your organization who could be ready for advancement can create more diverse internal pools of candidates for future opportunities,” she says.

…and How AI Can Undermine DEIB Progress 

AI tools are trained on extensive data, including data that can contribute to stereotyping or exclusion. Even the simplest tasks you entrust to AI will be influenced by those inputs. 

HR/TA leaders understand this. According to Indeed’s survey, more than half (53%) say they’re concerned about bias in AI training data. “AI is learning from us. And the voices of us are not equitably distributed or represented,” says Andrés Tapia, Senior Partner and Global DE&I and ESG Strategist at the consulting firm Korn Ferry. 

Even something as straightforward as asking a generative AI tool to develop best practices for interviewing may be susceptible to bias. “How much of that is going to be influenced by the dominant group?” Tapia says. Depending on the role, those interview tactics could be unintentionally skewed toward groups that have historically been part of those interview conversations to begin with.

Another risk, Hardeman says, is that over-reliance on AI "can create complacency." For instance, an AI tool may show that a company is hitting its DEIB goals without prompting deeper questions about where people of color actually sit in the corporate hierarchy.

Tips for Using AI Responsibly to Support DEIB Goals

To help ensure your organization is maximizing the benefits of AI when it comes to DEIB—and minimizing some of the pitfalls—there are a few steps you can take:

  1. Ask how AI vendors address DEIB concerns in their own organizations. Look for vendors that are willing to share information about their own diversity and impact. “Say, ‘What does your engineering team look like?’ If it’s all the same profile, I think that should raise some red flags,” McGee says.
  2. Don’t fall for “skill proxies.” Be wary of resume filtering tools that place too much emphasis on where someone went to school or what degree they received. These are “skill proxies,” Tardy says, which aren’t necessarily indicative of actual skills.
  3. Keep humans in the loop. AI tools are no replacement for human beings’ lived experiences, Hardeman says. These tools should guide human judgment, not replace it.

Ultimately, Hardeman says, AI can be a useful resource like any other, as long as it’s not viewed as the sole source of truth. “AI should enable you to make decisions,” she says, “not make the decisions for you.”