Most Americans don’t want AI to have a say in whether or not they get a job.
That’s according to a recent Pew Research Center survey that polled around 11,000 adults about their attitudes toward algorithms in the workplace. Around two-thirds said they wouldn’t want to apply for a role if AI was used to make a hiring decision, citing worries about systemic bias and the absence of a “personal touch” when it comes to traits that transcend a resumé.
“It would lack or overlook the human factor,” one woman in her 40s told the surveyors.
The report comes as tools that automate the winnowing of candidate pools have become more widespread than ever. The Society for Human Resource Management found that nearly one in four organizations plan to start using or increase their use of AI tools in hiring and recruitment over the next five years.
“There is so much conversation right now around the potential impacts of AI in the workplace, and so many real-world implications for people,” Pew Research Associate Colleen McClain, one of the authors of the report, told Tech Brew. “We really wanted to bring Americans’ own voices into this conversation.”
Politicians and regulators have taken notice of the potential pitfalls of this use of AI. New York City will soon begin enforcing a law that requires any AI tools used in hiring to be audited for bias. Illinois and Maryland passed laws limiting facial recognition in interviews, and California and Washington, DC, are currently mulling regulations around the issue.
Public opinion seems to be largely on the side of these efforts, according to Pew’s research. About 71% of respondents said they opposed the use of AI in making a final hiring choice (7% were in favor, while 22% were unsure). Another 41% of people said AI shouldn’t play any role in reviewing applications (28% disagreed and 30% were unsure).
While employers have long used tools that slim down piles of resumés by matching their text to certain keywords, newer generations of software use natural language processing AI to scan for more comprehensive candidate profiles or reach out to potential candidates via chatbots.
But AI experts have criticized many of these tools, citing built-in biases, lack of transparency, and other shortcomings. For instance, researchers at New York University audited two hiring tools claiming to perform AI personality tests and found that they could not “be considered valid personality testing instruments.” Another paper last year from Cambridge University questioned claims from HR services that their AI tools serve to “debias” the hiring process.
For Pew respondents who said they would want to apply to jobs that use AI in the hiring process, the claim that these tools provide objectivity was a major draw. The vast majority of those in favor of the technology said it would be unbiased and treat people more fairly. Around 79% of all respondents said racial and ethnic bias in hiring was a problem, and about 53% of that group thought that AI would improve the situation (13% said it would make it worse and 32% said it would stay the same).
While Americans are generally skeptical of AI’s use in hiring, McClain said, attitudes also vary depending on which stage of the process or specific use case the tools address, with opinions often split on how capable people perceive the technology to be.
“I would say the overarching theme of our report is that Americans do express concern, they are worried about these things,” McClain said. “But throughout all of this, there are some positives that stand out as well.”