
Job hiring increasingly relies on personality tests, but that can bar people with disabilities

Using automation runs the risk of screening out people for attributes that have nothing to do with how they would perform their jobs.

Taking a personality test, like one of the quizzes that proliferate online, can be a fun way to learn about yourself and explore what makes you tick. But the HBO Max documentary “Persona: The Dark Truth Behind Personality Tests,” debuting Thursday, reveals a troubling side to the way corporate America uses these tests. One of the most disturbing things is how the tests discriminate against an entire class of people.

The film, in which I participated, looks at employers’ increasing reliance on tools that, for example, ask candidates to rate their agreement with statements such as “You have confidence in yourself,” “You are always cheerful,” or “It is easy for you to feel what others are feeling.”

These types of questions may make for good conversation among friends, but when employers use them to decide who gets a job interview, it’s a recipe for discrimination against people with depression and other disabilities. As the documentary explores, it’s also junk science to assume that people’s self-reported moods or confidence accurately correlate to their future job performance.

Yet personality tests such as these have become standard practice for employers screening job applicants. Proponents argue the tests gauge organizational “fit” and may be fairer than hiring decisions based on factors such as which schools candidates attended, which can be influenced by economic status or race. Already, more than 75 percent of large companies use assessment tools such as aptitude and personality tests to screen applicants.

These tools may be useful in making the hiring process more efficient and allowing companies to hire at scale. But they also run the risk of leaving people with disabilities out in the cold, screening them for attributes that have nothing to do with how they would perform their jobs.

And personality tests are only the beginning. Other hiring tools include resume screeners, which scan candidates’ CVs for desired keywords, such as leadership of a sports team; sentiment analysis tools, which purport to analyze candidates’ movements during video interviews; and game-based tests, in which a candidate’s performance in an online game is compared to the performance of existing employees at the company.

A new report by the Center for Democracy and Technology details the ways that many of these artificial intelligence-driven screening tools may unfairly exclude candidates with disabilities. I serve as a fellow at the center working on disability issues, drawing in part from my own experience as a wheelchair user navigating systems that aren’t always adequately attuned to the needs of those with disabilities.

According to the report, in countless cases disabled applicants don’t get the chance to nail an interview because they’re being rejected before they interact with a human being. This is often because algorithms used in personality tests and other screening approaches exclude the tail ends of a statistical bell curve generated by a large pool of applicants answering the same questions. Because disabled people often function differently from the average, they are more likely to fall into those “tails,” making automated screening a vector for discrimination.
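The tail-exclusion mechanism the report describes can be illustrated with a toy sketch. Everything here is hypothetical — the scoring scale, the cutoff, and the numbers are invented for illustration and are not drawn from any real vendor’s tool — but it shows how a filter that rejects statistical outliers necessarily screens out whoever answers differently from the majority, regardless of job ability:

```python
import statistics

def screen_candidates(scores, z_cutoff=2.0):
    """Toy model of tail-based screening: reject any candidate whose
    personality-test score lands in either tail of the applicant
    pool's distribution (|z-score| above the cutoff)."""
    mean = statistics.mean(scores)
    stdev = statistics.stdev(scores)
    passed = []
    for score in scores:
        z = (score - mean) / stdev  # distance from the pool average
        if abs(z) <= z_cutoff:      # inside the "typical" band
            passed.append(score)
    return passed

# Most applicants cluster near 50; two atypical responders do not.
pool = [48, 50, 51, 49, 52, 50, 47, 53, 50, 49, 20, 85]
survivors = screen_candidates(pool)
# Both outliers are rejected before any human sees their applications.
```

The filter never asks whether the outlying answers have anything to do with the job; atypicality alone is treated as disqualifying, which is exactly how disabled applicants end up disproportionately excluded.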

For example, consider a highly qualified accountant on the autism spectrum denied a position because she didn’t make good eye contact during a recorded video interview. Or consider a person with a history of depression applying for a customer service job, only for the test to flag how she answered a question about her “energy level” during the day. In both instances, the reasons the candidates are screened out say nothing about their ability to perform the job.

The problem for employers goes beyond the loss of qualified candidates. Using discriminatory hiring tools is also likely to put them on the wrong side of the Americans with Disabilities Act and other civil rights laws. The ADA bans hiring processes that discriminate on the basis of disability, and requires employers to judge candidates on their ability to perform the job in question — not their ability to meet some abstract test.

Businesses should expect to be challenged in court as disability rights advocates start to scrutinize the application of automated tools that screen out disabled applicants based on things that aren’t directly related to the essential functions of a job.

Policymakers are taking notice and beginning to weigh in at the local and national levels. A group of 10 senators has written to the Equal Employment Opportunity Commission asking it to investigate and study AI hiring assessment technologies. Meanwhile, the New York City Council is rightly considering legislation banning AI hiring tools that have not undergone an annual civil rights audit.

But we need more than regulation. We need employers, technologists and disabled people to work together to ensure that hiring and retention don’t rely on flawed algorithms that, inadvertently or intentionally, result in disability discrimination.

They can begin by designing hiring tools that only measure essential functions of particular jobs, taking into account the alternative ways that disabled people can carry out job-related tasks. This includes anticipating how candidates’ disabilities may inadvertently cause them to perform poorly in hiring tests, and ensuring their abilities can be measured in a fair way.

This approach will help achieve many employers’ stated goal of building diverse and inclusive workplaces. It would also allow employees to bring their authentic selves to a welcoming workplace that is eager to benefit from their contributions.

The broader civil rights community has published principles to “guide the development, use, auditing and oversight of hiring assessment technologies, with the goals of preventing discrimination and advancing equity in hiring.” The aim is for the EEOC to issue comprehensive guidance addressing the kind of disability-specific concerns raised in the Center for Democracy and Technology’s report. And the Department of Justice should open disability discrimination investigations on behalf of applicants treated unfairly by AI-driven tools.

When the nation finally gets the Covid-19 pandemic under control, the United States will hopefully go on a historic hiring spree. Millions of new jobs could need to be filled. As employers recruit this massive workforce with the help of new tech tools powered by advances in AI, we must make sure not to erect new barriers that prevent millions of eager and talented Americans with disabilities from participating in it.