“Open the [Office] Door, HAL” DOJ and EEOC Issue Guidance on Employers’ Artificial Intelligence Discriminating Against Individuals with Disabilities

by David Warner, Partner


Last month the U.S. Department of Justice (DOJ) and Equal Employment Opportunity Commission (EEOC) issued guidance regarding employers’ increasing use of algorithms and artificial intelligence (AI) in sourcing and selecting candidates for employment and promotion. The DOJ enforces disability discrimination laws with respect to state and local government employers, while the EEOC enforces such laws with respect to private sector employers and the federal government.

The central thrust of both the DOJ guidance and EEOC guidance is that employer obligations concerning reasonable accommodation of individuals with disabilities must be considered when algorithms or AI are utilized during a selection process. Both acknowledge that AI is increasingly used by employers to “save time and effort, increase objectivity, or decrease bias.” The concern, however, is that in achieving those goals employers may risk violating federal law where the tools disadvantage job applicants and employees with disabilities.

The EEOC’s technical guidance identifies three common ways an employer’s use of algorithmic decision-making tools might violate the ADA:

  • An employer failing to provide a reasonable accommodation that is necessary for a job applicant to be rated fairly and accurately by the algorithm;
  • An algorithmic decision-making tool intentionally or unintentionally screening out a disabled individual even though the person could perform the job in question with a reasonable accommodation; and
  • An algorithmic decision-making tool that violates restrictions on disability-related inquiries or medical examinations.

With respect to accommodation, the EEOC identifies as a “promising practice” informing applicants that AI will be used and inviting them to request an accommodation where needed. As an example, the EEOC notes that an applicant with limited manual dexterity might have difficulty taking a knowledge test that requires the use of a keyboard, trackpad, or other manual input device. Particularly if the test is timed, it will not accurately reflect the applicant’s actual job knowledge. The EEOC goes on to note that, if it is not possible to make the test accessible, the employer would need to consider an alternative test of job knowledge, barring undue hardship – i.e., a high bar. Further, the accommodation requirement cannot be avoided by outsourcing job screening to a third party, as an employer remains liable for its agent’s discriminatory conduct.

Concerning intentional or inadvertent screening out of candidates, the EEOC cites the ever-increasing use of chatbots as potentially problematic. The agency explains:

A chatbot might be programmed with a simple algorithm that rejects all applicants who, during the course of their “conversation” with the chatbot, indicate that they have significant gaps in their employment history. If a particular applicant had a gap in employment, and if the gap had been caused by a disability (for example, if the individual needed to stop working to undergo treatment), then the chatbot may function to screen out that person because of the disability.
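The kind of rule the agency describes can be sketched in a few lines of code. The sketch below is purely illustrative – the field names, dates, and the six-month gap threshold are assumptions, not details from the guidance or any actual product – but it shows why such a rule is blind to the reason for a gap:

```python
from datetime import date

# Hypothetical sketch of the naive screening rule the EEOC describes: reject
# any applicant whose employment history contains a "significant" gap.
# The 180-day threshold and data layout are illustrative assumptions.
GAP_THRESHOLD_DAYS = 180

def has_significant_gap(jobs):
    """jobs: list of (start, end) date pairs, sorted by start date."""
    for (_, prev_end), (next_start, _) in zip(jobs, jobs[1:]):
        if (next_start - prev_end).days > GAP_THRESHOLD_DAYS:
            return True
    return False

def screen(applicant):
    # The flaw the guidance highlights: the rule cannot distinguish a gap
    # caused by a disability (e.g., time off for treatment) from any other
    # gap, so it may screen out a qualified candidate because of disability.
    return "reject" if has_significant_gap(applicant["jobs"]) else "advance"

applicant = {
    "jobs": [
        (date(2018, 1, 1), date(2020, 3, 1)),
        # one-year gap taken for medical treatment -- invisible to the rule
        (date(2021, 3, 1), date(2023, 1, 1)),
    ],
}
print(screen(applicant))  # prints "reject"
```

The rule rejects this applicant without ever asking why the gap exists, which is precisely the screen-out scenario the EEOC warns about.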

Another example highlighted by the EEOC is employers’ use of video interviewing software that analyzes applicants’ speech patterns. There, if an applicant’s speech impediment resulted in a low or unacceptable rating, the ADA would potentially be implicated.
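A toy sketch makes the mechanism concrete. The scoring heuristic below – penalizing repeated words and filled pauses in a transcript – is an invented assumption for illustration, not the behavior of any actual vendor’s software, but it shows how a “fluency” score can rate identical substantive answers differently:

```python
# Hypothetical sketch of a speech-pattern score that disadvantages an
# applicant with a speech impediment. The heuristic and the 0.8 cutoff
# are illustrative assumptions, not any real product's logic.

def fluency_score(transcript: str) -> float:
    words = transcript.lower().split()
    # count immediate word repetitions ("I- I- I think") and filler tokens
    repetitions = sum(1 for a, b in zip(words, words[1:])
                      if a.strip("-,") == b.strip("-,"))
    fillers = sum(1 for w in words if w.strip(",.") in {"um", "uh", "er"})
    return max(0.0, 1.0 - 0.1 * (repetitions + fillers))

PASS_THRESHOLD = 0.8  # assumed cutoff

fluent = "I have five years of relevant experience in this role"
stutter = "I I have have um five years of of relevant experience"
print(fluency_score(fluent) >= PASS_THRESHOLD)   # True
print(fluency_score(stutter) >= PASS_THRESHOLD)  # False
```

Both answers convey the same qualifications; the second is screened out for how it was spoken, not for its content – the ADA concern the EEOC raises.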

The EEOC’s technical guidance provides several recommendations for employers utilizing algorithmic decision-making tools within selection processes. First, vendors providing such tools should be questioned as to whether the tool was developed with disabled candidates in mind – e.g., did the vendor assess whether any traits or characteristics measured are correlated with certain disabilities. Second, employers are encouraged to take proactive steps to reduce the likelihood of disabled individuals being screened out by such tools. These include:

  • clearly indicating that reasonable accommodations, including alternative formats and alternative tests, are available to people with disabilities;
  • providing clear instructions for requesting reasonable accommodations; and
  • in advance of the assessment, providing all job applicants and employees who are undergoing assessment by the algorithmic decision-making tool with as much information about the tool as possible, including information about which traits or characteristics the tool is designed to measure, the methods by which those traits or characteristics are to be measured, and the disabilities, if any, that might potentially lower the assessment results or cause screen out.

The DOJ and EEOC guidance documents are a shot across the bow for larger employers, which are more likely to utilize algorithmic and AI selection tools to make selection processes more efficient. As these technological tools become more prevalent, the potential compliance implications for employers will continue to expand.

In sum, the ADA applies to selection decisions regardless of whether those decisions are being made by your hiring manager or your chatbot!