NYC Set to Enforce Automated Employment Decision Tools Law in July 2023

Image by Amtec Photos, https://www.amtec.us.com

The New York City Department of Consumer and Worker Protection has announced that it will begin enforcing the NYC Automated Employment Decision Tools (“AEDT”) law on July 5, 2023, delaying enforcement of a law that was originally enacted in 2021. Along with this announcement, the Department also published its Final Rule, which is intended to clarify the Department’s interpretation of the NYC AEDT law.

First, what is an Automated Employment Decision Tool? As we wrote in more detail last year, an AEDT is an automated tool, generally intended to help employers pick the best job candidate from a pool of applicants, that may rely on machine learning, statistical models, or artificial intelligence (“AI”). Examples include automated systems that read through every applicant’s resume to determine which candidates to interview, systems that purport to analyze a candidate’s facial expressions or intonation during an interview, and programs that use candidates’ answers on personality tests or logic puzzles to rank their likelihood of success in the job.

Second, why is New York City trying to regulate the use of these automated tools? Because New York City is trying to prevent race and sex discrimination in hiring. These automated tools rely on a particular set of inputs and categorize candidates according to criteria that can themselves encode bias. For example, if an algorithm or machine learning platform uses the resumes of historically successful candidates to determine which attributes correlate with successful performance, it is learning from a data set that reflects the biases baked into society, both past and present. Such a data set may produce predictions that men are more likely to be successful than women, because of historical hiring biases or historical practices in which women dropped out of the workforce after getting married or having children. Or it may predict that people whose names sound more “white” are likely to be more successful, because study after study has shown that people are more likely to call candidates back for a job interview, or to rate people’s abilities more highly, when they believe (correctly or incorrectly) that the person is white. The automated tool would then “learn” that white-sounding names correlate with being perceived as a better candidate or a more successful employee.

Moreover, the demographics of the people creating the software or algorithm also tend to affect the input data: if the engineers are all white men, they may use a disproportionate number of white male resumes to train their algorithm or AI, introducing yet more bias. Because these inputs are, in essence, laundered through the algorithm, people may lose sight of the fact that the automated tool’s outputs may be discriminatory. And when discrimination is invisible, it is harder to root out or correct.
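For readers who want to see the mechanism concretely, here is a minimal sketch of how it happens. This example is ours, not part of the law or the Final Rule: the data is entirely synthetic, the “group A”/“group B” labels and skill feature are hypothetical, and a real hiring tool would be far more complex. The point is only that a model trained on biased historical decisions reproduces that bias, even for equally qualified candidates.

import random
from sklearn.linear_model import LogisticRegression

random.seed(0)

# Synthetic "historical" hiring outcomes: candidates in both groups are
# equally skilled, but past decision-makers favored group A.
X, y = [], []
for _ in range(5000):
    group = random.choice([0, 1])   # 0 = group A, 1 = group B (hypothetical labels)
    skill = random.random()         # skill distributed identically in both groups
    hire_prob = 0.6 * skill + (0.3 if group == 0 else 0.0)  # the baked-in bias
    X.append([group, skill])
    y.append(1 if random.random() < hire_prob else 0)

# Train a simple model on the biased history.
model = LogisticRegression().fit(X, y)

# Score two candidates with identical skill who differ only in group.
p_a = model.predict_proba([[0, 0.8]])[0][1]
p_b = model.predict_proba([[1, 0.8]])[0][1]
print(f"P(hire | group A) = {p_a:.2f}")
print(f"P(hire | group B) = {p_b:.2f}")
# The model reproduces the historical bias: group A scores higher even
# though both candidates are equally qualified.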

Finally, what does the NYC AEDT law do? In essence, it requires employers to inform potential applicants that they use automated employment decision tools to make hiring decisions, and to hire an independent third party to audit the tool’s impact, for example by determining whether the tool appears to demonstrate bias in its outputs. These audits must analyze the tool’s “impact ratio” across a number of demographic categories, including race, ethnicity, and sex. The law also requires employers to post the findings of these audits on their websites.
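To make the “impact ratio” concrete: as we understand the Final Rule’s selection-rate approach, each category’s selection rate is divided by the selection rate of the most-selected category. The short sketch below illustrates that arithmetic; the category names and counts are made up for illustration only.

# Illustrative impact-ratio calculation (made-up numbers, hypothetical categories).
selected = {"Category A": 120, "Category B": 45}   # candidates advanced by the tool
total    = {"Category A": 400, "Category B": 300}  # candidates assessed by the tool

# Selection rate = selected / assessed, per category.
rates = {cat: selected[cat] / total[cat] for cat in total}

# Impact ratio = each category's rate divided by the most-selected category's rate.
best = max(rates.values())
impact_ratios = {cat: rate / best for cat, rate in rates.items()}

for cat, ratio in impact_ratios.items():
    print(f"{cat}: selection rate {rates[cat]:.2f}, impact ratio {ratio:.2f}")
# Category A: selection rate 0.30, impact ratio 1.00
# Category B: selection rate 0.15, impact ratio 0.50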

Will these audits and disclosures be sufficient to ensure that machine learning and AI employment tools do not unfairly discriminate against people because of their race or sex? And will the criticism of disability advocates, discussed in more detail here, be borne out: that these automated tools will discriminate against people with disabilities, either by filtering them out or by surfacing disabilities too early in the hiring process, before candidates can request an accommodation? We will have to wait and see.

 

This article is intended as a general discussion of these issues only and is not to be considered legal advice or relied upon. For more information, please contact RPJ Attorney Christine Clarke who counsels clients on employment, labor, healthcare, housing, and civil rights law, as well as legal compliance for non-profit organizations; First Amendment free speech and constitutional due process claims; and discrimination dispute resolution and prevention trainings. Ms. Clarke is admitted to practice law in New York, as well as the U.S. District Courts in the Southern and Eastern Districts of New York, the Second Circuit Court of Appeals, and the United States Supreme Court. Attorney Advertising.