Heard Around Town

NYC’s law to prevent artificial intelligence bias in hiring is in limbo

The implementation of the 2021 law has been delayed until April, while the city continues to weigh how to define automated hiring tools and other details of its draft rules.

While they can result in savings for employers, artificial intelligence hiring tools can also replicate racial and other biases in the hiring process.


AI used in hiring can be racist. A New York City law passed in 2021 was supposed to help prevent that.

But with the law’s contours still being worked out, some of its supporters warned that it risks being rendered toothless by the influence of business groups.

Local Law 144, passed by the City Council in 2021, will require that so-called “automated employment decision tools” undergo an independent bias audit before employers use them in the hiring process. Employers using the tools will have to disclose their use to job candidates who live in New York City, and make the results of the bias audit publicly available.

While they can result in savings for employers, these tools – which might include artificial intelligence or algorithms used to screen resumes or assess candidates in video interviews – can also replicate racial and other biases in the hiring process. That’s what prompted the passage of the law requiring bias audits to create more transparency about how employers use the tools and how they might prove discriminatory.

A coalition of advocates including The Black Institute’s Bertha Lewis and the Rev. Kirsten John Foy said in a letter to Mayor Eric Adams on Friday that the business response to the law has been “combative and reactionary.”

The law was scheduled to go into effect this month, but the city’s Department of Consumer and Worker Protection delayed implementation until April, citing a high volume of public comments in the rulemaking process. Open questions include what kind of tools qualify and what constitutes an “independent auditor.” 

Some business groups and companies have advocated for “automated employment decision tools” to be narrowly defined in the law so that it is not overly burdensome for employers – requiring bias audits for routine background checks, for example. But civil rights and privacy advocates have warned that a narrow definition will create loopholes for companies.

“As your administration continues to do the hard work of implementing this groundbreaking legislation in the near term, you will undoubtedly continue to hear complaints from businesses who are resistant to change,” the coalition of advocates said in their letter to Adams on Friday. “We strongly urge you to stand firm in insisting that Local Law 144 will not be diluted during the rulemaking process or at any time after the law goes into effect.”

On Monday, the Department of Consumer and Worker Protection held a hearing for testimony on its updated draft of rules. Several speakers at the hearing said that the definition of “automated employment decision tools” is too narrow and would exempt many automated tools that employers use. As currently written, automated hiring tools would face a bias audit if they “substantially assist or replace discretionary decision making.” Some speakers said that creates an excessively high bar, noting that these tools are rarely used to fully replace human decision making.

“If regulators do not revisit this language, the reach of Local Law 144 may be diminished, and the law will be less effective at reducing racial bias in hiring,” City Council Member Selvena Brooks-Powers said in her testimony at the hearing, adding that she and her council colleagues voted to pass the law in part to stop employers from “hiding behind vague claims about their commitment to workforce diversity.”