Dive Brief:
- The New York City Department of Consumer and Worker Protection has announced that it will postpone enforcement of a new law intended to regulate the use of automated employment decision tools until April 15, 2023.
- The DCWP said it was delaying enforcement of Local Law 144, which takes effect Jan. 1, due to a high volume of public comments. The department held a public hearing on the law in November but said it would schedule a second public hearing at a time to be determined.
- Local Law 144 would require employers to conduct bias audits on automated employment decision tools, including those that utilize artificial intelligence and similar technologies, and would require employers to provide certain notices about such tools to employees or job candidates who reside in the city.
Dive Insight:
NYC’s law drew attention in the HR space because of its unique provisions. While Illinois previously passed legislation regulating employers’ use and analysis of video interviews, Local Law 144 is the first piece of U.S. legislation to broadly target the use of AI and other automated technologies in hiring.
The DCWP said the law drew public comments from prominent stakeholders both within and outside of the HR profession.
For example, the Society for Human Resource Management submitted written comments praising the DCWP’s efforts while also asking for further clarification, particularly with respect to how the law defines automated employment decision tools as well as the requirements for bias audits, such as the scope of the candidate pool to be tested.
Others, like Indeed, asked the DCWP to clarify whether the audit provisions require employers to calculate selection rates and impact ratios for each intersectional demographic category and whether sample size metrics would need to be included. LinkedIn suggested revisions to the law’s automated employment decision tool definition, among other points.
Meanwhile, written comments from the New York Civil Liberties Union said that Local Law 144 “falls far short of providing comprehensive protections for candidates and workers.” NYCLU asked the DCWP to strengthen its proposed rules to ensure broader coverage of automated tools, expand bias audit requirements and provide transparency and notice to affected people.
AI and algorithmic decision-making tools have been a fixture of regulatory agendas in 2022, both in NYC and at the federal level. In May, the U.S. Equal Employment Opportunity Commission and the U.S. Department of Justice published documents cautioning employers to avoid “blind reliance” on AI in hiring, performance management, pay determinations and other conditions of employment.
Sources previously told HR Dive that employers may want to re-evaluate how their assessment tools perform regarding tasks such as speech pattern measurement and keyboard input tracking to ensure that certain job candidates are not placed at a disadvantage.