General counsel have six months to prepare for a New York City law that requires them to test any automated tools their organization uses in hiring or employee evaluation for bias and to develop a protocol that lets people who don’t want to be subject to the tools opt out.
Although the law applies only to New York City job applicants and employees, laws aimed at automated evaluation tools, both for hiring and for assessing performance, are picking up steam across the country, including in California and Illinois, among other states. That makes the New York City law a good one to comply with, because doing so helps prepare organizations for the automated-tool bias laws still to come, says Fisher Phillips Partner David Walton.
“It’s an easy concept for other states and big cities to copy,” Walton said in an interview. “They’ll just use their statute. When a situation like this occurs, it helps to try to comply with the toughest law.”
Unanswered questions
The New York City law, passed just as the council’s last session was ending, has generated confusion because many of its provisions raise questions that are unlikely to be answered before it takes effect at the beginning of 2023.
“It was kind of rushed through,” Walton said. “It’s not a well-defined statute.”
Among other things, it requires organizations to have a bias audit conducted each year on their automated hiring and performance-evaluation tools, to post notice on their website that the tools are in use, and to give people 10 days to opt out of being subject to them.
The opt-out raises tough questions for the organization, Walton said. “What’s that alternative selection process?” he said. “What is an accommodation?”
Also unclear is what the company is supposed to do during the 10-day notice period. “Does that mean you have to post a notice somewhere that says, If you’re submitting a resume for this job, we’re going to use an automated decision tool and have to wait 10 days [before you can talk to them]?” he said. “I don’t know.”
Cottage industry
The requirement for organizations to have an outside firm conduct a bias audit on their tools is another potential stumbling block, because few firms today have the necessary mix of expertise to detect bias in an automated tool’s algorithm.
“The algorithms are complicated,” Walton said. “We treat them as a black box. Those algorithms, depending on how they’re developed, can have a discriminatory bias.”
An example is whether a candidate lives in a rural or an urban area. If the algorithm gives extra weight to rural candidates because statistics show they stay in a job longer than urban candidates, that weighting alone could suggest a bias against a protected class.
“Of course, people from urban areas are more likely to be in a protected category,” he said. “That’s just the reality of the world, so if you have an algorithm that slightly favors people in rural areas, could that be the basis for a disparate impact case?”
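To make that concrete, here is a minimal sketch, in Python, of the kind of selection-rate comparison a bias auditor might run on a screening tool’s outcomes. The group labels and numbers are illustrative assumptions, as is the use of the 80% “four-fifths” threshold, a longstanding EEOC rule of thumb for flagging adverse impact; none of this is spelled out in the New York City law’s text.

```python
# A minimal sketch of a disparate-impact check a bias audit might run.
# The data and the 80% threshold are illustrative: the "four-fifths rule"
# (an EEOC rule of thumb) flags a group whose selection rate falls below
# 80% of the highest group's rate as possible evidence of adverse impact.

def selection_rate(selected: int, applicants: int) -> float:
    return selected / applicants

# Hypothetical screening outcomes after the tool weights rural candidates up.
outcomes = {
    "rural": {"applicants": 200, "selected": 60},   # 30% pass the screen
    "urban": {"applicants": 800, "selected": 160},  # 20% pass the screen
}

rates = {group: selection_rate(d["selected"], d["applicants"])
         for group, d in outcomes.items()}
highest = max(rates.values())

for group, rate in rates.items():
    impact_ratio = rate / highest
    flag = "FLAG" if impact_ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} [{flag}]")
```

On these invented numbers, urban candidates pass at two-thirds the rural rate, below the four-fifths line, which is exactly the kind of result that could feed a disparate impact claim if urban applicants skew toward a protected category.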
Walton expects a cottage industry of firms specializing in analyzing bias in automated tools to spring up as more jurisdictions follow New York City’s lead and pass similar laws.
“Not a lot of firms do bias audits,” he said. “And then, if plaintiffs’ attorneys think there’s enough money involved, they’ll develop their own set of experts to attack the bias audits. That’s where you can have major growth in disparate impact cases.”
Widespread use
If automated-tool bias laws do start to spread, they are likely to have a major impact on most employers, because something like 80% to 90% of HR departments use some type of automated tool today, Walton said.
In many cases, the tool is used to help HR departments screen large numbers of resumes for the candidates whose mix of skills most closely matches what the employer is looking for.
“If you’re a massive employer, with thousands of employees, you’re going to use some AI tool to go through resumes,” he said. “People are submitting applications every day.”
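As a rough illustration of what the simplest such screeners do, the sketch below tallies keyword matches against a target skill list. The skills, resumes, and scoring scheme are hypothetical; commercial tools rely on far more complex, and more opaque, models.

```python
# A deliberately simple stand-in for an automated resume screener:
# score each resume by how many of the job's target skills it mentions.
# Commercial tools use trained models; this keyword tally is illustrative only.

TARGET_SKILLS = {"python", "sql", "project management"}  # hypothetical skill list

def score_resume(text: str) -> int:
    """Count how many target skills appear in the resume text."""
    lowered = text.lower()
    return sum(1 for skill in TARGET_SKILLS if skill in lowered)

resumes = {
    "candidate_a": "Five years of Python and SQL development.",
    "candidate_b": "Project management background with strong communication skills.",
}

# Rank candidates by score; a screener might pass only the top N to a recruiter.
ranked = sorted(resumes, key=lambda name: score_resume(resumes[name]), reverse=True)
for name in ranked:
    print(name, score_resume(resumes[name]))
```

Even a toy screener like this shows where bias can creep in: whatever signal the tool keys on, keywords here, far subtler correlations in a trained model, determines who never reaches a human reviewer.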
What’s unclear is how the law will treat third-party tools that attract job seekers and then filter candidates before sending their resumes to organizations with job openings.
“Does that mean every employer that uses [one of these third-party sites] is potentially subject to a lawsuit if there’s a bias found?” he said.
Testing under counsel
Walton recommends that companies, when they bring in an outside firm to conduct a bias audit on their automated tools, have the work done under counsel, at least the first time, to keep the process and the results under attorney-client privilege.
“Who knows what the audit is going to say?” he said. “What happens if some algorithm you’re using has a potentially illegal bias? You may be using something [where] some data expert will say there’s unintentional bias against a protected category, and you’re going to want to keep that process as much as you can under attorney-client privilege. That’s not a guarantee you won’t be subject to a lawsuit, but it can help you increase protections from a risk management standpoint.”