Concerns about artificial intelligence and other technology — and their intersection with employment law — dominated a Sept. 22 listening session with the federal government’s workplace discrimination watchdog.
The U.S. Equal Employment Opportunity Commission wrapped up its series of three listening sessions last week, bringing an end to public input as it develops its next strategic enforcement plan. While witnesses touched on a wide array of issues at the event, both employee and employer representatives asked the commission for clarity around emerging tech.
AI, tech as a means of perpetuating discrimination
Throughout the five-hour event, worker advocates expressed concerns that hiring systems and hard-to-use technology can inadvertently further discrimination.
Dariely Rodriguez, deputy chief counsel for the Lawyers’ Committee for Civil Rights Under Law, asked the EEOC to devote resources to address “biased algorithmic screens” in the hiring and recruitment processes, which she said often have an adverse effect on Black workers and other workers of color.
Representing the National Women’s Law Center, Emily Martin also asked the EEOC to focus on the issue, noting that such hiring technology can “replicate and systematize harmful and stereotyped decision-making, while also making such discrimination more difficult to challenge because of the black box nature of those decision-making processes.”
Holly Biglow, government affairs director on the financial security team at AARP, emphasized the ways algorithmic and screening technology can exclude older workers. She described application systems that require applicants to include dates of birth or graduation dates in fields that cannot be bypassed. Some sites even use drop-down menus with dates that only go back to a certain year — 1980, for example — and thereby exclude those who were born or attended college before that time.
Several witnesses praised the EEOC’s Artificial Intelligence and Algorithmic Fairness Initiative, which the agency launched in 2021 to shine a light on AI discrimination issues — but they noted there’s more work to be done in this space. “We … urge the commission to increase the resources needed to effectively tackle these complicated issues by hiring staff with expertise in AI and assisting investigators in identifying discriminatory AI practices,” Biglow said.
Some noted that technology can be discriminatory beyond just the hiring and recruitment processes. “Many of the cases I’m encountering these days involve inaccessible employee-use technology,” said Eve Hill, disability rights attorney and partner at Brown Goldstein & Levy. She said many employers fail to consider accessibility when investing in new technology, but “buying inaccessible technology is not an excuse for excluding employees with disabilities and people with disabilities should certainly not be the ones who suffer the consequences of ill-advised purchasing decisions.”
A need for guidance
Employer advocates, including representatives of the Society for Human Resource Management and management-side employment attorneys, said they need more clarity on how employers can and should use AI in hiring and other contexts.
Emily Dickens, chief of staff, head of government affairs and corporate secretary for SHRM, pointed to the prevalence of automation and AI in HR: 79% of employers polled by the organization said they already use such tools or plan to adopt them within the next five years.
Dickens cautioned against heavy regulation of employer tech use. “Nearly 3 in 5 organizations report that the quality of their organization’s hires is higher due to their use of AI,” she said. “This is not the moment to impose heavy-handed regulatory restrictions that will set key HR functions back and impede the ability to create and identify talent pipelines.”
“We need more guidance” on AI, said David Fortney, co-founder of employment law firm Fortney & Scott, calling the EEOC’s previous efforts to advise on AI usage in compliance with the Americans with Disabilities Act “a good start.”
Fortney urged the EEOC to consider a forthcoming report from the Institute for Workplace Equality’s AI Technical Advisory Committee, which is expected in the coming months, and to work with other federal agencies to “try to get organized.” At a minimum, he said, “consistent application of the Title VII principles [is] imperative.”
Darrell Gay, partner at ArentFox Schiff, asked the EEOC to develop more AI educational materials and to reach out and collaborate with attorneys who represent employers. “We want to become partners with you guys,” he said.
Surveillance concerns spread
Beyond concerns about AI in hiring and accessible technology, several witnesses at the event cited fears about the use of surveillance tools. Judy Conti, director for government affairs at the National Employment Law Project, urged the EEOC to develop guidance to ensure employers do not use surveillance tech in a discriminatory fashion.
Conti noted the surveillance topic is of particular concern for industries like warehousing, retail and hospitality. “We know, for example, there are warehouse workers that are made to wear devices on some part of their body during the day that continually tracks them to make sure they're hitting pre-established targets,” she said. Employees in these situations are sometimes fired for missing those targets, she said, and some avoid going to the bathroom or taking other breaks out of fear. “There’s a real imbalance of power,” she said.
Surveillance tech can also be used disproportionately against particular groups, witnesses noted. “We know from research studies … that there is often more distrust, for example, that Black workers won't work as hard as White workers, or won't work as hard as, let's say, Latinx workers, so they may be subjected to even more extreme surveillance,” Conti said. Such targeted surveillance can lead some workers to suffer in ways others don’t, as their mistakes receive greater attention, she said.
Workers with disabilities also can find themselves unfairly targeted, Hill said. Surveillance systems like facial recognition, eye tracking and mouse tracking can disadvantage those “who look, move or work differently because of a disability,” she said. “So the EEOC should address the need to provide accommodations in those systems.”
Throughout the event, EEOC commissioners asked several questions about how to improve the agency’s AI resources and acknowledged the importance of continuing to develop tools for employers and workers alike.
The commission said the listening session — along with its first two sessions, held Aug. 22 and Sept. 12 — will inform the EEOC’s strategic enforcement plan for fiscal years 2022-2026.