Dive Brief:
- Researchers at Penn State University and Columbia University developed an artificial intelligence (AI) tool that can detect discrimination based on protected attributes, such as race and gender, in hiring, pay practices, policing, education admissions and consumer finance.
- Penn State said the research involved analyzing salary, demographic and employment-related data for about 50,000 people. While developing the AI, researchers saw gender bias reflected in the dataset's pay information. For example, the odds of a woman earning more than $50,000 a year were about one-third those of a man, researchers said (the sketch after this brief illustrates how such an odds ratio is computed).
- "You cannot correct for a problem if you don't know that the problem exists," Vasant Honavar, one of the tool's developers, told Penn State News. "To avoid discrimination on the basis of race, gender or other attributes you need effective tools for detecting discrimination."
Dive Insight:
Penn State and Columbia developed their AI tool to detect discrimination, but machine learning algorithms can discriminate, too. Algorithms used by HR and talent professionals have the potential to perpetuate human biases if the data used to create those tools reflects those biases, experts have told HR Dive. For example, Amazon's attempt to develop a hiring algorithm last year reportedly duplicated the company's existing hiring bias against women.
"You could guess that a machine wouldn't have the same racism or sexism of an individual person, but if all the machine is doing is learning from the hiring decisions of an old manager, or thousands of old managers, then they're just going to make those same decisions, but faster," Daniel Greene, assistant professor of information studies at the University of Maryland, previously told HR Dive.
Despite the risk of inadvertent discrimination, algorithmic processes are essential for many organizations that want to speed up time-to-hire, recruit at high volume or simply streamline sourcing, screening and hiring. To better anticipate the potential for discrimination in algorithmically aided compensation and hiring, Greene said talent professionals can interrogate the service providers they contract with about how their machine learning solutions were trained to avoid reproducing human bias. Employers might also use other software, such as SAP's technology, which detects hiring bias by analyzing language.
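One concrete question to put to a vendor is whether a tool's selection rates satisfy the EEOC's "four-fifths rule," under which a group's selection rate below 80% of the highest group's rate is generally treated as evidence of adverse impact. The sketch below is an illustrative check, not any vendor's actual API; the function name and figures are hypothetical.

```python
# Illustrative disparate-impact check based on the EEOC's four-fifths rule.
# Function name and figures are hypothetical, not drawn from any real tool.

def passes_four_fifths(selection_rates: dict[str, float]) -> bool:
    """True if every group's selection rate is >= 80% of the highest rate."""
    highest = max(selection_rates.values())
    return all(rate >= 0.8 * highest for rate in selection_rates.values())

# Hypothetical screening outcomes from an algorithmic hiring tool:
rates = {"men": 0.30, "women": 0.18}
print(passes_four_fifths(rates))  # False: 0.18 / 0.30 = 0.60, below 0.80
```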