December 23, 2019

Artificial Intelligence May Make HR's Job Easier, but Employment Discrimination Still Abounds

Proponents of artificial intelligence (AI) and machine learning have promised these tools will usher in a new age of digitized work. AI tools can now be used to scan thousands of documents and reduce repetitive work tasks, algorithms can predict your shopping habits and recommend products before you even think of them, and machine learning software can be trained to identify cancer from MRIs. Often, the creators and designers of these tools tout AI's supposed objectivity. However, what technologists are less interested in publicizing is how AI can be used to reinforce discriminatory policing, violate civil rights, enable employment discrimination, and reinforce class, gender, and race disparities.

In a recent New York Times op-ed, Dr. Ifeoma Ajunwa of Cornell's Industrial and Labor Relations School highlighted hiring companies' and HR departments' increased use of these tools. Ajunwa points out that employers are not merely utilizing these technologies to screen candidates, but are actively barring candidates from being considered for employment. As an example, she posits a company that relies on a hiring algorithm trained to seek candidates without gaps in their employment. Ajunwa notes that such a stipulation would automatically screen out women applicants who have taken time off for child care or applicants who have had long-term medical issues. And, because AI relies on specific rules created by humans, there is no way for the technology to check itself against employment law or ethical norms about employment discrimination. It would simply filter out applicants who don't meet the criteria.
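A hypothetical filter of the kind Ajunwa describes can be sketched in a few lines of Python. The names, data, and gap threshold below are illustrative assumptions, not any vendor's actual system; the point is that the rule sees only the gap itself, never the reason for it:

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    max_employment_gap_months: int  # longest gap in the work history

def screen(applicants, max_gap_months=3):
    """Naive rule-based filter: reject anyone whose longest employment
    gap exceeds the threshold. The rule has no field for *why* the gap
    exists (child care, medical leave), so it cannot check itself
    against employment law or ethical norms."""
    return [a for a in applicants
            if a.max_employment_gap_months <= max_gap_months]

pool = [
    Applicant("A", 0),
    Applicant("B", 14),  # e.g., time off for child care -- filtered out
    Applicant("C", 2),
]
print([a.name for a in screen(pool)])  # -> ['A', 'C']
```

Applicant "B" is excluded automatically, and nothing in the code records or reviews the reason, which is exactly the blind spot the op-ed warns about.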

Dr. Ajunwa is not the only one sounding the alarm about employers' increasing reliance on AI and other tools that their creators purport to be objective. According to Cathy O'Neil, author of Weapons of Math Destruction, such algorithmic bias is common in hiring, especially in low-wage jobs, where massive retail companies rely on sophisticated AIs that consider aspects of your life you would not think have any bearing on employment, such as your credit score, medical and mental health histories, personality tests, and driving record.

In recent years, several lawsuits and investigations regarding AI discrimination have appeared, and several researchers in tech have started to develop methods to illuminate the hidden bias in machine learning and AI technologies. However, as Dr. Ajunwa notes, there are few concrete laws on the books that can protect applicants from algorithmic discrimination. Moreover, the Harvard Business Review cautioned that, unlike other forms of employment testing, many of these AI-based tools remain empirically untested, leaving the door open to ethical and legal problems.

It Pays to Listen to Your Employees

April 18, 2022
Disability Discrimination
A Kentucky jury’s recent finding underscores how important it is to listen to employees’ needs, especially when employees are sharing the mental health bases for their requests. Such open-minded attitudes and awareness of the consequences of disability discrimination usually lead to less strife and more equity in the long run.

Two Years In, NWLC Releases Sobering Study on Women’s Employment

April 7, 2022
While the disastrous recession that accompanied the first wave of global lockdowns has receded, women’s employment in the US remains in a dire place, according to a new study by the National Women’s Law Center.

Confirmation Hearings Descend into Farce as Nominee Ketanji Brown Jackson Remains Steadfast

April 1, 2022
With an unimpeachable public record, Ketanji Brown Jackson’s Senate confirmation hearings, predictably, veered into farce as Senate Republicans grandstanded for cable news, trotting out various electoral bogeymen, especially Critical Race Theory, and tried to smear Jackson by association.

Get In Touch

Knowing where to turn in legal matters can make a big difference. Contact our employment lawyers to determine if we can help you.