December 23, 2019

Artificial Intelligence May Make HR's Job Easier, but Employment Discrimination Still Abounds

Proponents of artificial intelligence (AI) and machine learning have promised these tools will usher in a new age of digitized work. AI tools can now be used to scan thousands of documents and reduce repetitive work tasks, algorithms can predict your shopping habits and recommend products before you even think of them, and machine learning software can be trained to identify cancer from MRIs. Often, the creators and designers of these tools tout AI's supposed objectivity. However, what technologists are less interested in publicizing is how AI can be used to reinforce discriminatory policing, violate civil rights, enable employment discrimination, and entrench class, gender, and race disparities.

In a recent New York Times op-ed, Dr. Ifeoma Ajunwa of Cornell's Industrial and Labor Relations School highlighted hiring companies' and HR departments' increased use of these tools. Ajunwa points out that employers are not merely utilizing these technologies to screen candidates, but are actively barring candidates from being considered for employment. As an example, she posits a company that relies on a hiring algorithm trained to seek candidates without gaps in their employment. Such a stipulation, Ajunwa notes, would automatically screen out women who have taken time off for child care, as well as applicants who have dealt with long-term medical issues. And because AI relies on specific rules created by humans, there is no way for the technology to check itself against employment law or ethical norms about employment discrimination. It would simply filter out applicants who don't meet the criteria.

Dr. Ajunwa is not the only one sounding the alarm about employers' increasing reliance on AI and other tools that their creators purport to be objective. According to Cathy O'Neil, author of Weapons of Math Destruction, such algorithmic bias is common in hiring, especially for low-wage jobs, where massive retail companies rely on sophisticated AI systems that weigh aspects of your life you would not think have any bearing on employment, such as your credit score, medical and mental health histories, personality tests, and driving record.

In recent years, several lawsuits and investigations regarding AI discrimination have emerged, and a number of researchers in tech have started to develop methods to illuminate the hidden bias in machine learning and AI technologies. However, as Dr. Ajunwa notes, there are few concrete laws on the books that can protect applicants from algorithmic discrimination. Moreover, the Harvard Business Review has cautioned that, unlike other forms of employment testing, many of these AI-based tools remain empirically untested, leaving the door open to ethical and legal problems.

The New Parenting

August 24, 2020
Paid Family Leave
Pregnancy Discrimination
This week, we’re going to spotlight one of the hot button issues at the intersection of employment and pandemic: how parents are going to cope in a fall without schools.

This Week’s FFCRA Complaints: The Wrongful Terminations Continue

August 21, 2020
Leave
Disability Discrimination
Since we started this weekly blog post in May, we've read and summarized over 50 complaints filed under the new leave law. As we’ve pointed out, many of these complaints follow almost a template: workers are terminated either for taking legally allowed precautions to protect fellow workers from potential infection or for having legitimate reasons to take leave, often to care for a family member or child.

In an Uncommon Move, McDonald’s Sues Former CEO

August 20, 2020
Sexual Harassment
It’s not every day that a blue chip company decides to sue a former executive, let alone its erstwhile CEO, but this is exactly what McDonald’s did in suing Steve Easterbrook, who had been fired last year for inappropriate conduct — specifically, sexting with an employee.
