Judge weighs arguments in Workday AI bias case


A judge in San Francisco is considering whether existing laws apply to AI bias claims in hiring, amid allegations that Workday's software discriminates against certain applicants.


By Patrick Thibodeau, Editor at Large

Published: 03 Jun 2024

A federal judge in San Francisco is considering whether existing anti-discrimination laws apply to AI systems or if new legislation is needed to address concerns of AI bias in hiring practices. The case, Derek L. Mobley vs. Workday, centers on allegations that the HR software provided by Workday discriminates against job applicants who are Black, disabled or older.

During a recent hearing to consider Workday's motion to dismiss the case, U.S. District Judge Rita Lin probed both parties on the responsibilities and liabilities of HR software vendors such as Workday. "What troubles me about Workday's interpretation is the concept that the employer would not be liable for intentional discrimination unless they knew that the software was doing something that was intentionally discriminatory," Lin said, according to a court transcript of the hearing.

Mobley, the plaintiff, argues that Workday's algorithmic tools have systematically screened out his applications due to his race, disability and age. Mobley's attorney, Lee Winston of Winston Cooks LLC in Birmingham, Ala., contends that this violates Title VII of the Civil Rights Act.


"There's no software vendor exemption written into the statute," Winston argued. "If your company created the product, you're responsible for it. It's that straightforward."

Workday's defense, led by Erin Connell of Orrick, Herrington & Sutcliffe LLP in Austin, counters that Title VII covers only employers, not the vendors that supply the tools. Connell pointed to recent guidance issued by the Office of Federal Contract Compliance Programs that puts responsibility on federal contractors to ensure the tools they use aren't discriminatory.

"Employers are responsible for the third-party tools that they purchase if they end up being discriminatory, even if the employer didn't know," Connell told the judge.

This AI bias case has attracted the interest of regulators. The U.S. Equal Employment Opportunity Commission (EEOC) recently filed an amicus brief urging the court not to dismiss Mobley's claims, calling them plausible. Workday makes assessments and tests that employers use in hiring.

Two labor and employment legal experts not involved in the case reviewed copies of the hearing transcript provided by TechTarget Editorial and gave their takes.

The top question

The threshold question that the court is considering, said Dean Rocco, co-chair of the employment and labor practice at law firm Wilson Elser, is whether technology providers -- such as makers of HR systems used in recruiting -- "can be viewed as employers or otherwise covered by applicable employment laws."

In the hearing, Rocco said the judge pressed the parties to explain why an AI tool provider shouldn't be held liable under the statutes, even if it intentionally set up a system that it knew would discriminate against candidates.

Workday is arguing that "it simply provides a technology platform that the end-user employer chooses to use, and the liability should rest with that end user," Rocco said. Conversely, the plaintiffs contend that the AI-based recruiting tool "is engaging in an administrative gatekeeping function that is traditionally performed by employers," he said.

Michael Elkins, partner and founder at MLE Law in Fort Lauderdale, Fla., said the judge seemed genuinely interested in the question, "What do we do with these AI companies?"

The bulk of the judge's questions about the merits of the lawsuit went to Workday's lawyer, Elkins said. "Now that could mean 'I don't buy it,' or 'I totally buy it and just want you to back it up on the record.'"

Elkins said the judge seemed concerned about Workday's argument that AI vendors would not be covered under current anti-discrimination laws and that it would take action by Congress to change that. Such a reading could create a legal "loophole" in which neither the vendor nor the employer is responsible for discriminatory outcomes because of the uncertainty of the law.

In court, Connell said that "the solution that plaintiff and the EEOC are seeking here truly is a legislative solution. Congress could amend Title VII. It could introduce new laws."

Patrick Thibodeau is an editor at large for TechTarget Editorial who covers HCM and ERP technologies. He's worked for more than two decades as an enterprise IT reporter.

