For years, job seekers have described the same experience. They apply for jobs through Workday-powered portals. They wait. Then, they get rejected—sometimes within seconds. No human contact. No feedback. Just silence.
Now, a federal court is demanding answers.
In Mobley v. Workday, a class-action lawsuit brought by a man who says he was rejected more than 80 times by companies using Workday’s hiring system, a judge has ordered the company to disclose which employers enabled its AI-powered resume filtering tools. At the center of that system is a scoring engine called HiredScore—software designed to evaluate candidates algorithmically before a recruiter ever looks at their resume.
Workday is one of the most widely used enterprise software platforms in the United States. It handles payroll, internal HR, and talent acquisition for more than 10,000 organizations, including Amazon, Deloitte, Pfizer, Target, Salesforce, and General Motors. Even the U.S. Department of Veterans Affairs uses Workday’s infrastructure to modernize its hiring and resource planning systems. While job seekers may assume they’re submitting an application directly to an employer, in most cases they’re handing it over to Workday. The form may carry the company’s logo, but the backend runs in Workday’s cloud. Once inside, the application is parsed, scored, and routed, often without human involvement.
Many of these employers have also adopted HiredScore, the resume ranking tool now at the center of the lawsuit. Before it was acquired by Workday in 2024, HiredScore already counted more than 40 Fortune 100 companies as clients. Public case studies and marketing materials show it was in active use at Johnson & Johnson, BASF, and Allegis Global Solutions. According to the filings, some companies enabled HiredScore’s automated filtering features—meaning that an applicant could be rejected before any human ever saw their name.
If an employer enables HiredScore within Workday, the resume is scored against the job description and a database of other profiles. The system examines work history, keyword alignment, title progression, and inferred skill sets. If the match score falls below a certain threshold, the application may be filtered out entirely. The result is that some applicants receive near-instant rejections. Others never hear anything at all.
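Workday has not disclosed how its scoring actually works, but the mechanics described above are straightforward to sketch. The Python below is a minimal illustration of threshold-based screening, assuming a naive keyword-overlap score; the function names, the scoring method, and the 0.5 cutoff are hypothetical, invented for demonstration rather than taken from HiredScore.

```python
# Illustrative sketch only: the keyword-overlap scoring and the
# 0.5 cutoff are assumptions, not Workday's or HiredScore's logic.

def match_score(resume_text: str, job_description: str) -> float:
    """Fraction of job-description terms that also appear in the resume."""
    resume_terms = set(resume_text.lower().split())
    job_terms = set(job_description.lower().split())
    if not job_terms:
        return 0.0
    return len(job_terms & resume_terms) / len(job_terms)

def screen(resume_text: str, job_description: str,
           threshold: float = 0.5) -> str:
    """Applications scoring below the threshold are filtered out
    before any recruiter sees them."""
    if match_score(resume_text, job_description) >= threshold:
        return "advance to recruiter"
    return "auto-reject"

job = "senior accountant payroll reconciliation GAAP reporting"
resume = "experienced accountant skilled in payroll and GAAP"
print(screen(resume, job))  # 3 of 6 job terms match: 0.5, barely advances
```

A real system would use far richer features, but the decisive step is the same: a single number compared against a cutoff, with rejection as the default on the wrong side of it.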
The plaintiff, Derek Mobley, is over 40 and lives with anxiety and depression. His legal team alleges that Workday’s systems disproportionately rejected older applicants and people with nontraditional employment histories. The court allowed the case to proceed under the Age Discrimination in Employment Act, preliminarily certifying a nationwide collective of job seekers aged 40 and over who applied through Workday from September 2020 onward.
“It feels like screaming into the void,” one Redditor posted. “No feedback. No human. Just a wall.”
HiredScore was founded in 2012 and marketed itself as a corrective to unconscious bias in hiring. Its founder, Athena Karp, told The Australian, “A lot of early hiring tech simply reinforced the preferences of hiring managers. We knew AI could do better if we built it to serve fairness and compliance first.” She described the platform as a way to “augment—not replace—human decision-making,” and claimed it was designed to “ignore details like name, zip code, or school that may reinforce bias.”
Karp has also said publicly that a core goal of HiredScore was to ensure applicants receive feedback rooted in skills, not arbitrary factors like name or education.
Workday acquired HiredScore in 2024 and has since folded its features into its broader hiring suite. But legal filings show that the AI was already active on the platform—and quietly screening applicants—well before the acquisition.
“You can do everything right and still not get the job,” one frustrated job seeker wrote. “It doesn’t mean you’re not qualified. It means you never made it past the algorithm.”
HiredScore’s public messaging leaned heavily on that fairness-first framing. The lawsuit now questions whether those ideals translated into real-world outcomes for applicants.
Across Reddit, LinkedIn, and job forums, applicants describe the same pattern: silence, auto-rejections, vanishing listings.
“I got a rejection email within three minutes of submitting. There’s no way it was reviewed by a person.”
“They’re filtering by buzzwords, not people. You can be qualified and still lose to someone with a prettier resume layout.”
Some hiring managers have even voiced their frustrations. One wrote, “If the system doesn’t rank you high enough, I may never even see your name. That’s not how hiring should work.”
Robert Dunning, a senior operations leader who has written about flaws in the hiring pipeline, said, “Qualified people are being overlooked, automated out, or passed up without a real reason. It’s not personal. It’s systemic. And it’s frustrating.”
The court rejected Workday’s argument that HiredScore operated independently or that responsibility rested solely with the employers who configured it. In a ruling issued July 31, the judge ordered Workday to produce, by August 20, a list of every company that used HiredScore features, including any that sorted, ranked, or auto-rejected candidates.
Amanda Goodall, a career strategist who has tracked hiring AI for years, wrote, “Workday tried to say they didn’t even own HiredScore when the discrimination lawsuit started. But the judge said nope—if your platform enabled the tech, you’re in.”
Another commenter put it bluntly: “This should be huge. Job sites and major corporations engaged in massive discrimination. Age? Just the tip of the iceberg.”
Although HiredScore has positioned itself as an ethical AI platform, the lawsuit challenges whether those safeguards worked in practice. Critics argue that even AI systems designed to exclude demographic indicators can still replicate structural bias by using proxies like job gaps, outdated titles, or overlapping experience that often correlate with age. The court has not ruled on whether discrimination occurred—but it has ruled that the plaintiffs have a legitimate claim worth testing.
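That proxy effect is easy to demonstrate with synthetic data. In the sketch below, every number is invented and the filter is deliberately crude: it never sees age, only whether a resume shows an employment gap. Because the simulated gap rate rises with age, the supposedly age-blind rule still rejects applicants 40 and over at roughly twice the rate of younger ones.

```python
import random

random.seed(0)  # reproducible illustration

# Synthetic applicants: age is generated but NEVER shown to the filter.
# Assumption (invented for illustration): older applicants are more
# likely to have a multi-year employment gap on their resume.
applicants = []
for _ in range(10_000):
    age = random.randint(22, 65)
    gap_probability = 0.15 if age < 40 else 0.35
    applicants.append({"age": age,
                       "has_gap": random.random() < gap_probability})

# An "age-blind" rule that rejects anyone with an employment gap.
def rejected(applicant: dict) -> bool:
    return applicant["has_gap"]

def rejection_rate(group: list) -> float:
    return sum(rejected(a) for a in group) / len(group)

under_40 = [a for a in applicants if a["age"] < 40]
over_40 = [a for a in applicants if a["age"] >= 40]

print(f"Rejection rate, under 40: {rejection_rate(under_40):.1%}")
print(f"Rejection rate, 40+:      {rejection_rate(over_40):.1%}")
# The filter never saw age, yet the 40+ group is rejected at roughly
# twice the rate: disparate impact through a proxy variable.
```

None of this shows what Workday’s models actually do; it shows why stripping out demographic fields is not, by itself, a safeguard.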
Workday has maintained that it functions as a vendor, not a hiring agent. But the court found that the company may have acted as an agent of its clients by administering the first pass of applicant screening, meaning it could be held to the same legal standard as the employers themselves.
This legal battle coincides with a broader shift in how regulators are approaching algorithmic hiring. In New York City, Local Law 144 now requires annual bias audits for AI screening tools, notice to candidates when automation is used, and public disclosure of results. Colorado has passed a statewide AI law set to take effect in 2026, with similar requirements for explainability and auditability. California and Illinois are expected to follow.
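The audits these laws require turn on a simple statistic. Under Local Law 144, the core number is the impact ratio: a group’s selection rate divided by the selection rate of the most-selected group, where 1.0 means parity. The arithmetic below uses invented counts, and it applies the ratio to age bands to match this lawsuit’s framing; the law itself mandates the calculation for sex and race/ethnicity categories.

```python
# Bias-audit arithmetic in the style of NYC Local Law 144.
# All counts are invented for illustration.

advanced = {"under_40": 1_200, "40_and_over": 400}
screened = {"under_40": 8_000, "40_and_over": 6_000}

# Selection rate = candidates advanced / candidates screened.
rates = {group: advanced[group] / screened[group] for group in screened}

# Impact ratio = a group's selection rate divided by the
# highest group's selection rate (1.0 means parity).
highest = max(rates.values())
for group, rate in rates.items():
    print(f"{group}: selection rate {rate:.1%}, "
          f"impact ratio {rate / highest:.2f}")
```

With these made-up numbers, the 40-and-over group’s impact ratio comes out to 0.44, the kind of figure an audit would publish and a plaintiff would cite.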
One HR executive reacting to the court order wrote, “All these companies talk about responsible AI. But you ask them to show you how it works, and suddenly it’s proprietary.”
Workday’s systems have processed over a billion applications since 2020. The lawsuit doesn’t allege intent, but it argues that an entire class of applicants may have been systemically excluded—without oversight, explanation, or appeal.
For years, millions assumed they were being overlooked by bad luck or poor timing.
The truth may be more mechanical than personal.
Not ghosted. Just filtered out.
And for the first time, the system that did it may be required to explain itself.