Is Your Résumé Doomed? Shocking Lawsuit Reveals How AI Could Be Sabotaging Your Career
In an era where technology dictates much of our professional landscape, the process of landing a job has transformed dramatically. Gone are the days of handwritten cover letters and in-person interviews as the primary gatekeepers to employment. Today, applicant tracking systems (ATS), powered by complex algorithms and artificial intelligence (AI), have become the first hurdle for millions of job seekers. These systems promise efficiency, streamlining the hiring process for employers by sifting through thousands of résumés in seconds. But what happens when these digital gatekeepers inadvertently—or systematically—exclude qualified candidates based on factors like age, race, or disability? A landmark lawsuit against Workday, a leading provider of HR software, is shining a spotlight on this very question, with one man, Derek Mobley, at the forefront of the battle to hold AI accountable.
The Rise of Algorithmic Hiring
The modern job market is a digital battlefield, with millions of applications submitted online each year. According to a 2023 report by the Society for Human Resource Management (SHRM), over 95% of large organizations in the U.S. rely on ATS to manage their recruitment processes. These systems, developed by companies like Workday, Taleo, and iCIMS, use algorithms to parse résumés, score candidates, and filter out those deemed unfit for a role. The appeal is clear: in 2024, the average corporate job posting received 250 applications, and ATS can reduce the time spent reviewing résumés by up to 75%, per a study by Jobscan.
Yet, this efficiency comes at a cost. A 2022 study from Harvard Business School found that ATS often reject up to 75% of qualified candidates due to overly rigid criteria, such as missing specific keywords or having employment gaps. These rejections often occur without human oversight, leaving applicants in the dark about why they were passed over. For many, the experience is not just frustrating but dehumanizing, as rejection emails arrive within hours—or even minutes—of submission, often at odd hours like 1:50 a.m., as Derek Mobley experienced.
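To make the mechanism concrete: many ATS reduce a résumé to a set of keywords, compare it against employer-configured criteria, and apply knockout rules that reject a candidate outright. The Python sketch below is purely illustrative; the keywords, the threshold, and the knockout phrase are all assumptions, not Workday's or any other vendor's actual logic.

```python
# Illustrative keyword screen, loosely modeled on how many ATS filter résumés.
# Every keyword, threshold, and knockout phrase here is a hypothetical example,
# not Workday's or any other vendor's actual logic.

REQUIRED_KEYWORDS = {"python", "sql", "network administration"}  # assumed employer criteria
KNOCKOUT_PHRASES = ["not authorized to work"]                    # assumed knockout answer

def screen_resume(resume_text: str, min_hits: int = 2) -> str:
    text = resume_text.lower()
    # Knockout rules reject immediately, with no human review.
    if any(phrase in text for phrase in KNOCKOUT_PHRASES):
        return "reject"
    # Otherwise, count literal keyword matches against the employer's list.
    hits = sum(1 for kw in REQUIRED_KEYWORDS if kw in text)
    return "advance" if hits >= min_hits else "reject"

# Nine years of relevant experience, rejected for not echoing the exact phrasing.
print(screen_resume("Finance degree, nine years in IT support and network systems administration"))
# -> reject
```

Because matching is typically literal, a candidate who writes "network systems administration" instead of the configured phrase "network administration" scores zero on that criterion, one way rigid keyword screens end up rejecting qualified people.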
Derek Mobley’s Crusade Against the Algorithm
Derek Mobley, a 50-year-old IT professional from North Carolina, found himself caught in this algorithmic trap. Beginning in 2017 and continuing over the following years, Mobley applied to more than 100 jobs, many through Workday's platform, only to face consistent rejection or silence. A Black graduate of Morehouse College with a finance degree and an associate degree in network systems administration, Mobley also lives with anxiety and depression. Despite his qualifications and nearly a decade of experience in finance, IT, and customer service, he was repeatedly denied even an interview.
The sheer consistency of his rejections—100% failure rate across over 100 applications—defied statistical probability, Mobley argued. “In statistics, there’s a bell curve. It didn’t make sense that I was always on the losing end,” he told reporters. Suspicious that Workday’s algorithm was flagging his profile based on protected characteristics like his age, race, or mental health, Mobley filed a lawsuit against the company in 2023, alleging discrimination. His case, Mobley v. Workday, Inc., has since become a lightning rod for debates about AI bias in hiring.
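Mobley's intuition can be checked with simple arithmetic. If each application independently carried even a modest chance of producing an interview, a perfect streak of rejections across 100 applications would be extraordinarily unlikely. The rates in the sketch below are assumptions chosen for illustration, not data from the case.

```python
# Back-of-the-envelope check on the "bell curve" argument.
# Assumption (not from the lawsuit): each application independently has the
# same baseline probability of yielding at least an interview.

applications = 100  # roughly the number of applications Mobley filed

for interview_rate in (0.02, 0.05, 0.10):  # assumed per-application interview rates
    p_zero_interviews = (1 - interview_rate) ** applications
    print(f"interview rate {interview_rate:.0%}: "
          f"P(0 interviews in {applications} applications) = {p_zero_interviews:.2e}")

# Even at a 5% per-application interview rate, the probability of zero
# interviews across 100 independent applications is below 1%.
```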
The Workday Lawsuit: A Legal Turning Point
In May 2025, a federal judge in California, Rita Lin, ruled that Mobley’s age-discrimination claim could proceed as a collective action, potentially representing millions of job seekers over 40. This decision marks a significant milestone, as it challenges the long-standing opacity of ATS and their role in employment decisions. While the court dismissed claims of intentional discrimination, it allowed Mobley to pursue a “disparate impact” theory—meaning that Workday’s technology may unintentionally disadvantage certain groups, even if bias was not the explicit intent.
The lawsuit alleges that Workday’s algorithms, which score and rank applicants based on résumé keywords and employer-set criteria, may perpetuate biases embedded in the data they’re trained on. For instance, if a company’s existing workforce is predominantly young, white, or male, the algorithm might prioritize candidates who resemble that demographic, effectively sidelining others. Mobley’s legal team argues that résumé details, such as his graduation from Morehouse College, a historically Black institution, or graduation dates that reveal his approximate age, could have served as proxies for race and age, leading to automatic rejections.
The case also raises concerns about personality tests, which some Workday clients use as part of their screening process. Mobley alleges that these tests may disproportionately penalize candidates with mental health conditions like anxiety or depression, as certain responses could be interpreted as less desirable by the algorithm. A 2024 report by the Equal Employment Opportunity Commission (EEOC) noted that such tests can violate the Americans with Disabilities Act (ADA) if they screen out candidates based on protected characteristics without job-related justification.
The Black Box of AI Hiring
One of the central issues in Mobley’s case is the lack of transparency in how ATS operate. For job seekers, these systems are a “black box”—an opaque process where the criteria for rejection or advancement are rarely disclosed. According to a 2023 study by the Pew Research Center, 60% of job seekers believe they’ve been unfairly screened out by ATS, yet only 10% of employers provide feedback on why applications were rejected. This lack of clarity fuels distrust, as candidates like Mobley are left to speculate whether their age, race, or other factors played a role.
Experts like Kathleen Creel, a computer scientist at Northeastern University, point out that ATS can introduce bias in multiple ways. Mechanical errors, such as misinterpreting job titles or undervaluing certain qualifications, can disadvantage candidates. More insidiously, algorithms trained on historical hiring data may replicate past discriminatory practices. For example, a 2021 study by Princeton University found that AI hiring tools trained on résumés from male-dominated industries often downgraded candidates with female-associated traits, such as mentioning “softball” instead of “baseball” on a résumé.
The Workday case could force companies to lift the veil on these algorithms. If Mobley’s team gains access to Workday’s applicant data during the discovery phase, they could conduct statistical analyses to prove disparate impact—showing, for instance, that older or Black applicants were rejected at significantly higher rates than younger or white ones. Such evidence could set a precedent for holding both software vendors and employers liable under federal anti-discrimination laws like Title VII, the ADA, and the Age Discrimination in Employment Act (ADEA).
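Such an analysis is mechanically simple once applicant-level data exist. One widely used first-pass screen is the EEOC's "four-fifths rule": if a group's selection rate falls below 80% of the highest group's rate, the disparity is treated as evidence of potential adverse impact. The applicant counts in the sketch below are invented for illustration only.

```python
# Illustrative disparate-impact check using the EEOC "four-fifths" guideline.
# The applicant counts below are invented; a real analysis would use the
# applicant-level records produced in discovery.

groups = {
    # group: (applicants screened, applicants advanced past the ATS)
    "under 40":    (10_000, 1_200),
    "40 and over": (10_000, 700),
}

rates = {g: advanced / screened for g, (screened, advanced) in groups.items()}
best_rate = max(rates.values())

for group, rate in rates.items():
    impact_ratio = rate / best_rate
    flag = "potential disparate impact" if impact_ratio < 0.8 else "ok"
    print(f"{group:>12}: selection rate {rate:.1%}, impact ratio {impact_ratio:.2f} -> {flag}")
```

A full disparate-impact showing would pair a ratio like this with statistical significance testing on the real applicant data obtained through discovery.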
The Broader Impact on Employers and Job Seekers
The implications of Mobley v. Workday extend far beyond one company. If the lawsuit succeeds, it could reshape how organizations use AI in hiring. A 2024 survey by Gartner found that 68% of HR leaders are concerned about potential bias in AI tools, yet only 22% conduct regular audits to detect it. Regulations like New York City’s Local Law 144, which took effect in 2023, already require employers to audit automated hiring tools for race and gender bias annually, but Mobley’s case could push for broader mandates covering age and disability as well.
For job seekers, the case highlights the challenges of navigating a system where algorithms hold disproportionate power. A 2025 report by the U.S. Department of Labor estimated that 40% of unemployed workers over 40 face “significant barriers” in securing jobs due to automated screening tools. Black and Hispanic applicants also face higher rejection rates, with a 2023 study by the National Bureau of Economic Research finding that ATS rejected minority candidates at a 20% higher rate than white candidates with similar qualifications.
Mobley’s experience is not unique. Jill Hughes, another plaintiff in the lawsuit, reported receiving rejections for hundreds of roles, often within hours of applying, with some emails falsely claiming she didn’t meet minimum requirements. These rapid, automated rejections—sometimes sent at odd hours—suggest a lack of human oversight, a point Mobley’s legal team emphasizes. The EEOC, in an amicus brief filed in April 2024, argued that Workday’s role as an “agent” of employers makes it liable for discrimination, as its software performs functions traditionally handled by human recruiters.
Workday’s Defense and Industry Pushback
Workday maintains that its software is not discriminatory. In court filings, the company argues that its ATS matches résumé keywords to employer-specified criteria, scoring candidates as “strong,” “good,” “fair,” or “low” matches. Employers, not Workday, make final hiring decisions, and any “knockout questions” (e.g., about work authorization) are set by clients. “There’s no evidence that our technology harms protected groups,” a Workday spokesperson stated.
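Translated into code, the tiering Workday describes in its filings might look something like the sketch below; only the four tier labels come from the court filings, while the keyword-ratio cutoffs are assumptions for illustration.

```python
# Sketch of the match tiers described in Workday's court filings
# ("strong", "good", "fair", "low"). Only the labels come from the filings;
# the keyword-ratio cutoffs are assumptions for illustration.

def match_tier(keywords_matched: int, keywords_required: int) -> str:
    ratio = keywords_matched / keywords_required if keywords_required else 0.0
    if ratio >= 0.8:
        return "strong"
    if ratio >= 0.6:
        return "good"
    if ratio >= 0.4:
        return "fair"
    return "low"

print(match_tier(keywords_matched=3, keywords_required=8))  # -> low
```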
The company has also taken steps to address AI ethics. Since 2022, Workday has maintained a team dedicated to ensuring its products meet ethical AI standards, led by Kelly Trindel. At a 2025 NYU Law School conference, Trindel emphasized that Workday’s clients demand transparency about how its tools are developed. However, the company has resisted some regulatory efforts, such as New York City’s bias audit law, arguing for less stringent rules.
Industry experts are divided. Some, like those at HR Tech Analyst, argue that ATS are customizable by employers, meaning any bias stems from client settings, not the software itself. Others, like Ifeoma Ajunwa of Emory University, contend that software vendors share responsibility, as their algorithms can amplify existing biases if not carefully designed. A 2024 report by the Algorithmic Justice League found that 80% of ATS vendors fail to conduct pre-release bias testing, leaving employers vulnerable to legal risks.
The Human Toll of Algorithmic Rejection
For Mobley, the stakes are personal as well as systemic. The years of rejections took a toll on his mental health, finances, and retirement savings. To stay afloat, he drove for Uber and took short-term contract jobs, a common reality for many job seekers. A 2024 survey by LinkedIn found that 55% of unemployed workers reported mental health challenges due to prolonged job searches, with 30% citing ATS rejections as a primary stressor.
Mobley’s eventual success—landing a job at Allstate through a human recruiter in 2019, followed by two promotions—underscores the value of human intervention. His role as a catastrophe controller, managing insurance claims, demonstrates his qualifications, making his earlier rejections all the more perplexing. “This isn’t about revenge,” Mobley said. “It’s about fairness and getting a chance to compete.”
What Comes Next: Regulation and Reform
As Mobley v. Workday moves into the discovery phase, the case could set a precedent for how AI is regulated in hiring. Legal experts predict that a finding of liability could lead to a wave of lawsuits against both software vendors and employers. The EEOC’s 2023 guidance on AI in hiring emphasizes that employers must ensure their tools comply with anti-discrimination laws, and Mobley’s case could extend this obligation to vendors like Workday.
To mitigate bias, experts recommend several steps:
Regular Bias Audits: Employers and vendors should conduct annual audits to identify and address disparities in hiring outcomes.
Human Oversight: Incorporating human review at key stages can prevent over-reliance on algorithms (see the routing sketch after this list).
Transparency: Companies should disclose how ATS score and rank candidates, giving applicants a chance to understand and appeal decisions.
Diverse Training Data: Algorithms should be trained on diverse datasets to avoid replicating historical biases.
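The human-oversight recommendation is straightforward to prototype: instead of letting the algorithm reject borderline candidates outright, route them to a reviewer and log the rest for periodic audit. Everything in the sketch below, the score scale, the thresholds, and the routing labels, is an assumption for illustration, not any vendor's actual configuration.

```python
# Minimal human-in-the-loop routing sketch. The score scale, thresholds,
# and routing labels are all assumptions for illustration.

def route_candidate(match_score: float) -> str:
    """Route a candidate based on an algorithmic match score in [0, 1]."""
    if match_score >= 0.75:
        return "advance to recruiter"
    if match_score >= 0.40:
        return "human review"                   # borderline scores are never auto-rejected
    return "reject, logged for periodic audit"  # low scores feed the bias-audit sample

for score in (0.82, 0.55, 0.20):
    print(f"score {score:.2f} -> {route_candidate(score)}")
```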
A 2025 study by the Brookings Institution suggests that implementing these measures could reduce discriminatory outcomes by up to 40%. However, adoption remains slow, with only 15% of companies complying with existing audit requirements, per a 2024 SHRM survey.
Derek Mobley’s lawsuit is more than a personal quest—it’s a challenge to the unchecked power of algorithms in shaping who gets a seat at the table. As AI continues to dominate hiring, the need for transparency and accountability grows urgent. The outcome of Mobley v. Workday could redefine the rules for digital recruitment, ensuring that technology serves as a bridge to opportunity rather than a barrier. For now, Mobley and millions of job seekers await answers, hoping to crack open the black box that has kept them on the sidelines for too long.