If You’re Over 40, AI Can Harm Your Job Chances – Mary T. O’Sullivan

By Mary T. O’Sullivan, MSOL, contributing writer on business and leadership

“The Protect Older Job Applicants Act (POJA) lets workers choose to pursue age discrimination claims in court rather than being forced into arbitration.” – AARP

Age discrimination in hiring has always been a problem, but artificial intelligence (AI) is intensifying it and reshaping how it affects older workers. The Protect Older Job Applicants Act (POJA) offers a critical legal update to address these challenges. By clarifying that disparate-impact protections under the Age Discrimination in Employment Act (ADEA) apply not just to employees but to applicants, POJA helps guard against algorithmic exclusion that disproportionately hits older workers. To understand why POJA is so urgent today, it is essential to examine how AI contributes to age bias and to look at a few concrete cases where older jobseekers have been shut out.

AI’s Role in Age Discrimination

AI hiring tools are typically trained on historical hiring data. If a company’s past hiring favored younger workers—intentionally or not—the AI tool learns that age bias. For example, résumé-screening algorithms will pattern-match toward the kinds of candidates who were hired before. Over time, this entrenches a cycle of exclusion, even without any explicit age variable. When I first began my teaching career, almost every teacher in my school was recently out of college. I doubt anyone employed as an educator there was over 40. Imagine if the system for selecting candidates had been based on the ages of those teachers. (Of course, there was no AI back then, but essentially, this is how AI “learns” to select or eliminate candidates.)

AI can deduce age from indirect signals, even when the applicant’s age is not specifically provided. Key “proxies” include graduation years, total years of work experience, employment gaps, or outdated technologies listed on a résumé. These “proxies” act as hidden red flags: older applicants may be unfairly penalized when their experience or education becomes a signal of “age.” I suggest to clients that they not include their college graduation year or note that they are proficient on Wang computers.

Many AI-powered applicant tracking systems (ATS) reject résumés before any human eye ever sees them. Because these decisions are opaque—and often made by proprietary software—it’s difficult for applicants to challenge or understand how or why they were rejected. Employers can claim that the algorithm made the decision, offering a convenient shield from accountability for discriminatory practices.

Discrimination can happen even before someone applies. Platforms like Facebook, LinkedIn, and Google enable employers to target job ads to specific demographic segments. In some cases, according to ProPublica, older workers are excluded entirely from even seeing job postings, effectively denying them the opportunity to apply. A federal lawsuit and shrewd investigative reporting found that Facebook’s ad tools have been used to narrow job ads’ reach in ways that are designed to exclude older adults.

AI systems sometimes evaluate “cultural fit” in ways that also disadvantage older workers. Whether via facial-recognition tools, video-interview analysis, or personality scoring, older candidates may be judged more harshly for traits that an algorithm misinterprets as a lack of “energy” or “adaptability,” a clearly discriminatory judgment based on appearance alone. However, there’s no way to avoid a Zoom or video interview, unless you’ve undergone cosmetic surgery.

Recent research also shows that generative AI—like large language models—can embed and amplify ageist stereotypes. In one Stanford study, when asked to generate résumés, ChatGPT tended to depict hypothetical older women as less experienced and younger than older men with the same background, contributing to gendered age bias: a shocking discovery for academics, though familiar to many women. Because AI is often treated as a “black box,” it becomes easier for companies to deflect blame: “the algorithm did it,” they can say. This weakens the ability of older applicants to challenge decisions, and it complicates oversight by regulatory bodies, when they are allowed to regulate.

Scholars, technologists, and policy advocates are pushing for stronger transparency, auditing, and validation of hiring algorithms. The Brookings Institution has argued that “meaningful standards” and independent audits are necessary to ensure fairness. Without legal guardrails, AI risks hiding blatant age bias under the guise of “efficiency.”

Encouragingly, recent research points to ways AI can be designed or “programmed” to reduce age bias. A recent paper on résumé screening shows how algorithms can be retrained to de-emphasize age “proxies” and emphasize transferable skills. Other studies suggest “fairness-aware” ranking algorithms (used in systems like LinkedIn Talent Search) can help maintain diverse and inclusive candidate pools while preserving candidate selection quality.

A Few Real-World Case Studies

Workday (AI hiring software)

In Mobley v. Workday, plaintiffs over 40 claimed that Workday’s algorithmic screening tool rejected their applications before any human eyes saw them. The court allowed the case to proceed. According to AARP, this raises novel legal issues around AI bias, Workday acting as a “hiring agent”, and discriminatory training data.

RTX / Raytheon Technologies Lawsuit

In 2024, the AARP Foundation sued RTX Corporation (formerly Raytheon) for allegedly posting job ads targeting recent graduates, which discouraged or excluded older applicants. The lawsuit claims these ads violated age-discrimination laws by effectively shutting older workers out of the recruiting funnel before they even saw the job posting. This case highlights how ageism can be baked into recruitment strategies—and how legal action is being used to push back, thanks to the efforts of AARP.

Facebook Ad Targeting Litigation

One of the most notable examples of algorithmic age exclusion emerged in lawsuits against employers using Facebook’s ad-targeting tools. Plaintiffs allege that Facebook’s system allowed companies to restrict job ads to narrow age ranges, effectively preventing older workers from seeing them. In a landmark 2019 decision (prior to the Trump Administration’s changes to the EEOC), regulators found “reasonable cause” that several companies used Facebook job ads to exclude older workers. These legal developments underscore how digital ad platforms can be weaponized to bypass anti-discrimination laws.

Ageist Language in Job Advertisements

Academic studies also document how age stereotypes in job postings discourage older applicants. A paper from the National Bureau of Economic Research found that job ads containing multiple ageist phrases led to a 12-point drop in applicants over age 40, and the average age of applicants fell by 2.5 years. And a Cambridge University-based experiment showed that phrases linked to negative age stereotypes are reliably detected by machine-learning tools—and are perceived by human respondents as biased. The phrases “digital native”, “youthful team”, and “seeking young, energetic candidates” are good examples of ageist language.
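To illustrate in the simplest possible terms how a screening tool might flag this kind of language, here is a minimal sketch. The phrase list is drawn from examples in this article; the matching logic is purely hypothetical and is not the method used in the Cambridge or NBER studies.

```python
# Hypothetical illustration: flag ageist phrases in a job posting.
# The phrase list comes from examples cited in this article; real
# research tools use trained machine-learning models, not a lookup.

AGEIST_PHRASES = [
    "digital native",
    "youthful team",
    "young, energetic",
    "recent graduate",
]

def flag_ageist_language(ad_text: str) -> list[str]:
    """Return the ageist phrases found in a job posting."""
    text = ad_text.lower()
    return [phrase for phrase in AGEIST_PHRASES if phrase in text]

ad = "Join our youthful team of digital natives!"
print(flag_ageist_language(ad))  # ['digital native', 'youthful team']
```

Even this toy version shows why the problem is tractable: the offending language is often literal and detectable, which is exactly what makes auditing job ads feasible.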

These technological trends massively amplify the risk we older workers face—and make the need for clearer legal definitions more urgent. POJA would empower older applicants to challenge not just overt discrimination, but hidden, systemic algorithmic bias. By clarifying that disparate-impact liability extends to applicant-screening tools and ad-targeting systems, POJA would:

  • Force employers to justify automated tools by demonstrating business necessity and minimal disparate impact.
  • Encourage audits and transparency, making age-related screening criteria easier to detect and challenge.
  • Align legal protections with modern recruiting, closing the gap between how hiring is done today and how the law treats age bias.

Without such legal safeguards, age discrimination risks becoming more invisible, more automated, and more entrenched.

Sadly, age discrimination in hiring is not just a relic of the past. In our era of AI and algorithmic recruiting, older job seekers face new and often hidden obstacles. From ad-targeting tools that exclude seniors from seeing job posts, to résumé-screening systems that penalize age “proxies”, to generative AI perpetuating stereotypes—technology can reinforce and intensify age bias.

The Protect Older Job Applicants Act (POJA) is a much-needed legislative response. By clarifying that disparate-impact protections apply at the applicant stage, POJA aligns the law with current technological reality. Real-world cases—like the RTX lawsuit and the Facebook ad-targeting litigation—show the real harm done by unregulated algorithmic hiring. Legal reform, combined with robust compliance auditing and AI design standards, offers a fix. POJA doesn’t just protect rights—it makes age equity in the AI era possible.

“age discrimination is a huge problem for older workers … Forced arbitration … strips hard-working Americans … of a jury trial … making it more difficult for victims … to get justice.” – Sen. Kirsten Gillibrand (D – NY)

___

Mary T. O’Sullivan, Master of Science, Organizational Leadership; International Coaching Federation Professional Certified Coach; Society of Human Resource Management Senior Certified Professional; Graduate Certificate in Executive and Professional Career Coaching, University of Texas at Dallas.

Member, Beta Gamma Sigma, the International Honor Society.

Advanced Studies in Education from Montclair University, SUNY Oswego and Syracuse University.

Mary is also a certified Six Sigma Specialist, Contract Specialist, and IPT Leader, and holds a Certificate in Essentials of Human Resources.

Mary T. O’Sullivan, MSOL, ICF-PCC, SHRM-SCP, BCC
Hogan Assessment Practitioner
EQi2.0-EQ360 Practitioner
Appreciative Inquiry Practitioner
Six Sigma Specialist, Certified IPT Leader, Certified Contracts Manager
Helping good leaders get even better through positive behavior change.
401-742-1965
