
Algorithmic Hiring and Title VII: Why Existing Doctrine Still Works

  • Writer: Dylan Arsenault
  • Apr 17
  • 6 min read

As artificial intelligence expands into employment decision-making, employers increasingly rely on automated tools to make hiring faster and more efficient. Algorithmic screening programs compile information from applications, compare candidates to job descriptions, and predict which applicants are most likely to succeed in a role.[1] These tools promise speed and consistency, but they also raise substantial concerns about discrimination and systemic bias in hiring decisions.[2]


These concerns are not merely theoretical. When hiring decisions are based on an applicant’s similarity to historically successful employees, the process risks replicating patterns of exclusion that employment discrimination law seeks to prevent.[3] An applicant may meet all formal qualifications, yet their application may never reach a hiring manager because an automated system ranked other candidates as more likely to succeed. Where that ranking is influenced by characteristics correlated with protected traits, the use of this technology implicates Title VII.[4] Despite calls for new regulatory frameworks, existing Title VII doctrine is capable of addressing algorithmic hiring discrimination.


Title VII liability can arise under either disparate treatment theory or disparate impact theory.[5] Disparate treatment requires proof of discriminatory intent, while disparate impact focuses on facially neutral practices that produce unequal outcomes.[6] Because most employers are unlikely to deploy intentionally discriminatory tools, disparate impact theory will often provide the most relevant framework for algorithm-based discrimination claims.


Under disparate impact doctrine, a plaintiff must (1) prove “a significant disparate impact on a protected class or group; (2) identify the specific employment practices or selection criteria at issue; and (3) show a causal relationship between the challenged practices or criteria and the disparate impact.”[7] Once that showing is made, the burden shifts to the employer to demonstrate that the practice is job related and consistent with business necessity.[8] Critics argue that hiring algorithms will easily satisfy this defense because they identify traits correlated with job performance.[9] However, disparate impact doctrine already accounts for this concern. Even where an employer establishes business necessity, a plaintiff may still prevail by showing that an alternative employment practice would serve the same legitimate interest with less discriminatory effect.[10]


This concept is especially important in applying disparate impact theory to algorithmic hiring, where competing technologies perform similar evaluative functions. Because plaintiffs can prevail notwithstanding a legitimate business necessity, Title VII already provides a mechanism for intervention where one tool produces greater disparate impact than another available alternative.[11]


Courts are also capable of adapting existing doctrine to evidentiary challenges posed by algorithmic systems. Many hiring algorithms function as “black boxes,” meaning their internal decision-making processes are not readily observable.[12] In such cases, plaintiffs may be unable to isolate the precise criteria responsible for discriminatory outcomes; however, Title VII already permits courts to treat an entire decision-making process as the relevant employment practice where components cannot be separated.[13]


Applying this principle allows plaintiffs to rely on statistical comparisons between applicant pools and algorithm outputs to establish disparate impact.[14] The primary difficulty in algorithmic hiring litigation is therefore not doctrinal inadequacy but procedural clarity. In practice, successful plaintiffs will almost always need to demonstrate that a less discriminatory alternative exists.[15] For this reason, if critics are correct that the vast majority of hiring algorithms will satisfy the business necessity defense, then requiring disparate impact plaintiffs to plead facts demonstrating that a less discriminatory alternative is available would align pleading standards with the practical operation of disparate impact doctrine.
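As a rough illustration of the statistical comparison described above (and not a statement of doctrine), disparities between applicant pools and algorithm outputs are often screened using selection-rate ratios, such as the EEOC’s four-fifths guideline, 29 C.F.R. § 1607.4(D). The sketch below uses entirely hypothetical applicant counts:

```python
# Hypothetical applicant-pool counts and the number of candidates an
# automated screening tool advanced from each group (illustrative only).
applicants = {"group_a": 400, "group_b": 300}
advanced = {"group_a": 120, "group_b": 54}

# Selection rate for each group: advanced / total applicants.
rates = {g: advanced[g] / applicants[g] for g in applicants}

# Four-fifths (80%) guideline: adverse impact is generally indicated when
# the lowest group's selection rate falls below 80% of the highest group's.
ratio = min(rates.values()) / max(rates.values())
adverse_impact_indicated = ratio < 0.8

print(rates)                     # {'group_a': 0.3, 'group_b': 0.18}
print(round(ratio, 2))           # 0.6
print(adverse_impact_indicated)  # True
```

A ratio of 0.6 in this hypothetical would flag the tool for further scrutiny; the guideline is an evidentiary rule of thumb, not a substitute for the burden-shifting framework described in the text.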


Although this approach increases the burden plaintiffs face at the pleading stage, recognizing this reality at the outset will incentivize employers to monitor their hiring algorithms and evaluate competing technologies. Doing so minimizes litigation risk by making it less likely that potential plaintiffs will identify less discriminatory alternatives.


Such monitoring promotes Title VII’s objective of eliminating discrimination from the workplace by reducing the number of employers that fail to utilize the least discriminatory hiring tools available.[16] Additionally, employers who compare systems to mitigate disparate impacts will detect discriminatory outcomes before they affect applicants.


Importantly, this approach avoids the need for sweeping legislative reform. Title VII was designed as a flexible statute capable of addressing evolving forms of workplace discrimination.


Judicial clarification of disparate impact doctrine in algorithmic contexts can ensure civil rights oversight without the delay that necessarily accompanies legislative reform. Title VII does not require reinvention to be equitably applied to algorithmic hiring practices; it requires careful and deliberate application.


[1] See generally Zhisheng Chen, Ethics and Discrimination in Artificial Intelligence-Enabled Recruitment Practices, Humanities and Social Sciences Communications (Sept. 13, 2023), https://www.nature.com/articles/s41599-023-02079-x#citeas (discussing algorithmic bias resulting in discriminatory hiring practices); Algorithmic Hiring Systems: What Are They and What Are the Risks?, Institute for the Future of Work (Sept. 27, 2022), https://www.ifow.org/news-articles/algorithmic-hiring-systems (outlining common mechanisms of algorithmic hiring systems and exploring their impact in the United Kingdom) [hereinafter Algorithmic Hiring Systems].

[2] See generally Lydia X. Z. Brown, Hiring Discrimination by Algorithm: A New Frontier for Civil Rights and Labor Law, A.B.A. (Oct. 31, 2023), https://www.americanbar.org/groups/crsj/publications/human_rights_magazine_home/labor-and-employment-rights/hiring-discrimination-by-algorithm/ (discussing the discriminatory impact of automated hiring tools on marginalized communities).

[3] Algorithm-driven Hiring Tools: Innovative Recruitment or Expedited Disability Discrimination?, Center for Democracy & Technology, at 5 (Dec. 2020), https://cdt.org/wp-content/uploads/2020/12/Full-Text-Algorithm-driven-Hiring-Tools-Innovative-Recruitment-or-Expedited-Disability-Discrimination.pdf.

[4] See Title VII of the Civil Rights Act of 1964, 42 U.S.C. § 2000e-2(a).

[5] Id. § 2000e-2(k); see also Hazen Paper Co. v. Biggins, 507 U.S. 604 (1993).

[6] See, e.g., Hazen Paper Co., 507 U.S. at 609.

[7] Bolden-Hardge v. Off. of Cal. State Controller, 63 F.4th 1215, 1227 (9th Cir. 2023); 42 U.S.C. § 2000e-2(k).

[8] 42 U.S.C. § 2000e–2(k)(1)(A)(i); see, e.g., Griggs v. Duke Power Co., 401 U.S. 424 (1971).

[9] Solon Barocas & Andrew D. Selbst, Big Data's Disparate Impact, 104 Cal. L. Rev. 671, 709 (2016) (“there is good reason to believe that any or all of the data mining models predicated on legitimately job-related traits pass muster under the business necessity defense”); Pauline T. Kim, Data-Driven Discrimination at Work, 58 Wm. & Mary L. Rev. 857, 908 (2017) [hereinafter Data-Driven Discrimination] (asserting that any reasonably constructed model will satisfy the business necessity defense because it is designed to identify a statistical correlation with some aspect of job performance); Pauline T. Kim, Big Data and Artificial Intelligence: New Challenges for Workplace Equality, 57 U. Louisville L. Rev. 313, 326 (2019) (“mechanically applying existing disparate impact doctrine will be insufficient to address the risks of discrimination”); Gianfranco Regina, Do You Even Know Me?: A.I. and Its Discriminatory Effects in the Hiring Process, 51 Hofstra L. Rev. 1081, 1101 (2023) (“potential plaintiffs suing employers for A.I. discrimination will likely be unsuccessful because they lack access to the employer's inputs, which may be considered a trade secret”).

[10] See, e.g., Lanning v. Se. Pennsylvania Transp. Auth. (SEPTA), 181 F.3d 478, 485 (3d Cir. 1999) (citing Albemarle Paper Co. v. Moody, 422 U.S. 405, 425 (1975)); N.A.A.C.P. v. N. Hudson Reg'l Fire & Rescue, 665 F.3d 464, 477 (3d Cir. 2011) (“A plaintiff can overcome an employer's business-necessity defense by showing that alternative practices would have less discriminatory effects while ensuring that candidates are duly qualified.”); 42 U.S.C. § 2000e–2(k)(1)(A)(ii), (C).

[11] See 42 U.S.C. § 2000e–2(k)(1)(A)(ii), (C).

[12] See generally Andrea D'Agostino, Introduction to Neural Networks — Weights, Biases and Activation, Medium (Dec. 27, 2021), https://medium.com/@theDrewDag/introduction-to-neural-networks-weights-biases-and-activation-270ebf2545aa; see also Regina, supra note 9, at 1084; Data-Driven Discrimination, supra note 9, at 921–22; Stephanie Bornstein, Antidiscriminatory Algorithms, 70 Ala. L. Rev. 519, 524 (2018).

[13] 42 U.S.C. § 2000e–2(k)(1)(B)(i); see Phillips v. Cohen, 400 F.3d 388, 398 (6th Cir. 2005).

[14] Mobley v. Workday, Inc., 740 F. Supp. 3d 796, 810–11 (N.D. Cal. 2024) (denying defendant’s motion to dismiss disparate impact claim).

[15] Barocas & Selbst, supra note 9, at 709 (“there is good reason to believe that any or all of the data mining models predicated on legitimately job-related traits pass muster under the business necessity defense.”); Data-Driven Discrimination, supra note 9, at 869 (asserting that any reasonably constructed model will satisfy the business necessity defense because it is designed to identify a statistical correlation with some aspect of job performance).

[16] Ricci v. DeStefano, 557 U.S. 557, 580 (2009) (“the important purpose of Title VII—that the workplace be an environment free of discrimination”).

 
 


© 2025 by the Western New England Law Review.

Western New England Law Review

Western New England University School of Law

1215 Wilbraham Road

Springfield, MA 01119
