The emergence of AEDT (Automated Employment Decision Tool) laws represents a fundamentally new compliance layer for HR teams, one that sits at the intersection of employment law, algorithmic accountability, and data science. Unlike EEOC compliance, which has been part of the HR landscape for decades, or GDPR compliance, which HR teams have been absorbing since 2018, AEDT compliance is being invented in real time, with regulatory agencies still issuing implementation guidance and vendors still working out what their obligations actually are.
The good news is that the core requirements, while technically complex, are operationally manageable if you understand them clearly. The challenging part is that the compliance obligation is shared between you (the employer) and your ATS vendor — and determining which party is responsible for which element requires understanding both the law and your contractual relationship with your vendor.
This guide covers the current AEDT law landscape, what the bias audit requirement actually involves, and the questions every HR team needs to ask their ATS vendor before using AI-assisted hiring features in covered jurisdictions. It is not legal advice, and the regulatory environment is changing quickly — confirm current requirements with employment counsel for your specific situation and hiring geographies.
What AEDT laws require: the core framework
The foundational AEDT legislation is New York City Local Law 144, enacted in 2021, which took effect on January 1, 2023, with enforcement beginning July 5, 2023. It established the framework that subsequent state laws have largely followed. The law applies to employers and employment agencies that use an AEDT in employment decisions for roles based in NYC.
What qualifies as an AEDT
An AEDT under NYC Local Law 144 is "a computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision-making for employment decisions."
This definition is broad enough to capture: automated CV scoring systems that rank applicants, AI-powered candidate matching that surfaces recommendations, video interview analysis tools that assess candidates using facial recognition or speech analysis, automated skills assessment scoring, and any algorithmic system that produces a score or recommendation that influences whether a human decision-maker proceeds with or eliminates a candidate.
What is more likely to fall outside the definition: ATS features that apply employer-defined criteria (if the recruiter tells the system to only show candidates who have "Python" in their CV, that's a rule, not an automated decision). Basic database search functionality. Scheduling automation. Document management. The distinction that matters is whether the tool itself is making or substantially influencing the assessment of a candidate's suitability, or whether it is simply executing a human-defined filter.
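The line being drawn here can be made concrete in code. The sketch below contrasts an employer-defined filter (more likely outside the AEDT definition) with an algorithmic suitability score (squarely inside it when the score drives who advances). The function names, features, and weights are illustrative assumptions, not any vendor's actual implementation.

```python
# Illustrative contrast between a recruiter-defined rule and an AEDT-style
# score. All names and weights here are hypothetical, not a real vendor's logic.

def recruiter_defined_filter(cv_text: str, required_keyword: str) -> bool:
    """Executes a rule the recruiter specified ("show only CVs mentioning Python").
    The tool applies a human-defined criterion; it makes no assessment of its own."""
    return required_keyword.lower() in cv_text.lower()

def algorithmic_suitability_score(features: dict[str, float],
                                  weights: dict[str, float]) -> float:
    """Produces a score that ranks candidates. If this score substantially drives
    who advances, the feature is likely an AEDT in covered jurisdictions."""
    return sum(weights.get(name, 0.0) * value for name, value in features.items())

# The filter answers True/False against a human-stated rule:
passes = recruiter_defined_filter("Senior engineer, Python and Go", "Python")

# The score is the tool's own assessment of the candidate:
score = algorithmic_suitability_score(
    {"years_experience": 6.0, "skills_match": 0.8},
    {"years_experience": 0.1, "skills_match": 1.0},  # model-derived weights
)
```

The first function executes a decision a human already made; the second makes (or substantially shapes) the decision itself, which is the distinction the statute's definition turns on.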
The bias audit requirement
Employers and employment agencies must ensure that any AEDT they use has been subjected to an independent bias audit within the past year. The audit must:
- Be conducted by an independent auditor — not the employer, not the ATS vendor, not an entity with a financial interest in the outcome.
- Assess the AEDT's impact across sex categories and race/ethnicity categories, including intersectional combinations of the two (a requirement in the DCWP's final rules).
- Calculate selection rates and impact ratios for each category using the four-fifths rule or an equivalent statistical methodology.
- Produce a summary document that is made publicly available on the website of either the employer or the tool vendor.
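The selection-rate and impact-ratio arithmetic behind the four-fifths rule is straightforward and can be sketched as follows. The category labels and applicant counts below are hypothetical; a real audit applies the auditor's methodology to actual historical applicant data.

```python
# Sketch of the four-fifths rule arithmetic used in AEDT bias audits.
# Category names and counts are hypothetical illustration only.

def selection_rate(selected: int, total: int) -> float:
    """Fraction of applicants in a category selected (e.g., scored above a cutoff)."""
    return selected / total

# Hypothetical applicant counts per category: (selected, total)
categories = {
    "male": (200, 1000),     # 20% selection rate
    "female": (150, 1000),   # 15% selection rate
}

rates = {cat: selection_rate(s, t) for cat, (s, t) in categories.items()}
highest = max(rates.values())

# Impact ratio: each category's rate divided by the highest category's rate.
# Under the four-fifths rule, a ratio below 0.80 flags potential adverse impact.
for cat, rate in rates.items():
    ratio = rate / highest
    flag = "POTENTIAL ADVERSE IMPACT" if ratio < 0.80 else "ok"
    print(f"{cat}: rate={rate:.2%} impact_ratio={ratio:.2f} {flag}")
```

In this hypothetical, the 15% rate divided by the 20% rate gives an impact ratio of 0.75, below the 0.80 threshold, so the audit would flag potential adverse impact for that category.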
The candidate notification requirement
Employers must notify candidates that an AEDT will be used in their evaluation at least 10 business days before the tool is applied. The notification must include: the fact that an AEDT will be used, the data the AEDT uses to make its assessment, and information about how candidates can request an alternative selection process that does not use the AEDT. This notification requirement sits in your ATS workflow — it needs to be part of the application confirmation or pre-evaluation communication.
The expanding AEDT law landscape
NYC Local Law 144 was first, but it is not alone. The following jurisdictions have enacted or are actively developing AEDT-related requirements:
Illinois (AI Video Interview Act, amended 2022): Requires employers using AI to analyze video interviews to notify applicants before the interview that AI analysis will be used, explain how the AI works and the general characteristics it evaluates, and obtain applicant consent. The 2022 amendment added a demographic reporting obligation: employers who rely solely on AI analysis to determine whether an applicant advances to an in-person interview must collect race and ethnicity data on applicants and report it annually to the Illinois Department of Commerce and Economic Opportunity.
Colorado (SB 24-205, signed 2024): The Colorado Artificial Intelligence Act, with obligations taking effect in 2026, covers algorithmic discrimination in high-risk AI systems including employment, requiring impact assessments and risk management documentation by deployers of covered AI systems. Its reach is broader than NYC's definition: Colorado's law captures a wider range of AI-assisted decisions.
California: Multiple bills addressing AI in employment decisions have been proposed. California's existing CCPA framework already creates some data rights for candidates interacting with AI systems. Active legislation should be monitored.
Federal level: The EEOC has issued guidance on the applicability of Title VII to algorithmic hiring tools — specifically, that employers cannot avoid Title VII liability by pointing to a vendor's algorithm as the cause of discriminatory outcomes. The employer remains responsible for the impact of tools they use in hiring decisions.
The tension between AI efficiency and AEDT compliance
This is the operational complexity that HR leaders are genuinely wrestling with in 2026. AI screening features promise significant efficiency gains: faster CV review, better candidate matching, reduced time-to-screen. But using those features in NYC or other covered jurisdictions now triggers bias audit obligations, candidate notification requirements, and alternative selection process availability.
For many mid-market companies, the practical calculation is: does the efficiency gain from the AI screening feature justify the compliance overhead of the AEDT framework? The answer depends on several factors: your hiring volume in covered jurisdictions, your ATS vendor's maturity in providing bias audit documentation, and your legal team's comfort level with the current regulatory uncertainty.
The important nuance: having the bias audit completed by your vendor and posted publicly is not the same as your hiring process being free of adverse impact. The audit is a retrospective assessment of past performance. If the audit reveals adverse impact against a protected class, you have a compliance obligation to address it — either by modifying the tool's parameters, adding human review checkpoints, or discontinuing the tool's use for the affected decision type.
What to ask your ATS vendor about AEDT compliance
The quality of ATS vendors' responses to AEDT questions varies significantly. Some vendors have invested heavily in independent bias auditing and have published results. Others are still in the process of determining which of their features qualify as AEDTs. Some have not seriously engaged with the compliance question at all.
AEDT vendor evaluation questions
- Feature classification: "Which of your features qualify as AEDT under NYC Local Law 144 and Colorado SB 24-205? Provide a written list."
- Bias audit documentation: "Have you commissioned an independent bias audit of those features? Who conducted it? Where can I see the published results? How recent is the most recent audit?"
- Candidate notification support: "Does the platform support the candidate notification requirements under NYC Local Law 144 — the notice that an AEDT will be used, the data types used, and information about requesting an alternative process? Show me how that notification appears in the application workflow."
- Alternative selection process: "What alternative selection process can you support for candidates who opt out of AEDT-based screening? How does the system route those candidates?"
- Employer documentation: "What documentation do you provide to employers to support their own AEDT compliance obligations — specifically the record-keeping requirements under NYC law?"
- Illinois AI Video Act: "If I use your video interview AI analysis feature for candidates in Illinois, what candidate notification functionality does the platform provide? How does it collect and record candidate consent?"
Employer responsibility versus vendor responsibility: who owns what
The compliance obligation under AEDT laws sits with the employer, not the vendor. NYC Local Law 144 is explicit: it is the employer's obligation to ensure that any AEDT used in employment decisions has been subjected to an independent bias audit. You cannot contractually shift this obligation to your ATS vendor, and "my vendor handles that" is not an acceptable response in an enforcement investigation.
What vendors are responsible for: providing the independent bias audit documentation, publishing the results publicly (either on the vendor's website or the employer's website), providing candidate notification support in the application workflow, and providing the alternative selection process infrastructure for candidates who opt out.
What employers are responsible for: ensuring they are using AEDT features only when an audit is available and current, implementing the candidate notification in their application workflow, making the alternative selection process genuinely available, and maintaining records of their compliance program.
The practical implication: if your ATS vendor has not commissioned an independent bias audit of their AI screening features, and you are using those features for hiring decisions in NYC or other covered jurisdictions, you have a compliance gap — regardless of what your contract says. The solution is one of three: get the vendor to complete the audit, commission the audit yourself as the employer, or disable the AEDT features in covered jurisdictions until an audit is available.
One area of genuine complexity: the law covers AI tools used by employers and by employment agencies. If you use a recruiter or staffing agency that uses AI screening tools to source candidates for you, your AEDT obligations extend to those tools. In practice, this means the vendor evaluation questions above should be asked of any third-party recruiter or staffing firm using AI on your behalf, not only your internal ATS.
Practical steps for auditing your current AEDT exposure
Step 1: Inventory your AI features. List every feature in your current ATS that uses AI, machine learning, or automated scoring. For each: does it produce a score, ranking, or recommendation that influences hiring decisions? If yes, it is likely an AEDT in covered jurisdictions.
Step 2: Identify covered jurisdictions. For each AI feature, do you use it for hiring decisions in NYC, Illinois (video interviews), or Colorado? If yes, the relevant AEDT law applies.
Step 3: Request vendor bias audit documentation. Ask your vendor for published bias audit results for each AEDT feature. If they don't have them, you have a compliance gap that needs to be addressed either by getting the vendor to commission an audit, commissioning one yourself, or discontinuing use of the feature in covered jurisdictions until an audit is available.
Step 4: Review your application workflow for candidate notifications. Is the AEDT disclosure appearing in your application confirmation communications? Does it include all required elements (what the AEDT does, what data it uses, how to request an alternative)?
Step 5: Establish an alternative selection process. Document the alternative process available to candidates who opt out of AEDT screening. This doesn't need to be complex — human review of applications that bypass the AI scoring layer is a valid alternative — but it needs to exist and be described accurately in your disclosure.
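The first three steps above amount to a cross-check of AI features against jurisdictions and audit status, which can be sketched as a simple checklist in code. Every feature name, jurisdiction code, and audit date below is a hypothetical example, and the one-year validity window reflects the NYC requirement discussed earlier.

```python
# Minimal sketch of an AEDT exposure inventory (steps 1-3 above).
# All feature names and audit dates below are hypothetical examples.
from datetime import date, timedelta

COVERED_JURISDICTIONS = {"NYC", "IL", "CO"}  # per the laws discussed above
AUDIT_VALIDITY = timedelta(days=365)         # NYC requires an audit within the past year

features = [
    # (feature, produces score/ranking/recommendation?, jurisdictions used in,
    #  date of most recent independent bias audit, or None)
    ("ai_cv_scoring", True, {"NYC", "TX"}, date(2025, 9, 1)),
    ("candidate_matching", True, {"CO"}, None),
    ("keyword_filter", False, {"NYC"}, None),  # recruiter-defined rule, likely not an AEDT
]

def compliance_gaps(features, today: date):
    """Return (feature, covered jurisdictions) pairs lacking a current audit."""
    gaps = []
    for name, is_aedt, used_in, last_audit in features:
        if not is_aedt:
            continue  # rule-based features are likely outside the AEDT definition
        covered = used_in & COVERED_JURISDICTIONS
        if not covered:
            continue  # not used in any covered jurisdiction
        if last_audit is None or today - last_audit > AUDIT_VALIDITY:
            gaps.append((name, sorted(covered)))
    return gaps

# candidate_matching has no audit and is used in Colorado, so it gets flagged
print(compliance_gaps(features, date(2026, 2, 1)))
```

A spreadsheet does the same job; the point is that the inventory needs all four columns (feature, AEDT status, jurisdictions, audit date) before the gaps become visible.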
Frequently asked questions
What is an AEDT bias audit?
An independent evaluation of an automated hiring tool that assesses whether it produces disparate impact against protected classes. Under NYC Local Law 144, the audit must be conducted by an independent third party, assess impact by sex and race/ethnicity, calculate selection rates and impact ratios using the four-fifths rule, and have results published publicly. The audit must be conducted annually.
Which states require bias audits of hiring tools?
New York City (Local Law 144, enforcement began July 2023) has the most specific and developed AEDT bias audit requirement. Illinois requires notification and consent for AI video interview analysis. Colorado's SB 24-205 (signed 2024) covers algorithmic discrimination in employment decisions broadly. Several other states have proposed or are developing similar legislation. The regulatory landscape continues to evolve.
Does AI resume screening require a bias audit?
If the AI screening produces a score, ranking, or recommendation that substantially influences hiring decisions in NYC or other covered jurisdictions, it almost certainly qualifies as an AEDT and requires an independent bias audit under NYC Local Law 144. ATS features that apply recruiter-defined filters (not algorithmic scoring) are in a greyer area. Ask your vendor to confirm the AEDT status of each AI feature in writing.
How do I find an ATS that provides AEDT compliance documentation?
Ask vendors directly: which features qualify as AEDT, has an independent bias audit been conducted, and where are the published results? Vendors who have addressed this seriously will have clear answers and published documentation. Vendors who haven't will be unable to provide the published audit results that AEDT laws require. The absence of this documentation is itself a compliance gap for employers using the tool in covered jurisdictions.
What does "substantially assist or replace discretionary decision-making" mean in practice?
Under NYC Local Law 144, an AEDT must "substantially assist or replace" human decision-making. Regulators have interpreted "substantially assist" broadly: if a hiring manager is shown an AI-generated score or recommendation and typically acts on it, that AI output is substantially assisting the decision. The fact that a human technically makes the final call does not remove the AEDT classification if the human decision is routinely driven by the algorithmic output. The practical implication is that any AI-generated ranking or score visible to hiring managers during evaluation likely qualifies as an AEDT.