State Compliance · 11 min read · February 23, 2026

Washington State AI Hiring: Current Laws & Future Regulations

Washington has not yet enacted comprehensive AI-specific hiring legislation like Illinois, Colorado, or California—but several proposals are advancing through the legislature, and existing employment and privacy protections already apply to AI tools. Here's what Washington employers need to know.

Devyn Bartell
Founder & CEO, EmployArmor
Published February 23, 2026

🔔 Important Context: No Comprehensive Law Yet

As of February 2026, Washington State has not enacted a comprehensive AI-specific hiring law comparable to Illinois HB 3773, NYC Local Law 144, or Colorado SB24-205. However, several bills are advancing through the legislature, and existing state laws—including the Washington Law Against Discrimination (WLAD), My Health My Data Act, and proposed biometric privacy protections—create compliance obligations for employers using AI in hiring.

Washington employers should not interpret the absence of AI-specific legislation as permission to use AI hiring tools without oversight. Existing employment discrimination laws, consumer protection regulations, and privacy statutes all apply to algorithmic decision-making—and pending legislation may establish explicit requirements as early as 2027.

Current Legal Framework for AI Hiring in Washington

Washington Law Against Discrimination (WLAD) - RCW 49.60

Washington's primary employment anti-discrimination statute prohibits discrimination based on race, sex, age, disability, religion, national origin, sexual orientation, gender identity, and other protected characteristics. WLAD applies to AI hiring tools just as it applies to human decision-making.

Key implications for AI hiring:

  • Disparate impact liability: If an AI tool disproportionately screens out candidates from protected classes, employers face liability under WLAD even if discrimination was unintentional. The Washington State Human Rights Commission (WSHRC) and courts apply disparate impact analysis similar to federal Title VII frameworks.
  • Reasonable accommodation: AI assessment tools must accommodate candidates with disabilities. If an automated video interview platform penalizes speech patterns associated with disabilities, employers violate WLAD and the Americans with Disabilities Act (ADA).
  • Employer responsibility: Employers cannot outsource liability to AI vendors. Using a third-party tool that produces discriminatory outcomes exposes you to WLAD violations.

Enforcement:

  • Washington State Human Rights Commission (WSHRC) investigates complaints
  • Private right of action—individuals can sue directly in state court
  • Remedies include back pay, front pay, compensatory damages, and attorney's fees
  • No statutory cap on damages (unlike federal Title VII)

My Health My Data Act (MHMDA) - Chapter 19.373 RCW

Enacted in 2023 and effective March 31, 2024, MHMDA is one of the nation's most comprehensive health data privacy laws. Although written with consumer health data in mind, its broad definitions may capture certain AI hiring tools, particularly those analyzing biometric or health-related information.

When MHMDA applies to hiring:

  • Biometric health data: AI tools analyzing voice stress, facial expressions for emotional states, or physiological indicators (heart rate from video analysis) may collect "consumer health data" under MHMDA
  • Mental health inferences: Personality assessments or AI that infers psychological characteristics could fall within MHMDA's scope if they analyze or infer mental health conditions
  • Broad definition: MHMDA defines consumer health data as personal information that is "linked or reasonably linkable to a consumer" and that identifies the consumer's "past, present, or future physical or mental health status"—potentially capturing far more than traditional medical records

MHMDA compliance requirements:

  • Consent before collection: Obtain consent before collecting health data from applicants
  • Privacy policy disclosure: Clearly disclose what health data is collected and how it's used
  • Geofencing restrictions: Prohibits geofencing around healthcare facilities to identify health status (less relevant for hiring, but it demonstrates the law's breadth)
  • Sale prohibition: Cannot sell consumer health data
  • Security requirements: Must implement reasonable security measures

Enforcement:

  • Violations are treated as per se violations of Washington's Consumer Protection Act
  • Washington Attorney General may seek civil penalties up to $7,500 per violation
  • Private right of action through the CPA—a notable feature among state health privacy laws

Consumer Protection Act (CPA) - Chapter 19.86 RCW

Washington's CPA is a broad consumer protection statute prohibiting unfair or deceptive practices. While not specific to employment, the Attorney General has authority to pursue CPA enforcement for deceptive AI practices—including failure to disclose AI use in hiring or making false claims about AI fairness.

Potential CPA violations in AI hiring:

  • Failing to disclose use of AI in hiring decisions
  • Misrepresenting AI tool capabilities or fairness
  • Collecting more data than disclosed in privacy policies
  • Using AI in ways inconsistent with vendor representations

Pending Legislation: What May Be Coming

Overview of Proposed Bills

Multiple AI-related bills were introduced in the 2024-2026 Washington legislative sessions. While none have been enacted as of February 2026, several proposals signal regulatory priorities:

Algorithmic Accountability and Transparency

Proposed legislation would require:

  • Impact assessments: Businesses deploying high-risk AI systems (including employment decisions) would conduct and document impact assessments evaluating potential discrimination, privacy risks, and accuracy
  • Consumer notifications: Notice to individuals before automated decision-making affects them
  • Right to explanation: Consumers could request information about how automated systems influenced decisions
  • Human review rights: Options to request human review of automated employment decisions

Biometric Privacy Protection

Following Illinois' BIPA model, proposed Washington legislation would regulate biometric data in employment:

  • Written consent before collecting biometric identifiers (facial geometry, voiceprints, fingerprints)
  • Disclosure of retention schedules and destruction timelines
  • Private right of action with statutory damages ($1,000-$5,000 per violation)
  • Prohibition on selling biometric data

AI Hiring Disclosure Requirements

Proposals modeled after NYC Local Law 144 would require:

  • Pre-use notification to applicants
  • Disclosure of what data is collected and analyzed
  • Explanation of how AI influences hiring decisions
  • Annual bias audits for automated employment decision tools
  • Public posting of audit results

Legislative Timeline

2024-2025: Multiple AI bills introduced; none advanced to governor's desk. Committee hearings highlighted employer concerns about compliance burdens and vendor reluctance to provide bias audit data.

2026 Session (ongoing): Revised proposals with narrower scope and longer implementation timelines are under consideration. Industry groups and labor advocates continue negotiations.

Likely timeline if enacted: 12-18 month implementation period after passage, meaning effective dates in 2027 or later.

Monitor Legislative Developments

Washington employers should actively monitor legislative activity. Bills can move quickly once momentum builds. Subscribe to Washington State Legislature bill tracking for HB and SB proposals containing "artificial intelligence," "automated decision," or "algorithmic."

Key committees to watch: House Labor & Workplace Standards, Senate Labor & Commerce, House Innovation, Community & Economic Development, and Technology & Economic Development

Best Practices for Washington Employers (Even Without Specific Law)

1. Conduct Voluntary Bias Testing

Even without a legal mandate, proactively test AI tools for disparate impact. This serves multiple purposes:

  • WLAD compliance: Demonstrates due diligence if discriminatory outcomes are challenged
  • Federal compliance: Aligns with EEOC expectations for selection procedures
  • Preparedness: If Washington enacts audit requirements, you're already compliant
  • Risk mitigation: Identifies problems before they result in complaints or lawsuits

Recommended testing approach:

  • Annual analysis of selection rates by race/ethnicity and sex/gender
  • Calculate impact ratios (compare each group's selection rate to that of the group with the highest rate)
  • Investigate any ratio below 0.80 (four-fifths rule threshold)
  • Document findings and corrective actions
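The testing approach above can be sketched in a few lines of code. This is a minimal illustration of the four-fifths (80%) rule check, with hypothetical group names and counts; real analyses should use actual applicant-flow data and, for small samples, statistical significance tests alongside the ratio.

```python
# Hypothetical example: voluntary four-fifths (80%) rule check on AI
# screening outcomes. Group names and counts below are illustrative only.

def impact_ratios(selected, applicants):
    """Return each group's selection rate and its ratio to the highest rate."""
    rates = {g: selected[g] / applicants[g] for g in applicants}
    top = max(rates.values())
    return {g: (rate, rate / top) for g, rate in rates.items()}

applicants = {"Group A": 200, "Group B": 150, "Group C": 120}
selected = {"Group A": 60, "Group B": 33, "Group C": 24}

for group, (rate, ratio) in impact_ratios(selected, applicants).items():
    flag = "  <- below 0.80, investigate" if ratio < 0.80 else ""
    print(f"{group}: selection rate {rate:.2f}, impact ratio {ratio:.2f}{flag}")
```

A ratio below 0.80 is a screening signal, not proof of discrimination—but documenting the check, and any corrective action, is exactly the due-diligence trail that supports a WLAD defense.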

2. Provide Transparent Disclosures

Disclose AI use even if not legally required:

Sample Washington AI Hiring Disclosure:

Use of Automated Technology in Hiring

[Company Name] uses automated decision-making technology to assist in evaluating job applications. This includes software that analyzes resumes, scores assessment responses, and ranks candidates based on qualifications and job fit.

What This Means for You:

  • Your application materials may be analyzed by algorithms that identify relevant skills and experience
  • Assessment responses may be automatically scored and compared to job requirements
  • Automated outputs help recruiters prioritize candidates, but humans make final hiring decisions

If you have questions about our use of automated technology or believe it has affected your application unfairly, contact [hr@company.com] or [phone number]. You may request human review of any automated decision.

3. Implement Human Oversight

Maintain meaningful human involvement in hiring decisions:

  • AI can screen, score, or recommend—but humans make final decisions
  • Recruiters must have authority to override AI recommendations
  • Document when and why AI recommendations are overridden
  • Train staff to recognize potential AI bias

4. Vendor Due Diligence

Thoroughly vet AI hiring vendors:

  • Request bias testing results and methodologies
  • Verify WLAD and ADA compliance claims
  • Require contractual commitments to notify you of model changes
  • Ensure vendor cooperation with any future audits or investigations
  • Obtain indemnification for vendor-caused discrimination (though this doesn't eliminate your liability)

5. Document Compliance Efforts

Create a paper trail demonstrating good faith:

  • Maintain records of bias testing
  • Document vendor assessments and selection criteria
  • Track candidate notifications and disclosures
  • Preserve evidence of human oversight
  • Retain hiring records at least one year under federal EEOC recordkeeping rules (29 CFR 1602.14); many employers keep them three years or longer as a safeguard

Comparison: Washington vs. Other States

  • Washington: No comprehensive law; proposals pending. Key requirements: WLAD anti-discrimination, MHMDA health data privacy, voluntary best practices
  • Illinois: Enacted (HB 3773, effective Jan 2026). Key requirements: pre-use notice, non-discrimination requirement, regular assessments
  • Colorado: Enacted (SB24-205, effective Feb 2026). Key requirements: impact assessments, disclosures, opt-out rights, appeal process
  • California: CCPA ADMT regulations (effective 2026-2027). Key requirements: pre-use notice, opt-out, risk assessments, CPPA submissions
  • NYC: Enacted (Local Law 144, effective 2023). Key requirements: annual bias audits, public posting, 10-day advance notice

Washington employers benefit from observing other states' implementation challenges. Common issues include:

  • Vendor reluctance to provide audit data or bias testing results
  • Difficulty defining what qualifies as "AI" or "automated decision-making"
  • Challenges establishing alternative processes for opt-out requests
  • Resource constraints for small employers

Practical Compliance Roadmap for Washington Employers

Immediate Actions (Now)

  • ☐ Inventory all AI and automated tools used in hiring
  • ☐ Review tools for potential WLAD discrimination risks
  • ☐ Assess whether tools collect health data under MHMDA
  • ☐ Implement candidate disclosures (even if not legally required)
  • ☐ Conduct voluntary bias testing
  • ☐ Document human oversight processes
  • ☐ Train HR staff on AI risks and compliance

If/When Legislation Passes

  • ☐ Review specific requirements and effective dates
  • ☐ Update disclosures to match statutory language
  • ☐ Commission independent bias audits if required
  • ☐ Complete impact assessments
  • ☐ Establish opt-out and appeal processes
  • ☐ Update vendor contracts with compliance terms
  • ☐ Refresh HR training on new requirements

Ongoing

  • ☐ Monitor legislative developments quarterly
  • ☐ Review selection rate data quarterly
  • ☐ Annual bias testing
  • ☐ Update disclosures when tools change
  • ☐ Maintain compliance documentation

Key Takeaways for Washington Employers

  • No comprehensive AI hiring law yet, but existing anti-discrimination and privacy protections apply to AI tools
  • WLAD liability risk is real — disparate impact from AI can trigger complaints and lawsuits
  • My Health My Data Act may apply if AI analyzes health-related biometric data
  • Pending legislation could establish explicit requirements as early as 2027
  • Proactive compliance (voluntary bias testing, transparent disclosures) reduces legal risk and prepares for future regulations
  • Monitor other states — Washington proposals borrow heavily from Illinois, Colorado, and NYC frameworks
  • Vendor due diligence is critical — you can't outsource liability for discriminatory AI tools


Stay Ahead of Washington AI Regulations

Even without comprehensive legislation, Washington employers using AI in hiring face compliance obligations under existing laws. Take our free compliance scorecard to understand your risks and get actionable recommendations.

Disclaimer: This content is for informational purposes only and does not constitute legal advice. Employment laws vary by jurisdiction and change frequently. Consult a qualified employment attorney for guidance specific to your situation. EmployArmor provides compliance tools and resources but is not a law firm.

Ready to get compliant?

Take our free 2-minute assessment to see where you stand.