State Law · 16 min read · February 23, 2026

Texas AI Hiring Law (TRAIGA): Complete 2026 Compliance Guide

Texas enacted TRAIGA (the Texas Responsible Artificial Intelligence Governance Act), effective January 1, 2026. Learn the intent-based discrimination prohibition, CUBI biometric requirements, and Texas-specific compliance obligations.

Devyn Bartell
Founder & CEO, EmployArmor
Published February 23, 2026

On January 1, 2026, the Texas Responsible Artificial Intelligence Governance Act (TRAIGA) took effect, making Texas one of the first states to regulate artificial intelligence in employment. Unlike other state laws that focus on transparency and bias audits, Texas takes a unique approach: prohibiting intentional discrimination through AI systems while maintaining a business-friendly regulatory environment that avoids prescriptive audit and disclosure mandates.

For Texas employers, TRAIGA adds a new layer of compliance obligations on top of existing requirements under:

  • Texas Labor Code Chapter 21 — General employment discrimination protections
  • Texas CUBI (Capture or Use of Biometric Identifier Act) — Biometric data privacy
  • Federal law — Title VII, ADA, ADEA, and EEOC guidance on AI

This comprehensive guide covers everything Texas employers need to know about AI hiring compliance in 2026.

Texas AI Employment Law at a Glance (Effective January 1, 2026):

  • TRAIGA: Prohibits intentional AI discrimination (all employers)
  • No bias audit mandate (unlike NYC or Colorado)
  • No AI disclosure requirement for private employers
  • Intent-based standard: Disparate impact alone doesn't prove violation
  • Texas AG enforcement: Exclusive enforcement authority
  • CUBI compliance: Biometric consent required (facial recognition, voice analysis)
  • Chapter 21 applies: Traditional discrimination law covers AI outcomes

Texas Responsible Artificial Intelligence Governance Act (TRAIGA)

Legislative Background

TRAIGA was passed by the Texas Legislature and signed into law in 2025, effective January 1, 2026. The law positions Texas as a leader in balancing AI innovation with civil rights protection. Unlike more prescriptive state laws, TRAIGA focuses on intent-based discrimination prohibition without imposing burdensome compliance requirements like mandatory bias audits or public disclosures.

The law emerged from bipartisan concerns about algorithmic bias while respecting Texas's pro-business regulatory philosophy. Texas employers now face clear anti-discrimination obligations without the administrative overhead seen in states like New York or Colorado.

Who Is Covered by TRAIGA?

TRAIGA applies to all employers operating in Texas, regardless of size. This is a significant departure from federal law (Title VII) and Texas Labor Code Chapter 21, which generally apply to employers with 15 or more employees.

Covered entities include:

  • Private employers of any size hiring or managing employees in Texas
  • Staffing agencies and third-party recruiters
  • Out-of-state employers hiring Texas residents or making employment decisions affecting Texas workers
  • AI developers and vendors deploying systems used for Texas employment decisions
  • Gig economy platforms and contractor management systems (if employment relationship exists)

Not covered (exemptions):

  • Compliant financial institutions and insurance companies (regulated separately)
  • Federal government agencies (governed by federal AI procurement rules)

Core Prohibition: Intentional AI Discrimination

TRAIGA prohibits any person from "developing or deploying an AI system with the intent to unlawfully discriminate" against individuals based on protected characteristics in employment decisions.

Protected Classes Under TRAIGA:

  • Race and color
  • National origin and ancestry
  • Sex (including pregnancy, sexual orientation, gender identity)
  • Religion and creed
  • Age (40 and over under ADEA; all ages under Texas law)
  • Disability (physical and mental)
  • Genetic information

Covered Employment Decisions:

  • Recruitment and job advertising
  • Application screening and resume review
  • Interviewing and candidate assessment
  • Hiring and selection
  • Promotion and advancement
  • Training and professional development opportunities
  • Performance evaluation and review
  • Compensation and benefits decisions
  • Discipline, demotion, and corrective action
  • Termination and discharge
  • Terms, conditions, and privileges of employment

The Intent Requirement: What It Means

TRAIGA's most distinctive feature is its intent-based standard. Unlike laws that prohibit disparate impact (outcomes-based liability), TRAIGA requires proof of intentional discrimination.

What Proves Intent?

  • Design choices: Explicitly programming AI to weight protected characteristics negatively
  • Known bias: Deploying AI after discovering it discriminates and choosing not to fix it
  • Proxy variables: Intentionally using proxies (zip code, school names) to discriminate
  • Deliberate indifference: Recklessly ignoring obvious discriminatory outcomes
  • Documentation: Internal communications showing discriminatory intent

What Doesn't Prove Intent (Under TRAIGA Alone)?

  • Disparate impact alone: Statistical evidence showing AI disproportionately affects protected groups
  • Vendor bias: Third-party AI tools with unintended bias (if employer acts in good faith)
  • Historical data bias: AI reflecting past discrimination in training data without employer knowledge
  • Unintended outcomes: Results that correlate with protected classes despite neutral design

Critical Distinction: TRAIGA vs. Chapter 21

While TRAIGA requires intent, Texas Labor Code Chapter 21 (for 15+ employee employers) and federal law still allow disparate impact claims without proving intent.

Bottom line: Even if you're not violating TRAIGA, AI that causes disparate impact can still violate Chapter 21, Title VII, or EEOC guidelines. Don't assume the intent standard protects you from all liability.

Biometric Data Prohibition

TRAIGA separately prohibits AI systems from capturing biometric data (facial recognition, voice analysis, fingerprints, retina scans) without obtaining consent. This overlaps with and reinforces Texas CUBI requirements (discussed below).

Enforcement and Penalties

Texas Attorney General (Exclusive Enforcement)

The Texas Attorney General has exclusive authority to enforce TRAIGA. There is no private right of action under TRAIGA itself, meaning individuals cannot sue directly for TRAIGA violations (though they can still sue under Chapter 21 or federal law).

The AG may:

  • Issue civil investigative demands requiring documents, data, and testimony
  • Conduct investigations into alleged intentional AI discrimination
  • Seek injunctive relief to stop discriminatory AI deployment
  • Impose civil penalties (amounts not yet specified in published guidance)
  • Negotiate consent decrees requiring corrective action and monitoring

No Private Lawsuits Under TRAIGA

Unlike many state AI laws, TRAIGA does not create a new private cause of action. This means:

  • Individuals cannot sue employers directly for TRAIGA violations
  • Class action lawsuits based solely on TRAIGA are not possible
  • Enforcement is limited to state action by the AG

However, affected individuals can still sue under:

  • Texas Labor Code Chapter 21 (traditional discrimination claims)
  • Federal laws (Title VII, ADA, ADEA)

Note that CUBI biometric violations are likewise enforced by the Attorney General rather than through private lawsuits (see the CUBI section below).

Regulatory Sandbox for AI Innovation

TRAIGA includes a regulatory sandbox program allowing companies to test innovative AI systems in employment with temporary regulatory relief. This program aims to foster AI development while ensuring safeguards are in place.

Sandbox participants receive:

  • Limited immunity from certain regulatory requirements during testing periods
  • Technical assistance from state agencies
  • Expedited approval processes for compliant AI systems

Contact the Texas Workforce Commission or Attorney General's office for sandbox application details.

Texas Labor Code Chapter 21: Traditional Discrimination Law Applies to AI

Even before TRAIGA, Texas Labor Code Chapter 21 (the state's equivalent to federal Title VII) has applied to AI-driven employment decisions. Chapter 21 remains the primary vehicle for individual discrimination claims.

Who Is Covered by Chapter 21?

Chapter 21 applies to employers with 15 or more employees, covering:

  • Private employers
  • State and local government agencies
  • Labor unions and employment agencies

Protected Classes

Chapter 21 prohibits discrimination based on:

  • Race, color, national origin
  • Religion
  • Sex (including pregnancy)
  • Age
  • Disability
  • Genetic information

How Chapter 21 Applies to AI Hiring

Disparate Treatment

If an employer intentionally uses AI to discriminate (e.g., programming AI to downweight female candidates), this violates Chapter 21 as disparate treatment.

Disparate Impact

Even without intent, if AI produces statistically significant adverse impact on protected groups, it may violate Chapter 21. Employers must show AI tools are:

  • Job-related: AI criteria actually predict job performance
  • Business necessity: No equally valid alternative with less discriminatory impact
  • Validated: Supported by proper validation studies

Employer Liability for AI Vendors

Texas courts have consistently held that employers are responsible for discriminatory outcomes from third-party vendors, including AI tools. You cannot delegate liability to vendors—even if the vendor created the AI, the employer is accountable for its use.

Texas Workforce Commission (TWC) Enforcement

The Texas Workforce Commission Civil Rights Division investigates and enforces Chapter 21 complaints. The process parallels EEOC procedures:

  1. Charge Filing: Individual files discrimination charge with TWC (180-day deadline, extendable to 300 days if also filed with EEOC)
  2. Investigation: TWC investigates, requesting documents and interviewing witnesses
  3. Mediation: TWC offers voluntary mediation
  4. Determination: TWC issues "Cause" or "No Cause" finding
  5. Right to Sue: If cause found or TWC doesn't resolve, individual receives right-to-sue letter
  6. Lawsuit: Individual can sue in Texas state court within 2 years

Remedies Under Chapter 21

Successful plaintiffs can recover:

  • Back pay and benefits: Lost wages from date of discrimination
  • Front pay: Future lost earnings if reinstatement isn't feasible
  • Compensatory damages: Emotional distress, mental anguish, loss of reputation (capped at $300,000 for employers with 500+ employees)
  • Punitive damages: Available for intentional or reckless violations
  • Injunctive relief: Court orders to change practices, implement monitoring, provide training
  • Attorney's fees and costs: Prevailing plaintiffs recover legal expenses

Texas CUBI: Biometric Data Privacy Law

The Capture or Use of Biometric Identifier Act (CUBI), codified at Texas Business & Commerce Code §503.001 et seq., regulates the collection, storage, and use of biometric identifiers in all contexts, including employment.

What Is a "Biometric Identifier" Under CUBI?

CUBI defines biometric identifiers as:

  • Fingerprints
  • Retina or iris scans
  • Voiceprints (voice pattern analysis)
  • Facial geometry (measurements of facial features)
  • Hand or palm scans
  • Gait or behavioral patterns

Not covered: Photographs, video recordings, or audio recordings that are not analyzed for biometric patterns (e.g., a simple video interview recording without facial analysis software).

When CUBI Applies to AI Hiring Tools

Common AI Tools Triggering CUBI:

  • HireVue: Facial expression analysis, speech pattern evaluation
  • Modern Hire: Video interview analysis with emotion detection
  • Pymetrics: Behavioral biometric game assessments
  • VidCruiter: Facial recognition for identity verification
  • Voice AI tools: Accent analysis, tone evaluation, speech stress detection
  • Background check services: Fingerprint-based identity verification

CUBI Requirements

1. Advance Notice

Before collecting biometric data, employers must inform candidates:

  • That biometric data will be collected
  • The specific type of biometric identifier (facial geometry, voiceprint, etc.)
  • The purpose for collection
  • How long the data will be retained

2. Written Consent

Obtain affirmative written consent before capturing or using biometric identifiers. Consent must:

  • Be separate from general terms of service or application acknowledgments
  • Clearly identify what biometric data is collected
  • Allow the individual to opt in (not opt out)
  • Be documented with date and method of consent

3. Reasonable Care in Storage and Transmission

CUBI requires "reasonable care" to protect biometric data from unauthorized access. Best practices include:

  • Encryption: At-rest and in-transit encryption
  • Access controls: Limit who can view biometric data
  • Secure deletion: Permanent deletion protocols when retention period expires
  • Vendor security: Ensure third-party AI providers meet CUBI standards
  • Breach notification: Plan for responding to biometric data breaches

4. No Sale or Disclosure Without Consent

Biometric data cannot be sold, leased, or disclosed to third parties without explicit consent, except:

  • To complete the purpose for which it was collected (e.g., sharing with AI vendor for analysis)
  • In response to court orders, subpoenas, or search warrants
  • With written consent from the individual

CUBI Penalties and Enforcement

Civil Penalties

  • Up to $25,000 per violation for intentional or reckless violations
  • Penalties apply per individual affected (e.g., $25,000 x 100 applicants = $2.5 million exposure)
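The exposure math above can be sketched in a few lines. This is illustrative arithmetic only, not legal advice, and per-individual counting is an assumption about how violations might be tallied:

```python
# Rough CUBI civil-penalty exposure estimate (illustration, not legal advice).
PENALTY_PER_VIOLATION = 25_000  # statutory maximum per violation

def max_exposure(affected_individuals: int) -> int:
    """Worst case if each affected individual counts as one violation."""
    return affected_individuals * PENALTY_PER_VIOLATION

print(max_exposure(100))  # 2500000 -> the $2.5 million figure above
```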

AG Enforcement, No Private Lawsuits

Like TRAIGA, CUBI is enforced by the Texas Attorney General; it does not give individuals a private right of action. In an enforcement action, the AG can seek:

  • Civil penalties: Up to $25,000 per violation
  • Injunctive relief: Court orders to stop collection or delete data

No statutory damages for individuals: Unlike Illinois BIPA (which provides $1,000-$5,000 per violation through private suits), CUBI channels enforcement through AG civil penalties rather than individual damages awards.

CUBI vs. Illinois BIPA: Key Differences

| Aspect | Texas CUBI | Illinois BIPA |
| --- | --- | --- |
| Notice | Required before collection | Required, with detailed written policy |
| Consent | Written consent required | Written release required |
| Retention | Disclose retention period | Written retention schedule and destruction guidelines required |
| Storage | "Reasonable care" | "Same or more protective than other confidential information" |
| Damages | No statutory damages for individuals | $1,000 per negligent violation, $5,000 per intentional/reckless |
| Civil Penalty | Up to $25,000 per violation (AG) | None (statutory damages apply) |
| Enforcement | Texas AG only | Private lawsuits for statutory damages ($1K-$5K per violation) |

Practical takeaway: If you're BIPA-compliant, you likely satisfy CUBI. However, CUBI's "reasonable care" standard may be more flexible than BIPA's stringent requirements.

Sample CUBI-Compliant Biometric Consent Form

Biometric Information Collection Notice and Consent

[Company Name] uses video interview technology provided by [Vendor Name] that collects and analyzes biometric identifiers as described below.

What Biometric Data We Collect:

  • Facial geometry (measurements and analysis of facial features)
  • Voiceprint (speech patterns, tone, pitch, and cadence)
  • [Add other biometric data types if applicable]

Purpose of Collection:

We collect this biometric data to evaluate your communication skills, presentation, and suitability for the role during the video interview process. The AI system analyzes your responses to provide our hiring team with standardized assessment metrics.

Retention Period:

Your biometric data will be retained for [specify period, e.g., "the duration of the hiring process plus 3 years," "6 months following completion of the hiring process"]. After this period, we will permanently delete your biometric data unless we notify you and obtain additional consent for extended retention.

Data Protection:

We store biometric data with reasonable security measures including encryption, access controls, and secure transmission protocols. We will not sell, lease, trade, or otherwise disclose your biometric identifiers to third parties without your explicit consent, except as necessary to complete the purpose described above or as required by law.

Your Rights:

  • You may request deletion of your biometric data at any time
  • You may withdraw consent, though this may affect your ability to participate in our hiring process
  • You have the right to request information about how we use your biometric data

Your Consent:

By checking the box below and signing, you acknowledge that you have read and understood this notice and consent to [Company Name]'s collection, use, and storage of your biometric identifiers as described above.

I consent to [Company Name]'s collection and use of my biometric identifiers.

Candidate Signature: _________________________ Date: ____________

Multi-State Compliance: Texas in Context

Comparison: Texas vs. Major State AI Laws

| Requirement | Texas (TRAIGA) | Illinois (HB 3773) | NYC (Law 144) | Colorado (SB 205) |
| --- | --- | --- | --- | --- |
| Scope | All employers | All employers | NYC employers only | High-risk AI systems |
| Standard | Intent required | Disparate impact | Disparate impact | Disparate impact |
| AI Disclosure | Not required | Required | Required (10 days) | Required |
| Bias Audit | Not required | Not required | Annual audit required | Impact assessments |
| Biometric Law | CUBI (consent) | BIPA (strict) | None (general) | CPA (consent) |
| Enforcement | AG only (TRAIGA) | IDHR + private | NYC CHR | AG + private (2029) |
| Private Action | No (TRAIGA, CUBI) / Yes (Ch. 21) | Yes (IHRA) | No | Yes (2029) |
| Effective Date | Jan 1, 2026 | Jan 1, 2026 | In effect (2023) | Feb 1, 2026 |

Texas's Unique Position

Texas distinguishes itself by:

  • Light-touch regulation: No audits, no mandatory disclosures (for TRAIGA)
  • Intent-based standard: Higher burden of proof protects employers from outcomes-based liability under TRAIGA
  • Pro-innovation: Regulatory sandbox encourages AI development
  • AG-only enforcement: Reduces litigation risk compared to private-action states

However, don't mistake this for weak enforcement—Chapter 21 and federal law still apply full disparate impact standards, and CUBI creates significant biometric liability.

Practical Compliance Guide for Texas Employers

Step 1: Conduct AI Inventory (Deadline: Before January 1, 2026)

What to Document:

  • Tool name and vendor
  • AI functionality: What does the AI actually do? (screen, score, analyze, predict)
  • Employment decisions affected: Hiring, promotion, performance management, termination
  • Biometric data use: Does it capture facial recognition, voice analysis, or other biometrics?
  • Data inputs: What information does the AI process?
  • Output type: Scores, rankings, recommendations, or automated decisions?
  • Human review: How do humans use AI output in final decisions?

Step 2: Implement CUBI Compliance for Biometric Tools

For Each Biometric AI Tool:

  1. Create biometric notice: Draft clear explanation of what biometric data is collected and why
  2. Design consent form: Separate, explicit opt-in consent (use sample above as template)
  3. Integrate into workflow: Present notice and collect consent before AI use
  4. Document consent: Store signed consent forms with timestamps
  5. Implement security: Encrypt biometric data, restrict access, secure vendor transmission
  6. Establish retention policy: Define deletion timelines and procedures
  7. Train staff: Ensure HR knows when and how to obtain biometric consent
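Steps 3 through 5 above (collect, document, and timestamp consent) can be sketched as a minimal record structure. The field names and in-memory store are illustrative assumptions, not CUBI mandates:

```python
# Minimal sketch of a timestamped biometric-consent record (illustrative only).
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class BiometricConsent:
    candidate_id: str
    data_types: list[str]       # e.g. ["facial geometry", "voiceprint"]
    purpose: str
    retention_period: str       # the period disclosed in the notice
    notice_version: str = "v1"  # which notice text the candidate saw
    consented_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

consents: list[BiometricConsent] = []  # stand-in for a real consent store

def record_consent(candidate_id: str, data_types: list[str], purpose: str,
                   retention_period: str) -> BiometricConsent:
    """Store consent BEFORE any biometric capture; timestamped for audit trails."""
    consent = BiometricConsent(candidate_id, data_types, purpose, retention_period)
    consents.append(consent)
    return consent

c = record_consent("cand-001", ["facial geometry"],
                   "video interview assessment", "hiring process + 3 years")
print(c.candidate_id, c.notice_version)  # cand-001 v1
```

In a production system the store would be a database with access controls and encryption, per the security practices listed earlier.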

Step 3: Mitigate Intentional Discrimination Risk (TRAIGA Compliance)

Documentation Review:

  • Audit internal communications: Review emails, Slack messages, meeting notes discussing AI tool selection or configuration
  • Remove problematic language: Eliminate references to protected characteristics in AI requirements
  • Document legitimate criteria: Clearly articulate job-related reasons for AI design choices

AI Configuration Review:

  • Check for protected class inputs: Ensure AI doesn't directly consider race, sex, age, etc.
  • Evaluate proxy variables: Review whether zip code, school names, or other proxies correlate with protected classes
  • Test for bias: Even though not required under TRAIGA, testing helps avoid Chapter 21/federal liability

Vendor Due Diligence:

  • Request vendor bias testing reports
  • Review vendor compliance documentation
  • Include TRAIGA and CUBI compliance obligations in vendor contracts
  • Require vendors to notify you of algorithm changes

Step 4: Prepare for Chapter 21 Disparate Impact Claims

Even if you comply with TRAIGA's intent standard, protect against traditional discrimination claims:

Voluntary Bias Testing:

  • Analyze selection rates: Compare AI outcomes by race, sex, age (if data available)
  • Calculate impact ratios: Use four-fifths rule (80% benchmark)
  • Statistical significance testing: Confirm whether differences are statistically meaningful
  • Document remediation: If bias found, document corrective action taken
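The four-fifths (80%) rule from the list above can be computed directly. The group names and counts below are hypothetical:

```python
# Illustrative four-fifths (80%) rule check on hypothetical screening outcomes.

def selection_rate(selected: int, applicants: int) -> float:
    return selected / applicants

def four_fifths_check(rates: dict[str, float]) -> dict[str, bool]:
    """Flag groups whose selection rate falls below 80% of the highest rate."""
    benchmark = max(rates.values())
    return {group: rate / benchmark >= 0.8 for group, rate in rates.items()}

rates = {
    "group_a": selection_rate(60, 100),  # 0.60
    "group_b": selection_rate(40, 100),  # 0.40
}
# group_b fails: 0.40 / 0.60 ~ 0.67, below the 0.8 benchmark
print(four_fifths_check(rates))  # {'group_a': True, 'group_b': False}
```

A failed check is a screening signal, not proof of a violation; statistical significance testing and validation (next list) still matter.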

Job-Relatedness Validation:

  • Ensure AI criteria predict actual job performance
  • Conduct or obtain validation studies (content, criterion, or construct validity)
  • Document business justification for AI use
  • Explore less discriminatory alternatives if disparate impact found

Human Oversight:

  • Never allow AI to make final decisions without human review
  • Train reviewers to override biased AI recommendations
  • Document when humans deviate from AI output (and why)

Step 5: Establish Recordkeeping Systems

Records to Maintain:

  • AI inventory and tool documentation
  • CUBI biometric consents (with timestamps and versions)
  • Vendor contracts and compliance documentation
  • Bias testing reports and remediation efforts
  • AI design and configuration documentation
  • Training records for HR staff on AI compliance
  • Candidate inquiries about AI use and employer responses
  • System change logs documenting AI updates or modifications

Retention Periods:

  • 3 years minimum: Align with federal recordkeeping requirements
  • CUBI consents: Retain for duration of retention period disclosed + 3 years
  • Litigation hold: Preserve all records if complaints filed
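The "disclosed retention period + 3 years" guideline for CUBI consents above amounts to a simple date calculation. The helper name is ours, and the example date is arbitrary:

```python
# Sketch: deletion deadline = disclosed retention end + 3-year buffer.
from datetime import date

def deletion_deadline(retention_end: date, extra_years: int = 3) -> date:
    """Disclosed retention end plus the suggested 3-year recordkeeping buffer."""
    return retention_end.replace(year=retention_end.year + extra_years)

print(deletion_deadline(date(2026, 6, 30)))  # 2029-06-30
```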

Step 6: Train HR and Hiring Teams

Training Topics:

  • TRAIGA overview: What intentional AI discrimination means
  • CUBI requirements: When and how to obtain biometric consent
  • Chapter 21 basics: Traditional discrimination law still applies
  • Tool identification: Which AI systems require compliance measures
  • Candidate communications: How to explain AI use when asked
  • Escalation procedures: When to involve legal/compliance teams

Roles to Train:

  • HR generalists and recruiters
  • Hiring managers and interview panelists
  • Talent acquisition specialists
  • IT/HR systems administrators
  • Third-party recruiters or staffing partners

Step 7: Monitor for Compliance and Regulatory Updates

Ongoing Monitoring:

  • Quarterly AI audits: Review tools for bias and compliance
  • Vendor check-ins: Confirm continued compliance, request updated testing
  • Regulatory tracking: Monitor Texas AG guidance, TWC updates, court decisions
  • Policy updates: Revise AI policies as laws or tools change

Frequently Asked Questions

Do small businesses (under 15 employees) have to comply with TRAIGA?

Yes. TRAIGA applies to all employers, regardless of size. This is different from Chapter 21 (which applies to 15+ employee employers). Even a 2-person startup using AI in hiring must comply with TRAIGA's prohibition on intentional discrimination and CUBI's biometric requirements.

What's the difference between TRAIGA and Chapter 21 when it comes to AI?

TRAIGA: Prohibits intentional AI discrimination. Requires proof that the employer designed or deployed AI with the intent to discriminate. Applies to all employers. Enforced by AG only.

Chapter 21: Prohibits both intentional discrimination (disparate treatment) and outcomes-based discrimination (disparate impact). Applies to 15+ employee employers. Enforced via TWC investigation + private lawsuits.

Bottom line: You can comply with TRAIGA (no intent to discriminate) but still violate Chapter 21 if your AI produces biased outcomes.

If my AI vendor provides bias audits, am I protected?

Vendor audits are helpful but don't eliminate your liability. Under Texas law, employers are responsible for discriminatory outcomes even if a third-party vendor created the AI. Use vendor audits as part of due diligence, but conduct your own internal monitoring and validation.

Do I need to disclose AI use to candidates under TRAIGA?

No, TRAIGA does not require disclosure for private employers. However:

  • If using biometric AI, you must provide CUBI notice and obtain consent
  • Voluntary disclosure is best practice (builds trust, prepares for future regulation)
  • If you also operate in Illinois, NYC, or Colorado, those states' disclosure requirements apply

Can applicants sue me directly under TRAIGA?

No. TRAIGA does not provide a private right of action—only the Texas Attorney General can enforce it, and the same is true of CUBI. However, applicants can still sue under:

  • Texas Labor Code Chapter 21 (discrimination claims)
  • Federal law (Title VII, ADA, ADEA)

How does the AG prove "intent" under TRAIGA?

The AG may use:

  • Direct evidence: Internal documents, emails, meeting notes showing discriminatory purpose
  • AI design: Explicit use of protected characteristics or proxies
  • Pattern evidence: Repeated discriminatory outcomes despite knowledge of bias
  • Reckless indifference: Ignoring obvious discriminatory patterns

Courts haven't yet interpreted TRAIGA's intent standard, but expect it to be similar to federal intentional discrimination standards.

What if I discover my AI has been producing biased outcomes?

Take immediate action:

  1. Stop using the tool until bias is addressed
  2. Document the discovery and corrective steps (shows good faith)
  3. Notify legal counsel to assess exposure
  4. Review past decisions to identify affected individuals
  5. Work with vendor to fix algorithms or switch providers
  6. Consider remediation for impacted applicants (e.g., reconsidering rejections)
  7. Update monitoring to catch future issues earlier

Discovering and fixing bias proactively demonstrates lack of intent (helps TRAIGA defense) and shows good faith (helps Chapter 21/federal defenses).

Do I need a lawyer to comply with Texas AI laws?

Not necessarily, but legal counsel is recommended if:

  • You're implementing high-risk AI (auto-reject tools, video analysis)
  • You operate in multiple states with conflicting AI laws
  • You receive a discrimination complaint or AG inquiry
  • Your AI vendor cannot provide adequate compliance documentation
  • You're unsure whether specific tools trigger CUBI biometric requirements

Should I wait for Texas AG guidance before implementing compliance measures?

No. TRAIGA is effective January 1, 2026, regardless of whether detailed AG guidance exists. Implement baseline compliance now:

  • Audit AI tools for intentional discrimination risk
  • Implement CUBI biometric consent processes
  • Conduct voluntary bias testing
  • Document good-faith compliance efforts

You can refine processes as AG guidance emerges, but don't delay basic compliance.

Can I participate in the TRAIGA regulatory sandbox?

Yes, if you're developing or testing innovative AI systems for employment. The sandbox provides:

  • Temporary regulatory relief during testing periods
  • Technical assistance from state agencies
  • Expedited approval processes

Contact the Texas Workforce Commission or Attorney General's office for application details. The sandbox is designed for companies actively innovating in AI, not for large-scale deployment of existing tools.

What about gig workers and independent contractors—does TRAIGA apply?

TRAIGA applies to "employment decisions," which could include contractor relationships if there's an employment-like relationship. Texas courts use multi-factor tests to determine employment status. If you use AI to select, evaluate, or terminate gig workers, consult counsel to assess whether TRAIGA applies.

I'm already compliant with Illinois BIPA. Does that satisfy Texas CUBI?

Generally yes, but with nuances:

  • Notice and consent: BIPA requirements exceed CUBI—if compliant with BIPA, CUBI is satisfied
  • Retention schedules: BIPA requires written schedules; CUBI requires disclosure but not formal policy
  • Security: BIPA's "same or more protective" standard is stricter than CUBI's "reasonable care"
  • Damages: BIPA provides statutory damages ($1K-$5K) through private suits; CUBI exposure runs through AG civil penalties (up to $25,000 per violation)

Recommendation: If you have a BIPA compliance program, add Texas-specific documentation (CUBI consent forms referencing Texas law) but keep the same substantive processes.

Compliance Checklist for Texas Employers

Before January 1, 2026:

  • ☐ Complete AI inventory (all tools used in employment decisions)
  • ☐ Identify biometric AI tools and draft CUBI consent forms
  • ☐ Audit AI tools for intentional discrimination risk (TRAIGA)
  • ☐ Conduct voluntary bias testing (Chapter 21/federal mitigation)
  • ☐ Implement biometric notice and consent workflow
  • ☐ Update vendor contracts with TRAIGA/CUBI compliance obligations
  • ☐ Train HR staff on Texas AI laws (TRAIGA, CUBI, Chapter 21)
  • ☐ Establish recordkeeping systems for AI documentation
  • ☐ Review internal communications for problematic language
  • ☐ Designate compliance owner for AI employment practices

After January 1, 2026 (Ongoing):

  • ☐ Obtain CUBI biometric consent before using facial recognition/voice AI
  • ☐ Document all biometric consents with timestamps
  • ☐ Monitor AI outcomes for bias (quarterly reviews recommended)
  • ☐ Update AI inventory when new tools deployed
  • ☐ Track Texas AG guidance and enforcement actions
  • ☐ Respond promptly to candidate inquiries about AI use
  • ☐ Audit vendor compliance annually
  • ☐ Update training materials as regulations evolve
  • ☐ Maintain comprehensive AI documentation (3+ year retention)
  • ☐ Consider regulatory sandbox participation for innovative AI

How EmployArmor Helps Texas Employers

EmployArmor provides end-to-end Texas AI compliance automation:

  • AI inventory management: Identify and track all employment AI tools
  • CUBI consent automation: Generate, deliver, and document biometric consents
  • Bias monitoring: Proactive disparate impact analysis (mitigates Chapter 21 risk)
  • TRAIGA compliance: Document good-faith efforts and absence of discriminatory intent
  • Multi-state support: Coordinate Texas requirements with IL, CO, NYC, and federal obligations
  • Vendor management: Track vendor compliance, audit reports, and contract obligations
  • Regulatory monitoring: Real-time alerts on Texas AG guidance and enforcement
  • Audit-ready reporting: Generate compliance documentation for investigations or audits

Hiring in Texas? Ensure TRAIGA Compliance

Get Your Free Texas Compliance Assessment →

Disclaimer: This content is for informational purposes only and does not constitute legal advice. Employment laws vary by jurisdiction and change frequently. Consult a qualified employment attorney for guidance specific to your situation. EmployArmor provides compliance tools and resources but is not a law firm.

Ready to get compliant?

Take our free 2-minute assessment to see where you stand.