Disclo vs. “Black-Box” ADS Tools: Staying Human-Centered — and FEHA-Compliant — in the Age of AI

California’s FEHA regulations now target “black-box” AI in hiring. Learn how automated-decision systems (ADS) create legal risks and why a human-driven workflow tool like Disclo is the key to compliance.

Imagine this scenario: a talented applicant with a minor hand tremor completes a mandatory online typing test for a promising new job. An automated-decision system (ADS) processes the results and, without human review, flags the candidate’s score as “below threshold,” effectively ending their application. No one ever asks why the score was low or whether an accommodation was needed. Under newly finalized regulations implementing California’s Fair Employment and Housing Act (FEHA), this silent, automated rejection could expose an employer to significant legal liability.

The new regulations, which take effect on October 1, 2025, place direct responsibility on employers for the automated tools they use. They signal a clear mandate from regulators: technology can assist, but it cannot replace human oversight, especially when it comes to disability and equal opportunity. This is precisely why we created Disclo—not to make decisions, but to facilitate a transparent, structured, and human-guided accommodation workflow when it matters most.

What the FEHA regulations now define as an ADS

The amended FEHA regulations cast a wide net, defining an “automated-decision system” as any computational process that uses machine learning, AI, or statistical modeling to make decisions or facilitate human decision-making. This includes a broad range of tools that have become common in modern recruitment:

  • Résumé screeners that parse and rank candidates.
  • Chatbots that conduct initial interviews.
  • Video interview analysis software that scores facial expressions or vocal tone.
  • Gamified assessments that measure cognitive ability or reaction time.

If such a tool is used to screen, hire, or promote, it qualifies as an ADS under the law.

The hidden risks of “black-box” algorithms

Many ADS tools operate as “black boxes,” where the internal logic is proprietary and opaque. Employers may not know precisely why a candidate was rejected, making it nearly impossible to defend against claims of discrimination. These systems can inadvertently penalize individuals with disabilities. For example:

  • An AI analyzing vocal tone may misinterpret the speech patterns of a neurodivergent individual.
  • A reaction-time game could unfairly disadvantage someone with a motor impairment.
  • A system requiring prolonged screen focus might screen out an applicant with a vision-related disability.

Under the new FEHA rules, relying on such a tool without offering an alternative could be a costly mistake.

The new accommodation mandate: A proactive approach

A cornerstone of the updated regulations is the requirement for proactive accommodation. If an ADS evaluates traits that could be affected by a disability, employers are now obligated to provide a reasonable accommodation or a non-AI alternative before using the ADS to make a decision. This shifts the burden onto the employer to create an “off-ramp” for applicants who may be unfairly assessed by an algorithm.
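
To make the “off-ramp” idea concrete, here is a minimal sketch, in Python, of how a hiring pipeline might route applicants before any automated scoring takes place. Everything here is hypothetical: the names, fields, and routing rule are ours for illustration, not Disclo’s product or the regulation’s text.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Route(Enum):
    ADS_SCORING = auto()   # automated assessment proceeds
    HUMAN_REVIEW = auto()  # non-AI alternative path


@dataclass
class Applicant:
    name: str
    accommodation_requested: bool  # did the applicant ask for an accommodation?
    alternative_offered: bool      # was a non-AI option communicated up front?


def choose_route(applicant: Applicant) -> Route:
    """Pick the path before any ADS scoring happens.

    The off-ramp must exist ahead of the decision: if the applicant
    requested an accommodation, or was never offered an alternative,
    a human reviews the assessment instead of the algorithm.
    """
    if applicant.accommodation_requested or not applicant.alternative_offered:
        return Route.HUMAN_REVIEW
    return Route.ADS_SCORING


# An applicant who requested an accommodation is routed to a person.
print(choose_route(Applicant("A. Candidate", True, True)))  # Route.HUMAN_REVIEW
```

The detail that matters is the order of operations: the alternative path is evaluated before the algorithm ever scores anyone, not after a rejection has already gone out.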

This is where the distinction between a decision-making ADS and a workflow tool becomes critical.

Side-by-side: ADS tools vs. Disclo

Feature | Typical “Black-Box” ADS Platform | Disclo’s Human-Centered Platform
Primary function | Automates and makes decisions (hire/reject, promote/pass). | Automates documentation and workflow; humans make all final decisions.
Transparency | Opaque scoring; logic is often a trade secret. | Provides a full, visible audit trail for managers and auditors.
Accommodation pathway | Often an afterthought or completely absent. | Built in from the start; employees can request assistance at any point.
Bias mitigation | Relies on vendor-provided, often unclear, bias testing. | No algorithmic scoring based on disability; designed for human review.
Record retention | Varies by vendor; the employer is responsible for capturing outputs. | Immutable logs are retained for over four years, exceeding FEHA’s new rule.

Why this distinction is your best defense

Liability for ADS-driven discrimination falls on both the employer and the vendor. By choosing a workflow management tool like Disclo, which is not an ADS, employers can dramatically limit their legal exposure. Disclo provides a centralized, efficient system for managing accommodation requests without handing over the critical decisions that impact people's livelihoods to an algorithm. You get the benefits of streamlined processes without the risks of automated bias.

Action steps for HR and legal teams before October 1

  1. Inventory your HR tech stack: Go through every tool used in your hiring and promotion processes and clearly label each one as either a decision-making ADS or a workflow-assistance tool (a toy sketch of such an inventory follows this list).
  2. Create accommodation “off-ramps”: For every tool identified as a true ADS, establish and communicate a clear path for applicants to request an accommodation or an alternative assessment. This process must be in place by October 1, 2025.
  3. Prioritize documentation: FEHA now mandates a four-year retention period for all records related to ADS use and the associated personnel decisions, including the data used by the system and its outputs. Disclo’s platform is designed to meet this requirement automatically.
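
For teams who think in schemas, the inventory and retention steps above can be modeled in a few lines of Python. This is a toy sketch under our own assumptions: the field names are invented for illustration, and the day-based approximation of four years is ours, not FEHA’s or Disclo’s.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Rough stand-in for FEHA's four-year floor; real compliance should
# count calendar years, not a fixed number of days.
FOUR_YEARS = timedelta(days=4 * 365)


@dataclass
class HRTool:
    name: str
    vendor: str
    is_ads: bool                # makes or facilitates employment decisions?
    has_off_ramp: bool = False  # accommodation / non-AI alternative in place?


def audit(inventory: list[HRTool]) -> list[str]:
    """Flag every tool that still needs attention before October 1."""
    return [
        f"{tool.name}: ADS with no accommodation off-ramp"
        for tool in inventory
        if tool.is_ads and not tool.has_off_ramp
    ]


def keep_until(record_created: date) -> date:
    """Earliest date an ADS-related record could be purged."""
    return record_created + FOUR_YEARS


stack = [
    HRTool("Resume ranker", "VendorA", is_ads=True),
    HRTool("Accommodation workflow", "Disclo", is_ads=False),
]
print(audit(stack))                   # ['Resume ranker: ADS with no accommodation off-ramp']
print(keep_until(date(2025, 10, 1)))  # 2029-09-30
```

In practice the is_ads call is a legal judgment, not a boolean you can compute, but writing the inventory down in a structured form is what makes the October 1 deadline auditable.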

Conclusion

The message from California’s regulators is unambiguous: technology is a powerful servant but a poor master. In the new era of FEHA compliance, people must remain in control of employment decisions, particularly when disability accommodations are involved. Disclo is committed to this human-centered model, ensuring that your accommodation process is transparent, auditable, and driven by people, not opaque algorithms. By keeping humans in the loop, you ensure compliance doesn’t depend on trying to decode a black-box score.

Book a demo of Disclo to learn more.
