Imagine this scenario: a talented applicant with a minor hand tremor completes a mandatory online typing test for a promising new job. An automated-decision system (ADS) processes the results and, without human review, flags the candidate’s score as “below threshold,” effectively ending their application. No one ever asks why the score was low or if an accommodation was needed. Under newly finalized amendments to California’s Fair Employment and Housing Act (FEHA), this silent, automated rejection could expose an employer to significant legal liability.
The new regulations, which take effect on October 1, 2025, place direct responsibility on employers for the automated tools they use. They signal a clear mandate from regulators: technology can assist, but it cannot replace human oversight, especially when it comes to disability and equal opportunity. This is precisely why we created Disclo—not to make decisions, but to facilitate a transparent, structured, and human-guided accommodation workflow when it matters most.
The amended FEHA casts a wide net, defining an "automated-decision system" as any computational process that uses machine learning, AI, or statistical modeling to make decisions or facilitate human decision-making. This includes a broad range of tools that have become common in modern recruitment:
- Resume screeners that rank, score, or filter applicants
- Computer-based assessments such as tests, questions, games, and puzzles
- Video-interview platforms that analyze facial expressions, word choice, or tone of voice
- Tools that target or direct job advertisements to particular groups of candidates
If these tools are used to screen, hire, or promote, they are considered an ADS under the law.
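To make the definition concrete, consider how little it takes to fall within it. The sketch below is purely hypothetical (the function, the 40-words-per-minute cutoff, and the candidate data are ours, not the regulation's), but it captures the pattern from the opening scenario: a computational process compares a score to a threshold and issues the decision with no human in the loop.

```python
from dataclasses import dataclass

# Hypothetical illustration only: a minimal automated screen of the kind
# FEHA would treat as an ADS, because a computational process makes the
# hiring decision with no human review.

WPM_THRESHOLD = 40  # illustrative cutoff, not a real benchmark

@dataclass
class TypingResult:
    candidate_id: str
    words_per_minute: float
    error_rate: float

def automated_screen(result: TypingResult) -> str:
    """Returns a final decision directly from the score: the
    'silent rejection' pattern described in the opening scenario."""
    if result.words_per_minute < WPM_THRESHOLD or result.error_rate > 0.05:
        return "rejected"  # no one ever asks *why* the score was low
    return "advanced"

# A candidate with a hand tremor types 35 WPM and is rejected
# automatically, even though an accommodation might have changed
# the outcome entirely.
print(automated_screen(TypingResult("A-1027", 35.0, 0.02)))  # -> rejected
```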
Many ADS tools operate as "black boxes," where the internal logic is proprietary and opaque. Employers may not know precisely why a candidate was rejected, making it nearly impossible to defend against claims of discrimination. These systems can also inadvertently penalize individuals with disabilities. For example:
- A timed typing or reaction-time test can screen out candidates with motor impairments who could perform the job with a simple accommodation
- Software that scores facial expression or vocal tone can penalize candidates with facial differences, autism, or speech disabilities
- Gamified personality assessments can disadvantage neurodivergent applicants whose response patterns differ from those the model was trained on
Under the new FEHA rules, relying on such a tool without offering an alternative could be a costly mistake.
A cornerstone of the updated regulations is the requirement for proactive accommodation. If an ADS evaluates traits that could be affected by a disability, employers are now obligated to provide a reasonable accommodation or a non-AI alternative before using the ADS to make a decision. This shifts the burden onto the employer to create an "off-ramp" for applicants who may be unfairly assessed by an algorithm.
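In software terms, the off-ramp is a gate that runs before any automated scoring happens. The sketch below is a hypothetical illustration of that sequencing (the function names and fields are ours, not language from the regulation): the accommodation check comes first, and the ADS is invoked only when the candidate has not taken the exit.

```python
# Hypothetical sketch of the "off-ramp" the new rules effectively require:
# the accommodation check runs *before* the ADS is ever invoked.

def evaluate_candidate(candidate: dict, ads_score_fn, human_review_fn):
    # 1. Offer the off-ramp up front, not after an automated rejection.
    if candidate.get("accommodation_requested"):
        # Route to a human-managed process (e.g., an adjusted test,
        # extra time, or a different form of assessment).
        return human_review_fn(candidate)

    if candidate.get("requested_non_ai_alternative"):
        # The non-AI alternative must be a real, available path.
        return human_review_fn(candidate)

    # 2. Only now is the automated assessment run at all.
    return ads_score_fn(candidate)
```

The point is architectural: the accommodation branch must exist and must run first, so no applicant is scored by the algorithm before they have had a genuine chance to request an alternative.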
This is where the distinction between a decision-making ADS and a workflow tool becomes critical.
Liability for ADS-driven discrimination falls on both the employer and the vendor. By choosing a workflow management tool like Disclo, which is not an ADS, employers can dramatically limit their legal exposure. Disclo provides a centralized, efficient system for managing accommodation requests without handing over the critical decisions that impact people's livelihoods to an algorithm. You get the benefits of streamlined processes without the risks of automated bias.
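To see the distinction in concrete terms, compare where the decision is made in each design. The sketch below is a generic illustration of the workflow pattern, not Disclo's actual code or API: the tool creates an auditable record and routes it to a named human reviewer, and an outcome enters the system only as human input.

```python
import uuid
from datetime import datetime, timezone

# Generic illustration of the workflow pattern -- not Disclo's actual
# implementation. The tool records and routes; a person decides.

def submit_accommodation_request(employee_id: str, description: str) -> dict:
    """Creates an auditable record and assigns it to a human reviewer.
    Note what is absent: no model, no score, no automated outcome."""
    return {
        "request_id": str(uuid.uuid4()),
        "employee_id": employee_id,
        "description": description,
        "status": "pending_human_review",        # the tool never sets approve/deny
        "assigned_reviewer": "hr_case_manager",  # a person, not an algorithm
        "submitted_at": datetime.now(timezone.utc).isoformat(),
    }

def record_human_decision(request: dict, reviewer: str,
                          decision: str, rationale: str) -> dict:
    """An outcome enters the system only as human input, with a written
    rationale -- which is what makes the process auditable."""
    request.update(
        status=decision,  # e.g., "approved" or "alternative_offered"
        decided_by=reviewer,
        rationale=rationale,
        decided_at=datetime.now(timezone.utc).isoformat(),
    )
    return request
```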
Conclusion
The message from California’s regulators is unambiguous: technology is a powerful servant but a poor master. In the new era of FEHA compliance, people must remain in control of employment decisions, particularly when disability accommodations are involved. Disclo is committed to this human-centered model, ensuring that your accommodation process is transparent, auditable, and driven by people, not opaque algorithms. By keeping humans in the loop, you ensure compliance doesn’t depend on trying to decode a black-box score.
Book a demo of Disclo to learn more.