Scale
Inaccessible patterns spread faster than they can be reviewed and fixed. Accessible code is underrepresented in training data, and AI vendors' reinforcement-learning pipelines rarely reward accessibility.
Public Digital Access Crisis
This is not a minor bug trend. It is a systemic civil-rights failure that has now spread into software as a whole, through lightning-fast adoption of AI systems trained on more than 20 years of institutional barriers.
Accessibility is a baseline for participation in modern life. When developers automate UI development, they automate exclusion from rights, services, and opportunity.
High-impact services deploy AI-generated code in systems people depend on daily: education, healthcare, finance, and employment.
As in the years before AI, almost nobody talks about digital accessibility. In the public eye, it is seen as a concern only for disabled people. But it is also a matter of quality, of equality, and of society at large.
Treat accessibility as enforceable law for AI systems: hold vendors accountable, and enforce penalties for non-compliance. If you don't know where to start, contact us! We're happy to help, and we have plenty of experience.
Refuse to ship discrimination. Block releases on failed accessibility checks, repair the code, and protect users before launch. Automated accessibility checks are easy to add to a CI pipeline (see the sketch below), and from there you can build up to manual testing.
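One common way to set up such a release gate is to scan each page with axe-core inside an end-to-end test and fail the build on any violation. The sketch below is a minimal example using Playwright with the @axe-core/playwright package; the URL and test name are placeholders for your own application.

```ts
// Fails the CI run if axe-core finds any accessibility violations.
// Assumes @playwright/test and @axe-core/playwright are installed;
// https://example.com/ is a placeholder for your own application URL.
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('home page has no detectable accessibility violations', async ({ page }) => {
  await page.goto('https://example.com/');

  // Scan the rendered page against the WCAG 2.0/2.1 A and AA rule sets.
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
    .analyze();

  // An empty violations array is the release gate:
  // any finding blocks the build until the code is repaired.
  expect(results.violations).toEqual([]);
});
```

Automated scans catch only a subset of barriers, so treat a green run as a floor, not a pass. Manual testing with assistive technology remains essential.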
Your experience is not edge-case feedback. It is frontline evidence. Report barriers, demand public timelines for repair, and hold your institutions accountable.
Stop externalizing harm. Raise your default accessibility quality by a wide margin, and accept independent scrutiny. Work with disabled users on this!