Further thoughts about Jakob Nielsen's statement that accessibility has failed

Published on July 2, 2024

Summary

AI cannot fix accessibility until the people who program it understand that accessibility is about disabled people; as long as systemic ableism continues to exist, it won't be able to.

“Traditional accessibility methods are dead.” That’s the take-away I, and so many other people in the digital accessibility community, took from an article Jakob Nielsen wrote in February 2024. I said on LinkedIn that I couldn’t trust anything Nielsen writes anymore because of this take. A response to my comment led me to write this.

All y'all are looking at accessibility through the wrong lens.
[Image: Human-shaped robot sitting on a wooden bench, reading a book. Photo by Andrea De Santis on Unsplash.]

Changing the paradigm

Accessibility hasn’t failed. People and organizations have failed to implement accessibility. That’s a significant difference. I’ve said it before and I’ll keep saying it: accessibility is a tech solution to a human problem. As long as we keep looking at accessibility as a “technical hotfix”, we are doomed to fail. Digital accessibility is not, has never been, and never will be, a technical issue.

Here’s what Nielsen says:

Traditional methods for accessibility have been tried for 30 years without substantially improving computer usability for disabled users. It’s time for a change, and AI will soon come to the rescue with the ability to generate a different user interface for every user, optimized for that person’s unique needs.

Jakob Nielsen - Accessibility Has Failed: Try Generative UI = Individualized UX

Nielsen outlines two main reasons to explain why “accessibility has failed”:

  • It’s too expensive
  • It creates substandard user experiences

He wants to use AI to create individualized user experiences. Fine. I’m sure some of that could help. I do keep saying that a great way to improve accessibility is to put users in control of their preferences. So in some ways, I agree here.

AI ain’t the future

But, as long as we keep looking at digital accessibility as solely a tech problem, we are, as Nielsen says, doomed to create substandard user experiences.

I’m not looking at AI to solve all our accessibility problems. In fact, it’s no secret that I believe we’re overly relying on AI for so many things. Quite aside from the environmental impact of AI, the fact is, it’s just not ready for prime time. Maybe some day it will be reliable enough, but right now, it’s not.

To suggest we can rely on generative AI to make everything more accessible is… dreaming. To think AI can fix accessibility is to look at accessibility issues as a tech problem.

Accessibility is too expensive? I think not!

Sure, accessibility costs money. There’s no hiding the fact that getting accessibility done costs something.

But it’ll cost you a lot more to retrofit accessibility into your experience than to build it in from the start.

Building a house with a step-free entrance and wide enough doors costs negligibly more than building one with steps and narrow doors. What is expensive is removing steps and widening doors after the fact.

That’s your problem: your designers don’t know accessibility. Your developers don’t know accessibility. Your QA teams don’t know accessibility. So you build things that aren’t accessible from the start. Then you have to fix things, typically in a hurry. And the costs mount up.

It wouldn’t be much more expensive if accessibility were in the organization’s DNA, though, because you’d just be doing it.

If we looked at security and performance through the same “it’s expensive” lens, we’d have to leave those alone too. But of course, no one is talking about not implementing those aspects of digital properties.

Systemic ableism

Those of us in the disability community who have done any level of advocacy for accessibility and inclusion, whether in the built environment, employment, or the digital world, know the problem is attitudinal. The problem is society. The problem is systemic ableism.

I’d go further and suggest that Nielsen’s piece reflects this utter ableism.

We must first understand our users. But we must also want to make things more accessible. Slapping some generative UI on a platform will not make a site or application any more accessible than using an accessibility overlay.

Until society sees and thinks about disabled people as people, we are doomed to not have an accessible digital world. Until society considers accessibility a civil right, we are doomed to not have an accessible digital world.

Accessibility: It’s about people. It’s not about tech. Accessibility is about disabled people trying to function in a broken system that wasn’t designed with accessibility in mind.

People who program and teach AI have a big responsibility. As long as their beliefs and understanding of disability reflect the current systemic ableism, AI will be doomed to fail at accessibility. AI can only ever be good for accessibility when the people who teach and program it have internalized a couple of concepts:

  • Disabled people are people.
  • Disabled people have needs.