Managing inclusivity debt
My life is extraordinarily good—I feel lucky and hardly notice my blindness, though I love calling it an affliction. I can accept bad infrastructure, but not lazy digital inaccessibility. Conversational voice AI can finally repay the world’s vast inclusivity debt.
This post concludes with an ask, so please do read on.
First, some context about me. Last week a new acquaintance, who has already become a friend for life, asked, “How do you stay so positive despite your affliction?” I could literally hear him choking as he tried to suck the words back from the ether, appalled at his own language.
But my blindness *is* an affliction, and one I would seriously rather not have. Referring to me as visually divergent, or differently abled, does not change that reality. Personally, I would sooner people called a spade a spade than skirted questions and issues, expending more effort on using the right words than on doing the right things.
For me, blindness is just one of the cards in the hand I was dealt at birth, which also included the Ace of Spades (a brain combining good memory, unusual focus and dogged determination with what seems to be relatively quick wit). Over my lifetime, that hand has been augmented by several kings in the form of amazing friends and colleagues, an incredibly caring and beautiful queen of hearts (you know who you are), and another ace (financial independence before the age of 40).
So amongst that hand of riches, the very tattered and admittedly shitty 2 of clubs, which is my blindness, honestly feels like a very, very minor negative.
Since I embraced my white cane as a light sabre rather than a personal badge of shame, I have been constantly stunned by the humanity of those around me. On two or three occasions, a commuter helping me to my train has offered to accompany me on my journey, two hours out of their way, simply because they had time available and wanted to help.
Overall, I feel simultaneously immensely privileged to have been dealt such a good hand in the round, and hugely grateful for the sheer humanity of friends, colleagues and complete strangers alike.
I also recognise that adapting ageing physical infrastructure to make it blind-friendly is extremely hard and often makes no economic sense. But what does make me seethe with frustration is the unnecessary exclusion of people with vision impairments from the digital world. Digital technology should really, really be mitigating the challenges inherent in the physical world, not adding to the barriers that people with disabilities face.
Exclusion can sometimes be a conscious initial choice. I assume that mobile phone companies in the early 90s realised that SMS messages would not be accessible to blind phone users, because early handsets could not run speech synthesis. It would have been possible to offer a service where blind users could forward their text messages to a network service (software or human) that read the message aloud. But as far as I know, no major network offered this from day one, quite likely because SMS was just a low-budget hack, a novelty; no one expected that texting would drive a profound social change in which young people abandoned speaking in favour of exercising their thumb muscles.
Once the exclusion of blind phone users had happened, nobody had the will or moral compass to right that wrong for years to come.
Other times, exclusion comes about through inadequate accessibility testing, or through a conscious trade-off between getting a mainstream product to market as quickly as possible and adding accessibility tags to user interface elements. Check out the cover image, which is no exaggeration: I have genuinely experienced this sort of mindless and unnecessary barrier on several occasions.
I call the long-term persistence of inclusion barriers introduced in the heat of product development 'inclusivity debt', by analogy with 'technical debt', a concept familiar to all software engineers.
Whether it is incurred knowingly or unknowingly, technical debt ranges from inadequate documentation, through non-critical bugs, to short term design choices which ultimately limit scalability, resilience or future enhancement.
Technical debt creates increasing pain for development teams as the software rots, so it is always in their face, and they will periodically delay urgent feature releases in favour of paying down accrued technical debt before the entire house of cards collapses. In contrast, unfixed inclusivity debt impacts only product usage by a largely invisible minority, not the engineers themselves, and it is unlikely to affect the product's mainstream reputation. Consequently, poor accessibility persists, getting harder and harder to fix as a product matures.
For obvious personal reasons my focus is predominantly inclusivity for the vision impaired, but barriers to inclusion exist for many, many groups. Although my examples and arguments will usually come from a vision-impaired (VI) perspective, I am not in any way arguing for the rights of the VI community relative to any other users excluded by unthinking or uncaring choices made during digital product development. This is a complex issue, due to the diversity of diversity, but it is important, and we have a generational opportunity to fix it.
Companies with valuations in the hundreds of billions or trillions of dollars will unavoidably accrue inclusivity debt as they race to release new features, acquire more customers or comply with tough regulations. But with those outsize valuations and their frontier technology, the AI behemoths are uniquely well placed to drive a generational shift in digital access.
There are well established standards for web access, such as WCAG, ARIA and many others.
There is no excuse for major companies not conforming to these guidelines. OpenAI are by no means the only culprit here, but it is somewhat ironic that when ChatGPT provided me with an excellent summary of the current standards, it simultaneously exposed a button in its own interface labelled 'Unlabelled 0 Button'. I genuinely have no idea what that button does, any more than I understand the buttons that appear from time to time in which the '0' is replaced by other random digits that seem to increase as the conversation progresses.
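For readers who have never met an 'Unlabelled Button', it usually means an icon-only button shipped with no accessible name: no visible text and no ARIA label, so a screen reader has nothing to announce. The fix is typically a one-attribute change. Below is a minimal toy sketch of the idea in TypeScript; the regex-based `hasAccessibleName` checker is my own simplification for illustration, not a real audit tool (real tooling resolves names per the WAI-ARIA accessible name computation), but the `aria-label` attribute it looks for is the genuine standard mechanism.

```typescript
// Toy check: does a <button> snippet give a screen reader anything to say?
// An accessible name can come from visible text content or an aria-label.
// (Regex HTML parsing is a deliberate simplification for this sketch.)
function hasAccessibleName(buttonHtml: string): boolean {
  // 1. An explicit aria-label attribute with non-empty content.
  const ariaLabel = /aria-label\s*=\s*"(.+?)"/i.exec(buttonHtml);
  if (ariaLabel && ariaLabel[1].trim().length > 0) return true;
  // 2. Otherwise, visible text inside the button, after stripping
  //    inner markup such as an <svg> icon.
  const inner = /<button[^>]*>([\s\S]*?)<\/button>/i.exec(buttonHtml);
  return !!inner && inner[1].replace(/<[^>]*>/g, "").trim().length > 0;
}

// An icon-only button with no label: announced as just "button",
// or as a placeholder like "Unlabelled 0 Button".
console.log(hasAccessibleName('<button><svg></svg></button>')); // false

// The one-attribute fix that retires this piece of inclusivity debt.
console.log(hasAccessibleName(
  '<button aria-label="Copy code"><svg></svg></button>')); // true
```

The point is not the checker itself but how small the debt repayment is: one attribute per button, trivially cheap at development time, and vastly more expensive to hunt down across a mature product later.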
Conforming to every web accessibility standard under the sun will not make an interface accessible if it has not been exhaustively tested by the diverse user groups which the standards try to accommodate.
And even then, skilled blind users waste inordinate amounts of time and energy exploring and navigating a fundamentally visual interface through dozens of arcane key combinations and lengthy audio renditions of a complex screen that was designed to be visually scanned, not read word by word.
Fortunately we have a much more profound opportunity. The increasing importance of AI agents navigating websites on behalf of users is driving structural changes to browsers and web pages to ensure the agents make best use of the information they find.
As web pages, browsers and large language models co-evolve over the coming months and years, there is a genuine opportunity to improve accessibility with conversational voice agents, allowing blind users to access online content just as well as sighted users.
It would be a real missed opportunity if the VI community were not properly engaged in this rapid evolution; without that engagement, we risk yet another unintended backward step in the dance between inclusion and digital technology.
Today's massive global mountain of inclusivity debt is enough to make any chancellor weep. I would love to share some specific thoughts with the frontier AI providers on the use of co-developed conversational voice agents to manage and ultimately retire it.
My ask: if you know any execs at OpenAI, Anthropic, Google, Apple, Amazon, xAI, Microsoft or Meta who own accessibility at the strategic level, I would be incredibly grateful for a connection, so please do email or message me.