Beta Testing the Future of Accessible Travel

Smart glasses and a hotel app that bypasses the touchscreen thermostat — a few days in Silicon Valley make the case that conversational voice agents could finally bring blind accessibility to the physical world. And if everyone uses them, accessibility comes for free.

Image: a smartphone controlling hotel room devices

I love the San Francisco Waymos and I love the vibrancy and density of the tech community here.

On Sunday I met with the CTO of EchoVision, who are developing smart glasses (think Meta Ray-Bans) targeted at vision-impaired users from the outset. All their design decisions are accessibility-first, yet this looks and feels like a mainstream consumer device. The glasses are lightweight and stylish, the audio quality is great and their solution to battery life is streets ahead of the Meta glasses. The founding team have serious tech industry experience and credentials, plus great connections into the Chinese electronics manufacturing community — and it really shows.

Even the packaging oozes quality and the unboxing experience is akin to opening up an Apple product. But I won't need Father Christmas to pop a pair down my chimney, as I'm now a proud participant in their beta programme.

It's early days, but my first impression is that the text recognition and scene interpretation are significantly ahead of Live AI on the Meta Ray-Bans. I can actually envisage wearing these glasses to read a restaurant menu this evening! Note to self: check with the waiter before accidentally ordering a $2,500 bottle of wine because my glasses only said $25.

Travelling here (or indeed anywhere) independently is always an adventure. I could bang on about finding your taxi, finding your room, or accidentally calling the emergency services with the wrong button in the elevator (yesterday's little entertainment — the alarm was a sound to behold, but fortunately the call handler was very understanding).

But sometimes it's the littlest things that irritate me the most, and sometimes those littlest things also give me the greatest joy. So let's talk about temperature control.

In-room thermostats used to have very blind-friendly tactile buttons. OK, I could never read the temperature display, but you can generally feel if the room's too hot or too cold — and if you cannot, then presumably you don't need to adjust the temperature. That is, unless you suffer from chronic thermoception disorder.

For me, the tactile buttons are infinitely more useful than the display. You can imagine my creeping frustration over the last decade as more and more of these simple little gizmos were replaced with utterly unusable touch screens. Doubtless very techno-trendy and possibly a few dollars cheaper, but just one more example of the unintended accessibility consequences of technological 'progress'.

Unsurprisingly, my hotel in Mountain View, the epicentre of Silicon Valley and home of Google, does have techno-trendy touch screen thermostats. But now for the little moment of joy: the hotel's mobile app actually has useful features, including an entirely accessible way to change my room temperature.

This mobile thermostat feature is probably not a direct response to the inaccessibility of the touch screens, but it does highlight a great way to solve diverse accessibility needs.

The accessibility of any physical device controllable from a smart phone is only limited by the accessibility of each user's own smart phone.

With a smart phone tailored to my particular needs (i.e. with a speaking interface), I get, almost for free, a speaking interface to this hotel's thermostats and to any other device with a clean mobile UI.

This is my vision for how conversational voice extends blind accessibility beyond the digital world. I simply say 'turn the air con up a couple of degrees'. My agent already knows the hotel from my calendar and/or my location, so it interrogates the hotel's heating service, makes the change after authentication and reports back.
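To make the flow concrete, here is a minimal sketch of the steps the agent would take — infer the hotel from context, authenticate, apply a relative change, report back. Everything in it is hypothetical: the function names, the stay-verification step and the heating-service call are illustrative stand-ins, not a real hotel or WebMCP API.

```python
from dataclasses import dataclass

@dataclass
class HotelContext:
    name: str
    room: str

def infer_hotel(calendar_events):
    """Pick the current hotel from calendar entries (location lookup omitted)."""
    for event in calendar_events:
        if event.get("type") == "hotel":
            return HotelContext(event["name"], event["room"])
    return None

def adjust_temperature(hotel, delta_celsius, verify_stay):
    """Authenticate, apply a relative temperature change, then report back."""
    if hotel is None or not verify_stay(hotel):
        return "Sorry, I couldn't verify your stay with the hotel."
    current = 20.0  # in reality, read from the hotel's heating service
    target = current + delta_celsius
    # ...here the agent would call the heating service to set `target`...
    return f"Done - room {hotel.room} is now set to {target:.0f} degrees."

# "Turn the air con up a couple of degrees"
hotel = infer_hotel([{"type": "hotel", "name": "Hotel X", "room": "412"}])
print(adjust_temperature(hotel, 2, lambda h: True))
```

The interesting design point is that the agent, not the hotel, owns the user interface — so the same voice front end works for any service that exposes a clean programmatic back end.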

Conceptually this is sort of Alexa or Siri on steroids. But both of those products fell woefully short of their promise, partly because the AI tech simply was not ready and partly because the ecosystem never materialised.

We are now at a watershed. The AI is good enough, and standards like WebMCP enable a joined-up ecosystem of vendors from tech behemoths to AC suppliers.

Adoption into the physical world will of course take time, but I'm optimistic that the smart phone, with a conversational voice agent, may become the way we all access services from airline reservations to in-room heating.

And if everyone does it, then blind accessibility comes for free — which is how it should always be.

On Monday I shared this vision with Chris Fleizach, head of VoiceOver engineering for the iPhone. Over the past 20 years Chris has probably done more than anyone else in the world for blind access to the digital world. I'm hugely grateful to him and salute his massive contribution.

I'm sure a few more days in the Silicon Valley innovation pressure cooker will inspire me with the huge benefits AI can bring to all our lives and to a more accessible world. See, I've already drunk the Kool-Aid, and I've only been here 36 hours.