Conversations and Memories - Introducing Hobson

The lack of seamless conversation and memory has been a huge historical barrier to effective digital assistive solutions. Hobson is a mainstream, voice-first conversational personal assistant with a great memory and profound assistive potential.

A butler saying to a blind man: “I trust this meets with Your Lordship’s approval”

Here are a couple of blind scenarios played out with various forms of assistance. If you’re not familiar with Be My Eyes, check out last week’s post. BME also includes a rudimentary Be My AI feature which analyses images. The Meta Ray-Bans, which I use as the camera for Be My Eyes, have recently introduced a more sophisticated, semi-realtime AI feature. I say semi-realtime because the frame rate is extremely low and the latency is unworkable for feedback while moving through a space. Having said that, Meta Live AI holds significant promise through a combination of the form factor, a conversational voice interface, access to Meta’s formidable AI resources and the company’s commitment to inclusivity.

Scenario: I am holding a box of meds from my bathroom cabinet 

Here are three attempts to identify the meds:

  1. Me: [presses the Be My AI ‘take a photo’ button while looking at the meds]. Be My AI [after 10 seconds of analysis]: “This is a bathroom with tiled walls, a sink, a toilet and a window behind a pot plant and other decorative items.” Me: “No shit, Sherlock.”
  2. Me: “Hey Meta Live AI, what am I holding?” Meta Live AI: “You are holding a box of pills.” Me: “What specific pills?” Meta Live AI: “You are holding a box of 50 milligram Levothyroxine pills.”
  3. Me: “Er, Darling?”, Mrs Mairs: “Thyroxine”

This is a very basic scenario, but it illustrates the importance of conversation in establishing the specific assistive need. In the last 10 years I have seen innumerable demos of allegedly helpful AI image analysis which almost invariably provides too much detail, too little detail or simply the wrong detail. Now that we can have natural voice conversations with our AI, it becomes so much easier to guide the AI to the specific question.

Meta Live AI did very well, although it took me two or three attempts to convince it that I was not asking for medical advice, which it steadfastly refused to go anywhere near.

More subtly, my conversation with my wife was super optimised because we added memory to the conversation. Mrs Mairs knows that ‘Er, darling?’ is code for ‘Please could you help me with…’. She also knows that I’m rubbish at keeping track of my medicines, and she knows that I have two different meds in identical packaging.

Scenario: Crossing the car park

Navigation obviously needs a conversation with lots of feedback, as it is very dynamic and potentially dangerous. It also requires the sort of route following, object identification and collision avoidance that autonomous vehicles use, combined with very succinct, relevant and precise instructions for the blind adventurer. Right now I have not found a reliable AI-powered solution, so Be My Eyes seems the best alternative for augmenting my trusty white stick, although it is still not without its challenges…

Here are two attempts to use Be My Eyes with a human assistant.

  1. Be My Eyes anonymous volunteer: “Turn left to avoid the car.” Me: spontaneous hopping fest, accompanied by a fusillade of F-bombs, while waving my white stick like some sort of demented Morris man. Volunteer: “Oops, sorry, didn’t notice the kerb.”
  2. Mrs Mairs via Be My Eyes: “Turn a bit left to go around the car in front of you, but remember the kerb, which is just to the left of that car.”

These scenarios illustrate the two most critical omissions from most of today’s assistive solutions: conversational voice and memory. This is all changing with the imminent debut of Hobson, who has a conversational voice, a photographic memory and more discretion than a royal equerry.

Hobson is a mainstream digital assistant, highly optimised for use without a visual interface. Think of the ultimate human Personal Assistant. These highly skilled professionals do not have a screen for conveying information to their employer - they use their voice. And their value comes from their detailed knowledge of their employer’s habits, preferences, relationships and history. They treat the information they hold with extreme discretion and understand exactly when to seek confirmation or further instruction. So Hobson’s development is an ambitious (until now, stealth) project drawing on the very best characteristics and behaviours of those human assistants. It is most definitely not a quick vibe-coded lash-up simply wrapping agentic capabilities around state-of-the-art foundation models.

Future posts will go into much more detail on a broad and evolving set of capabilities, but Hobson already formatted and published this post, and will be summarising for me any comments or reposts you’d like to offer on Ghost, X and LinkedIn.

Note: The obsequious butler in the image for this post implies nothing about the gender or ethnicity of the bag of bits and bytes we’re calling Hobson. But I did have fun creating the image, even though it will doubtless encourage yet more regular use of the ironic ‘His Lordship’ amongst Mrs Mairs’ friends, and a prolonged exploration on the therapist’s couch of my fantasy life as a country squire or 18th-century aristocrat.

Over the coming months I hope every one of you will come to know and love Hobson as your own genuinely personal assistant. And feel free to anthropomorphise that personal assistant however feels most natural to you.

I’m extremely excited about what Hobson could be: first and foremost as a mainstream assistant, and secondly, with some judicious extensions, as an assistive agent to shepherd me and other vision-impaired users through the physical and digital worlds, breaking down barriers, acting on my behalf and escorting me safely on this journey.

For clarity, Charles Williamson is the brilliant driving force behind Hobson. I am simply an advisor and first blind user.

If you’re interested in becoming an early mainstream Hobson adopter, please sign up here. This early alpha release of Hobson provides users with a daily digest bringing together their email, calendar, projects and tasks, encouraging focus on the highest-value activities while allowing Hobson to help with the rest.

In the tradition of all new software, the early alpha releases will be buggy, feature-deficient and constantly evolving. So don’t sign up right now if you want a finished product, but if you’re curious, tolerant and willing to provide feedback, then we’d love to hear from you.