The other day I was trying to book an Uber but the app showed that there weren’t any cars near me. I tried another taxi app and – as expected – no cars there either.
I was anxiously waiting, hoping for an Uber to pop up on the map, when my wife called. She asked whether I was already on my way to her, and I told her about the cab situation.
Immediately, she suggested I try a new app called Indriver. I installed it and, bam – I could see a dozen cars. I booked one and it arrived within a couple of minutes. Problem solved.
What just happened? I got exactly the help I needed, exactly when I needed it. Let’s call it ‘contextual help’ – because it depends on the context. What if there were someone who could always provide such help? Impossible, right?
Not really. Google wants to do it by making Google Assistant offer you help without you even asking. Purely based on context.
As an aside, I mentioned in an earlier post a Google patent which showed Google Assistant being available during video conferences, ready to provide any help you ask for – things like running searches, sharing content on screen, and so on.
Today I came across another patent which shows Google Assistant providing contextual help all on its own, even when you are in other apps. Let’s see some examples:
The first one is fairly simple but quite useful. Suppose you are chatting with a friend, finalizing the details of a meetup. As soon as you agree on a time and place, Google Assistant could offer to block your calendars. One less thing to do, right?
Now, pause here and appreciate how different this is from the assistants we have today. Current assistants such as Siri, Google Assistant, and Cortana only do what they are told, and they have very limited access to other apps.
If I ask Siri, for example, to book an Uber for me, it seems to understand what I am asking but responds with “Sorry, I can’t help you with that.” Another example – if I ask it to send a WhatsApp message to a friend, it opens WhatsApp but asks me to type the message myself.
Contrast this with the capability shown in this patent – not only can the assistant “see” and “interpret” what is going on (the chat and what is being discussed), it can also add an event to the calendar.
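To make the calendar step concrete, here is a minimal sketch of what the “add to calendar” half could look like on Android, using the public CalendarContract insert intent. The patent does not describe an implementation; the MeetupDetails class, the offerCalendarEvent function, and the assumption that the chat has already been parsed into a title, place, and time are all my own illustrative choices.

```kotlin
import android.content.Context
import android.content.Intent
import android.provider.CalendarContract

// Hypothetical container for details the assistant might extract from the chat.
data class MeetupDetails(
    val title: String,
    val location: String,
    val startMillis: Long,
    val endMillis: Long
)

// Hands the extracted details to the stock calendar app via the standard
// ACTION_INSERT intent, so the user still reviews and confirms the event.
fun offerCalendarEvent(context: Context, details: MeetupDetails) {
    val intent = Intent(Intent.ACTION_INSERT).apply {
        data = CalendarContract.Events.CONTENT_URI
        putExtra(CalendarContract.Events.TITLE, details.title)
        putExtra(CalendarContract.Events.EVENT_LOCATION, details.location)
        putExtra(CalendarContract.EXTRA_EVENT_BEGIN_TIME, details.startMillis)
        putExtra(CalendarContract.EXTRA_EVENT_END_TIME, details.endMillis)
    }
    context.startActivity(intent)
}
```

Because ACTION_INSERT opens the calendar app’s own editor pre-filled with these values, the assistant’s suggestion stays a suggestion: nothing lands on your calendar until you tap save.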
Another example from the patent shows a situation very similar to what happened with my cab, and it goes well beyond merely adding an event to a calendar.
In yet another example, the Assistant senses that you have just reached home and that last night you set your thermostat to a particular temperature, so it asks whether you want to set the thermostat to that temperature again. This, of course, is possible only if you have a smart thermostat that can be controlled over Wi-Fi – which Google already sells.
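As a rough sketch of the rule described above (you just got home, a remembered setting exists, so the assistant prompts you), here is what the decision logic might look like. Every name in it, PresenceSensor, Thermostat, askUser, is a hypothetical stand-in; real smart thermostats expose their own vendor APIs, and the patent does not spell out any of this.

```kotlin
// Hypothetical stand-ins for presence detection and thermostat control;
// this is not a real smart-home API.
interface PresenceSensor {
    fun justArrivedHome(): Boolean
}

interface Thermostat {
    fun lastNightSetting(): Double?   // degrees, or null if nothing was set
    fun setTemperature(degrees: Double)
}

// Offers to re-apply last night's temperature when the user arrives home.
// The assistant surfaces a prompt rather than acting silently.
fun suggestThermostatSetting(
    sensor: PresenceSensor,
    thermostat: Thermostat,
    askUser: (String) -> Boolean
) {
    if (!sensor.justArrivedHome()) return
    val degrees = thermostat.lastNightSetting() ?: return
    if (askUser("Set the thermostat to ${degrees}° like last night?")) {
        thermostat.setTemperature(degrees)
    }
}
```

Gating the action behind askUser keeps the suggestion optional, matching how the patent’s examples show the user an option instead of acting automatically.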
Of course, these examples only hint at how contextual help could be provided, and they barely scratch the surface of what such technology makes possible. Imagine some other possible use cases:
- Reminding you, a few days in advance, to buy a gift for your friend’s wedding and helping you choose one
- If you are a student, helping you revise concepts you learned earlier by automatic spaced repetition
- Offering background about an article you are finding hard to comprehend
- Helping you select a nice tie to go with the shirt you just purchased
- Showing you basic info (such as a LinkedIn profile) about any new person who emails you, etc.
The possibilities for contextual help are endless. In fact, once you get used to these little conveniences, it might be impossible to go back to doing everything on your own.
It is not known to what extent Google has already developed this technology. However, considering that the idea keeps popping up in multiple patents, it is likely being developed toward production readiness. Chances are it could be part of Android 13. Provided Google can win users’ trust on the privacy issues surrounding such features, I think it would be a great addition to the Android ecosystem. Let us know what you think in the comments.
If you liked this article, please subscribe to our weekly newsletter. We share the best stories from the week in it.
Cover Photo by Masakaze Kawakami on Unsplash