
10 ways Google Assistant can make people’s lives easier

Amid sharp competition from popular AI assistants like Siri and Alexa, Google Assistant arguably has a major advantage — the Android mobile operating system.
Smartphones are still the number one way people speak to their assistant, and as the smart display market grows, the assistant that interacts with YouTube, Google Maps, Google Photos, and Gmail could change the way people think about computing. We expect Google to announce major improvements to Google Assistant at the company’s annual I/O developer conference in May, but there are a few ways I think Google Assistant can be better.
None of these three digital assistants is perfect, and even after years of interaction with people, they all have some non-trivial issues to address this year. This week, three VentureBeat writers are spotlighting 10 important issues with the digital assistants they use most, starting today with Google Assistant, followed by Siri on Thursday and Alexa on Friday.

1. Send text messages with Home speakers

There’s a reason this is the first item on the list: It’s really hard to believe this isn’t a native feature for Google Home speakers yet, particularly following the death of Allo last month and the recent introduction of Google Assistant suggestions for Android Messages.
Sending text messages with Google Assistant is a pretty good experience on an Android phone, and Google Assistant can even send and read messages from Google Maps now, but say “OK Google, send mom a message” to a Home speaker today and you’ll hear “Sorry, I can’t send messages yet.”
Even Echo speakers can send text messages.
We know pretty well the kinds of things people tend to do with a smart speaker. Among the top use cases: Music is fun, news is helpful, and so are timers and reminders. But messaging is critical to cementing the value of voice computing in people’s minds. More so than emails or calls, messaging has become the de facto form of communication in a lot of people’s lives.
Many people today have yet to realize what’s been true now for years: It’s faster to type with your voice than it is to text with your thumbs. You don’t always want to send a message with your voice, of course — sometimes you have people over or want to say something private — but once you understand that conversational AI has grown by leaps and bounds in its recognition of the human voice, voice computing can save you a lot of time.
It also allows you to get something done as soon as it comes to your mind, not when your hands are free.
Personally speaking, there are many moments when I’m busy cooking dinner, for example, and need to send something to a loved one. Voice computing makes it possible to just say what you need.
By comparison, phone calls with Google Assistant today are great, and appear to be where Google has focused its attention since the feature was introduced in late 2017. Lose your phone in the house and you can call it. Know the number you want to call and you can blurt it out. You can even ask Google Assistant where the nearest Target or flower shop is before you know what number to call. Integrate your contacts and you can call people in your address book as well.
The addition of Slack, Facebook Messenger, WhatsApp, and other popular messaging services would also be nice. Some of these services can be integrated today with IFTTT applets, but sending messages with your voice really should be part of the native Google Home experience.
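To give a sense of what that IFTTT glue looks like next to a native feature, here is a minimal sketch of one non-native route: IFTTT’s Webhooks service accepts a simple POST to its documented trigger URL, and an applet can forward that payload on to Slack or another messaging service. This is my own illustration, not anything Google ships for Home speakers; the event name and key below are placeholders, and wiring a spoken Google Assistant phrase to the applet happens in IFTTT’s interface rather than in code.

```python
# Minimal sketch: fire an IFTTT Webhooks event that an applet could
# forward to Slack, WhatsApp, or SMS. The event name and key are
# placeholders for values you would create at ifttt.com.
import requests

IFTTT_KEY = "YOUR_WEBHOOKS_KEY"   # hypothetical personal Webhooks key
EVENT = "send_family_message"     # hypothetical event name

def send_via_ifttt(recipient: str, text: str) -> None:
    url = f"https://maker.ifttt.com/trigger/{EVENT}/with/key/{IFTTT_KEY}"
    # value1/value2 become "ingredients" the applet's action can insert
    response = requests.post(url, json={"value1": recipient, "value2": text}, timeout=10)
    response.raise_for_status()

send_via_ifttt("mom", "Running late, home by 7.")
```

The amount of indirection involved is exactly the point: none of this setup should be necessary just to say “OK Google, send mom a message” to a speaker in your kitchen.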
One way to send messages with Home speakers today is the Broadcast feature on Google Assistant, but this is limited to people in your household. Sending text messages from a Google Home speaker is an entirely different thing, one that opens the device to the wider circle of people who matter most to you.
That’s what makes this one of the biggest features missing from Google Assistant today.

2. Share bus routes on demand

Tell Google Assistant where you live and your work address, and every morning your Pixel smartphone can show you the time for the next bus or train that you take most often to work. You can also ask a Google Home smart speaker to send directions to your phone. But you can’t yet say, “OK Google, when is the next 5 bus downtown coming?” or “When is the next San Francisco-bound BART train due?”
Google Maps has made strides toward sharing better information with people getting around without a car. In the past year or so, the option to set your commute has been extended to walking, biking, or a combination of driving and public transportation. The mobile Maps app even tells you when it’s time to get off the bus, making it easier for people who don’t routinely take public transportation to know when to transfer. You can also choose a preferred route, so Google knows which specific buses and trains you want to take to work.
It would be nice if voice answers covered all local bus and train routes, but the feature could start by incorporating what Maps already knows, since knowing exactly when the next bus or train is coming matters to people’s daily lives. I hope a deeper integration with Google Maps will also mean that Home speakers can soon answer this question.
I’ve long felt that this is a shortcoming in the AI assistant experience with Alexa, Siri, and Google Assistant.
There are Google Assistant voice apps that share bus and train routes for some areas, but this shouldn’t be a toy test case for voice apps. This is something with the potential to have far-reaching implications, because enabling people to ride public transportation with less hassle can reduce traffic congestion and curb carbon emissions.
With smart speakers expected to reach most U.S. households in the years ahead, and with things like mini scooters and bike rentals changing last-mile travel, being able to answer these questions with voice alone could give people a more seamless way to get around, ultimately benefiting their daily lives and the environment.

3. A more fluid voice-video-text experience

Whatever you think of Facebook’s Portal, and whether you trust a company like Facebook enough to let it into your home, something the device has made abundantly clear is that symmetry between a messaging app on your smartphone and a smart display in your home is an amazing thing.
With Portal, you can begin a call on your phone, transfer it to a Portal, then sling it back to your phone when you leave a room. You can tether the experience to a smart display when you’re going to be in one place for a while, then go mobile whenever you’re ready.
This isn’t something that’s currently part of the experience with a Home smart display or Echo Show — but it should be. And it may become increasingly important as smart displays grow in popularity.

4. Tell me more about my day

Google Assistant’s My Day feature can show you your next meeting, give you directions, and run you through some other basic things. Routines also allow you to automate specific actions as part of My Day so you can, for example, cue relaxing ambient sounds or hear your favorite news. But there’s so much more that could be part of this experience.
The visual experience in the Google Assistant app got a series of enhancements last year that make it much more pleasant, with larger photos, GIFs, and more info pulled from Gmail about things like your next flight, when your next package is being delivered, or an upcoming event that includes a ticket reservation.
None of these events is mentioned today when you say “Tell me about my day.” Incorporating more of this need-to-know information would go a long way toward making this a powerful feature and a daily ritual worth repeating.
The experience could also be simplified by playing your news from the Google Podcasts app. Today, when you say “Play the news” or news plays automatically as part of “Tell me about my day,” playback is controlled via the Google app rather than by Google Assistant or even the Google Podcasts app, which makes it hard to keep track of what you’ve already heard and duplicates features unnecessarily.

5. Build better blue-pill rabbit holes

Continued Conversation with Google Assistant was introduced last month for smart displays, and it can be pretty cool.
Basically, it means you can now ask Google Assistant endless questions. Whereas before the device stopped listening after a response was given or action was taken, now you can just keep talking to your assistant.
Initially, I’m not sure how much this will alter how the average person uses their assistant, but it means you can begin with a single query and dive deeper following each response.
So far, however, it’s a bit of a buggy experience. Let’s say, for example, you’re planning to visit a new city soon, an example that Amazon VP of devices David Limp offered last year when a similar feature was introduced for Alexa.
Ask Google Assistant to tell you about Seattle and you’ll get some basic facts. Respond by asking about fun things to do there and Google Assistant gets lost.
Ask a more straightforward factual question like “Who was the president in 1860?” and you’ll get a response you can riff on with follow-up questions for a while.
Building better blue-pill rabbit-hole experiences like these could change how people think about what’s possible with an assistant. That’s going to matter more as AI assistants and voice computing incorporate more of the travel experience, such as flight check-in.
An improved Continued Conversation could also give Google Assistant a leg up on Alexa, which is currently limited to a single follow-up question. More exchanges also mean more total recordings of people’s voices that can improve the assistant’s conversational understanding.

6. Get a productivity mode

In its resting state, a Google Assistant smart display can act as a digital photo frame, while Amazon’s Echo Show can tell you when your next delivery is due. But it’s not yet possible to give this space over to your to-do list and calendar, and that needs to change.
Smart displays, and perhaps even Android smartphones, should have a Productivity Mode that puts these things front and center, because it would change the way you think about what you can do with your assistant and what’s possible with conversational computing.
When you urgently need to get things done, Productivity Mode could change the way the assistant behaves. Maybe it starts to share proactive reminders about when you need to leave home to make an important appointment, or reminds you that the store you need to visit closes in an hour.
This isn’t automating workflows so much as it’s helping people stay on top of the information they need to make decisions and accomplish tasks. (Editor’s note: You know, what assistants are supposed to do.)
Ultimately, there could be other modes as well: Friday Night Mode might be more playful than the generic Google Assistant experience, while Sunday Morning Mode could be calmer. With advances in speech synthesis, these could even manifest as different-sounding voices or personalities, like the John Legend voice that made its debut today. This ties into my previous argument that people, not tech companies, should have the power to choose the personality of their intelligent assistant.

7. Give daily recaps

Today Google Assistant focuses on what’s ahead, highlighting your next calendar event and pointing to reminders, but it would be interesting if it could also look backward and offer recaps or summaries of recent events.
And why not? Google Fit already tracks your steps, heart rate, and sleep; Google Maps knows where you’ve been; and Gmail often contains information about your activities.
Digital usage recaps like the kind offered by Android’s Digital Wellbeing would also be helpful.
The value of recaps isn’t always abundantly clear, but this could be an area of great potential for AI assistants, both at work and in your personal life. A Harvard Business School study found that people who set aside 15 minutes at the end of the workday to reflect on what they had done saw improvements in their workplace performance.
Should this come to include automated transcriptions like the kind Microsoft and many startups create today for meetings, there’s a lot more information that can be added, such as meeting highlights worth remembering or reminders about things you agreed to do. Google already surfaces reminders about things you agreed to do from Gmail.
In my personal life, I want to know if I’m more than 20 percent behind on my daily step goal, and whether a walk right now would be enough to make up the rest.
Sensitive information might call for consent before the assistant shares details aloud, say about spending too much on eating out this week, or such details might appear only on a display without being spoken. But it’s hard to imagine more valuable insights than the ones that keep you in good physical and financial health. It could even help celebrate a walking streak or meeting a savings goal.
This also gets to the idea of a lifetime assistant — AI that can help professionals do their jobs or give people guidance on how to live healthier lives. The lifetime assistant didn’t seem like a real possibility five years ago, but is much more realistic today as speech recognition error rates continue to fall and conversational AI continues to better understand people’s words.

8. Suggest routines

Routines let you create custom voice commands in the Home app that carry out multiple tasks with a single utterance. For example, you can create a routine triggered by “OK, Google, good night” that turns out the lights, enables Do Not Disturb on your phone, and launches a Google Assistant action that plays ocean sounds.
Google Assistant and Alexa have pushed their routines for a long time now, while Siri’s Shortcuts brought similar features to Apple devices.
Routines with Google Assistant were initially limited to a few specific tasks, but the feature has become more flexible over time.
The idea behind Routines is not just customization or giving people a way to do multiple things at once; it’s also an attempt to make your assistant part of your daily habits. Voice analytics companies and the makers of assistants believe that becoming part of people’s daily habits is the best way to get voice apps used regularly.
Like Siri Suggestions and Bixby Routines, which Samsung introduced last month, the Android mobile operating system tracks the way people use smartphones and proactively makes suggestions about things to do. Suggested actions on screen have become part of the experience with Google Assistant on Pixel Stands. If suggestions were extended to Google Assistant Routines, things could get interesting.
Even if a person chooses not to add a suggested routine, the suggestion can help make people aware of what’s possible.

9. Better control of photos on smart displays

The digital frame is one of the best parts of having a smart display. It’s a great way to see your latest photos. It puts to use the stream of shots you have in your smartphone. Place it in your kitchen and when you aren’t asking questions or setting timers, you can see great recent memories or pictures of the people you love.
You can also create a shared album and collaborate with family or friends. Then there’s Live Albums, which uses facial recognition to automatically add new photos of the people you choose.
Despite this great value, there’s room for improvement.
Gestures should be introduced that take advantage of the smart display touchscreen and allow you to remove photos you don’t want to see or add ones you enjoy to your favorites with a swipe. This sort of pruning process is especially necessary when you’re dealing with a Live Album or shared album.
This would make shared albums more appealing, and allow you to curate as needed on the fly without having to pick up your phone and try to find that specific photo in that specific album in order to take action.

10. Add multiple items to lists at once

Amazon brought this to Alexa last fall, and it’s the sort of thing that once you experience, you begin to wonder why it didn’t always exist. Being able to say “Alexa, add eggs, kale, and bacon to my shopping list” is really helpful, and a lot better than needing to say each of these things one at a time.
The reason this matters is that list-making is one of the most consistent and reliable parts of the Google Assistant experience.
To be clear, you don’t have to buy things from Google Express, the company’s less-than-great attempt at ecommerce. Making a shopping list simply lets you stop keeping mental track of things you need from the store and blurt out items whenever they cross your mind.
As other members of your household get into the same habit, the combined list is ready whenever you make your next trip to the store.
As part of my job as an AI staff writer at VentureBeat, I’ve experienced virtually every Google Assistant feature, and next to setting timers, listening to the news or music, and checking the weather, making shopping lists is what I use most often.
And it’s way more effective when you have it connected with your favorite productivity app. For me, that’s Wunderlist. Google began to connect lists with apps like Any.do, Google Keep, and Bring as part of a series of updates last fall. Thankfully, you can connect apps like Wunderlist with your Google Assistant via IFTTT.
As part of that update, you can even create and edit lists with Google Assistant, but you cannot yet rattle off a list of items that come to mind. Adding that ability would also make people’s lives easier.
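To illustrate why this feels like a modest ask, here is a rough sketch, entirely my own and not anything Google has shipped, of how a dictated phrase could be split into individual list items once speech recognition has already produced the text. Real assistants rely on trained language models for this rather than regular expressions.

```python
import re

def parse_list_items(utterance: str) -> list:
    """Split a transcribed phrase like 'Add eggs, kale, and bacon to my
    shopping list' into individual items. Purely illustrative; it assumes
    no item itself contains the word 'and'."""
    text = utterance.strip()
    # Strip the command framing (a simplifying assumption about phrasing)
    text = re.sub(r"^(ok google,?\s*)?add\s+", "", text, flags=re.IGNORECASE)
    text = re.sub(r"\s+to my (shopping )?list\.?$", "", text, flags=re.IGNORECASE)
    # Split on commas and the word "and", then drop empty fragments
    parts = re.split(r",|\band\b", text)
    return [p.strip() for p in parts if p.strip()]

print(parse_list_items("Add eggs, kale, and bacon to my shopping list"))
# ['eggs', 'kale', 'bacon']
```

The hard part, recognizing the speech in the first place, is something Google Assistant already does well; what’s missing is simply letting one utterance carry more than one item.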
Source: VentureBeat
