What is Google Assistant in 2022?

The high point of what I'd consider the Assistant takeover was when Google Assistant was listed after Android in the OS section of the Pixel 3's tech specs. A few years later, things have changed, and the obvious direction for Google Assistant in 2022 has been a pullback.

Assistant in 2022

That pullback became clear this year, after Assistant had already had a quiet 2021 (with foundational work presumably still in the wings). The first warning sign was Google shutting down Assistant Snapshot, which carried the promise of what made Google Now so compelling: a centralized, personalized feed that could surface information from your apps.

The second was the removal of the Google Assistant Driving Mode "dashboard," which, back in May of 2019, looked like a major takeover of Android Auto (for phone screens) by Assistant. It took a long time to launch and has since been stripped back in favor of a Google Maps-first experience.

Pixel 3 tech specs in 2018

Finally, after more than a year of work, Google halted development of Assistant Memory in June. It was pitched as a faster way to save content and reminders on Android. That was a good idea, and it would have been a welcome successor to the Collections feature in the Google app, which is frustrating to use (with a poor UI).

Besides all three being feeds, an important commonality between the Driving Mode dashboard, Snapshot, and Memory was that they were all attempts to extend Assistant beyond its voice roots. They carried the Assistant brand, but in the end had little to do with Assistant's core and original competency. Their primary claim to usefulness was centralizing relevant information, and there is nothing inherently assistant-like about that.

Meanwhile, an October report from The Information detailed how Google was "investing less in developing Google Assistant for voice search for cars and devices not made by Google." This was framed as focusing on the experience across Google's first-party products, and it comes as the company cuts costs by consolidating various efforts.

The most significant casualty of this was the Fitbit Sense 2 and Versa 4 launching without Google Assistant, even though it was available on the previous generation of devices. The same was true of non-Pixel, non-Samsung wearables at launch; it's odd that there are Wear OS 3 devices that only offer Amazon Alexa. Assistant for Chromebooks was also said to be among the least important investment areas.

Assistant is no longer…

In its heyday, Assistant seemed poised to be the connective tissue between all of Google's form factors, if not the primary way you interacted with those devices. Serving as the main interface for smart speakers and headphones was straightforward enough, but Google then tried giving Assistant visual user interfaces, starting with the Smart Display. That still seems like a fine idea, but, in hindsight, Assistant replacing Android Auto for phone screens didn't make much sense. Yes, voice input is an essential part of getting assistance while driving, but Google could have left the touch experience to Android.

Another prime example of this is how the original Reminders experience, which is set to be replaced by Tasks with Assistant integration, had a horrible phone UI for years. The voice interaction was fine, but it was never clear that Assistant needed to own the visual experience, and that work would have been better left to the existing teams.

At I/O 2019, where the new on-device Google Assistant was announced alongside Driving Mode, it seemed like the company wanted Assistant to be another way to control your phone. There are times when that is very useful, but it was never going to be the primary method of interaction.

Having lived with this next-generation Assistant, which still hasn't expanded beyond Pixel phones, for a few years now, it hasn't changed the way I navigate my device, something Google was touting on stage at one point. The touchscreen is there for a reason, and taps are likely to remain the fastest way to get things done for the foreseeable future (at least until smart glasses arrive). Another factor could be the lack of apps that support and allow voice control within them.

If anything, the takeaway from the on-device Assistant is how that work led to Assistant voice typing on Pixel phones, and how the technology is best applied to focused experiences, like transcription and editing in Gboard.

Where does Assistant go from here?

Among the Assistant announcements Google has made this year, the obvious trend has been improving the core voice experience. On the Nest Hub Max, you can now summon Assistant by simply looking at the smart display, thanks to the camera-powered Look and Talk feature, while Quick Phrases let you skip saying "Hey Google" for preset commands.

At I/O 2022, Google also announced that, in early 2023, Assistant will be able to ignore "umm," natural pauses, and self-corrections as you issue a command. As with the two new ways to activate Assistant, the goal is to make the experience more natural:

To achieve this, we are building new, more powerful speech and language models that can understand the nuances of human speech – such as when someone is pausing but has not finished speaking. We are also getting closer to the fluidity of real-time conversation with the Tensor chip, which is specifically designed to handle on-device machine learning tasks at breakneck speed.

Meanwhile, Google is using AI to improve the accuracy of Assistant smart home commands in three areas.

Understanding natural language is what powers Assistant. It has gotten so much better over the past decade or so that it's easy to forget those advances. Of course, that's because our expectations are constantly growing, and it seems like Google wanted Assistant's reach to grow just as quickly. There were, of course, improvements to the core voice experience as Assistant spread across all of Google's consumer-facing products, but it looks like a case of trying to do too much, too quickly.

Google's constant challenge is taking R&D out of the lab and turning it into a good end-user experience. Assistant could be at the forefront of that for Google, especially with the arrival of new form factors. However, trying to replace what already works didn't pan out, and at least the company now finally seems to realize that.
