Hey Google!

What happened to my assistant?

Google Assistant is bugging out
Bronson Stamp / Sherwood

Google Assistant is bad now. But why?

They promised us flying cars, and all we got were mediocre virtual assistants that seem to be getting worse.

Recently I brought Google Assistant to its cognitive limits by asking it to tell jokes to my toddler.

Our Google Nest Hub Max got through three jokes before repeating the one where the left eye says to the right eye, “Between you and me, something stinks.” When I prompted it to tell us another, the same way I had previously, it said it didn’t understand. Another what? 

These days Google Assistant seems more and more confused.

I first adopted a number of smart-home devices in 2018, when Google (and Amazon and Apple) were hyping their voice assistants nonstop and every developer conference or trade show Google took part in felt like an effort to indoctrinate you into the ways of the Assistant.

Back then, Google Assistant was pretty good and getting better. There was a high-water mark for Google’s virtual assistant, perhaps a year or two ago. I would ask and Google would do a decent job answering. Now I’d say one out of three times it gets it wrong.

It frequently gives me the weather for the wrong town, despite having my address in its settings. When I ask it to play music on both Google speakers, it only sometimes obliges. The same goes for the TV and lights. There’s a very specific kind of shame that comes from arguing with a smart assistant for five minutes before getting off the couch.

It’s not just me. Plenty of Google’s more than 500 million monthly active users have noticed too. Reddit is full of people complaining that Google Assistant is no longer as helpful as it used to be and seems to be degenerating. The same thing seems to be happening to Amazon’s Alexa and Apple’s Siri.

As Computerworld’s JR Raphael put it last summer: “It’s not only the apparent lack of focus, emphasis, and ongoing investment internally around the service. It’s an apparent deterioration of the existing functions that made Assistant worth relying on.”

What's going on? 

Google and Amazon have said they’re still committed to their assistants, but both are in the process of making them into something fundamentally different from what they were — by implementing generative AI.

The previous generation of assistants that we know and love(d) worked by using natural language processing, or NLP, to figure out what you were trying to say, matching that to a predefined intent, and then spitting out a predetermined, vetted answer.

“You could say anything you wanted on the input, but the output was always going to be something that existed,” Bret Kinsella, founder and CEO of voice tech and AI publications Voicebot.ai and Synthedia, explained. “Which means that you had to have forethought about everything everybody would ask. And then you had to come up with some sort of canned response.”

If the assistant figured out you were asking about the weather, it would read to you from the weather app. If you asked it a question, it would pull an answer straight from a top Google search result or Wikipedia. If it didn’t know, it said so.

This certainly had its limits, and assistants were terrible conversationalists, but they got the job done.
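To make that concrete, here is a minimal sketch of the intent-matching pattern in Python. The intents, trigger phrases, and canned replies are invented for illustration; this isn’t Google’s or Amazon’s actual code.

    # Hypothetical illustration of the old NLP-era flow: match the request to a
    # predefined intent, then return an answer somebody wrote ahead of time.
    INTENTS = {
        "weather": ["what's the weather", "will it rain", "temperature today"],
        "joke": ["tell me a joke", "make me laugh"],
    }

    CANNED_RESPONSES = {
        "weather": "Here's today's forecast from your weather provider.",
        "joke": "Between you and me, something stinks.",
        None: "Sorry, I don't understand.",
    }

    def match_intent(utterance):
        # Map a spoken request to a predefined intent, or None if nothing matches.
        text = utterance.lower()
        for intent, phrases in INTENTS.items():
            if any(phrase in text for phrase in phrases):
                return intent
        return None

    def respond(utterance):
        # Every possible output was written (and vetted) ahead of time.
        return CANNED_RESPONSES[match_intent(utterance)]

    print(respond("Hey Google, tell me a joke"))  # the canned joke
    print(respond("Why is the sky blue?"))        # "Sorry, I don't understand."

If a request doesn’t land on a known intent, the only honest option is the fallback, which is why the old assistants so often said they didn’t understand.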

The large language models, or LLMs, behind newer generative AI assistants use machine learning to churn through vast amounts of text and then output a probable answer that sounds much more like a human response.

“A generative AI bot will respond to your question by reviewing 5 to 10 webpages, and then synthesizing that information. Its word-prediction algorithm will produce a response based on those webpages using the model's training on human language to form the response,” Kinsella said. “It will respond to you like it's an expert. That's a much richer experience that results in an answer complete with source references, compared with 10 blue links and the expectation the user wants to be an information archeologist.”

Unless, of course, the generative AI makes up nonsense.
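As a rough illustration of that retrieve-and-synthesize flow, here is a hypothetical Python sketch. The names search_web() and llm_complete() are placeholders for whatever search API and language model would actually be used; they aren’t real product interfaces.

    # Hypothetical sketch of a retrieve-and-synthesize assistant. Neither
    # function below is a real API; both are stand-ins.
    def search_web(query, limit=5):
        # Placeholder: return pages as {"url": ..., "text": ...} dicts.
        raise NotImplementedError("wire up a real search API here")

    def llm_complete(prompt):
        # Placeholder: call whatever large language model you have access to.
        raise NotImplementedError("wire up a real model here")

    def answer(question):
        pages = search_web(question, limit=5)
        sources = "\n\n".join(
            f"[{i + 1}] {page['url']}\n{page['text'][:2000]}"
            for i, page in enumerate(pages)
        )
        prompt = (
            "Answer the question using only the sources below, "
            "citing them by number.\n\n"
            f"Sources:\n{sources}\n\nQuestion: {question}\nAnswer:"
        )
        # The model predicts a fluent response word by word; when the sources
        # are thin, this is also where made-up "facts" can creep in.
        return llm_complete(prompt)

Nothing in that last step guarantees the answer stays grounded in the sources, which is where the nonsense comes from.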

Google didn’t respond to several requests for comment about the decline of its assistant. Neither did Amazon or Apple.

What we know is that Google and Amazon made big cuts to their assistant teams after the tech slump in 2022 and after years of losing money. That was also right about the time ChatGPT launched and changed the game for what interactions between human and machine might be. In other words, it’s been a one-two punch for existing assistants.

Axios reported on an internal Google email last summer saying that the company was reorganizing its assistant team to create a “supercharged Assistant, powered by the latest LLM technology.” That process has begun with Google’s Android mobile phones, where you can now opt in to use Gemini (Google’s answer to ChatGPT, formerly called Bard) as your main assistant. (Android Authority reported that new downloads of Assistant now come with Gemini by default.) Gemini can do things like generate captions for pictures you take, but so far it’s having trouble with some of Assistant’s basic functions. So it looks like Google Assistant still has to handle bread-and-butter requests like setting timers and controlling your smart devices.

At its hardware event in September, Amazon previewed a “smarter and more conversational Alexa” that would be powered by generative AI. So far Amazon has revealed three AI-powered Alexa experiences: Character.AI (chat with fictional characters and historical figures), Splash (make your own music), and Volley (a 20 Questions-style game). Fun, but not exactly a game changer yet, especially if older capabilities aren’t working as well as they once did.

Apple, outwardly lagging the other tech giants, is reportedly in talks with Google and Baidu to outsource generative AI for its iPhones.

It seems we’re in some liminal state where these tech companies are promising a better future with assistants supported by generative AI, but at the same time the existing technology that hundreds of millions of people use — and some, including those who are visually impaired, rely on — has languished. It’s also possible that in the decade since these smart assistants first came out and since the advent of generative AI like ChatGPT, people’s expectations for communicating with technology have gotten higher. 

Ishan Shah, a founding engineer at an AI startup who for fun recently built a generative AI bot that completes actions for you online, said newer LLMs are better than older NLP models at figuring out what we want. When these generative AI assistants are fully overlaid onto existing ones — the ones connected to your smart speakers, lights, and TV — the experience could be “extremely powerful,” as these tools strike an ideal balance between creativity and competence.
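One plausible way that overlay could work, sketched hypothetically: let the language model translate free-form speech into a structured action, then hand that action to the same vetted device-control code the old assistant already uses. The function names and JSON shape here are invented for illustration, not any company’s actual design.

    import json

    def llm_complete(prompt):
        # Placeholder for a real language-model call.
        raise NotImplementedError("wire up a real model here")

    def run_device_command(device, action):
        # The old, deterministic smart-home layer: vetted commands only.
        print(f"{action} -> {device}")

    def handle(request):
        prompt = (
            "Turn the user's request into JSON like "
            '{"device": "living room lights", "action": "turn_off"}.\n'
            "Request: " + request + "\nJSON:"
        )
        raw = llm_complete(prompt)
        try:
            command = json.loads(raw)
            run_device_command(command["device"], command["action"])
        except (json.JSONDecodeError, KeyError):
            # Fall back to the old canned behavior if the model's output is unusable.
            print("Sorry, I don't understand.")

The appeal is that the creative part (understanding you) gets smarter while the part that actually flips your lights stays predictable.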

Shah said bridging the old and new technology won’t necessarily be that hard, but it will take time for big companies to complete.

“I would be surprised if by this time next year all of the home assistants are not drastically better,” he added. 

But we’re definitely not there yet. For now, my toddler has suggested we unplug Google Assistant and plug it back in.

More Tech


Google rolls out Private AI Compute matching Apple’s AI privacy scheme

One of the barriers to people embracing AI in their daily lives is trust — making sure that the company that built the AI isn’t going to just spill your most sensitive info to advertisers and data brokers.

Google is announcing a new feature called Private AI Compute that takes a page from Apple to help assure users that it will keep their AI data private.

In June 2024, Apple announced its Private Cloud Compute scheme, which ensures only the user can access data sent to the cloud to enable AI features.

While Apple’s AI tools have yet to fully materialize, Google’s new offering looks a lot like Apple’s. AI models on its phones process data in a secure environment, and when more computing power is needed, the data is sent to the cloud and handled by Google’s custom TPU chips inside the same kind of secure environment.

A press release said: “This ensures sensitive data processed by Private AI Compute remains accessible only to you and no one else, not even Google.”


315M

Amazon says it has 315 million monthly active viewers for its Prime Video ads, according to Deadline, up from 200 million in April 2024. The number comes just a week after Netflix said it had 190 million monthly active viewers.

The self-reported numbers have different methodologies. Netflix counts the number of ad-tier subscribers who’ve watched at least one minute of ads per month and multiplies that by its estimated household size. Amazon’s number represents an unduplicated average monthly active ad-supported audience across its programming from September 2024 through August 2025.

The services themselves also aren’t exactly comparable. Netflix charges $7.99 a month for its ad-supported tier, while Prime Video comes bundled as part of Amazon Prime — and now automatically comes with ads unless consumers pay an extra $2.99 per month to remove them.

1.6M

Chinese EV maker and Tesla competitor BYD could sell up to 1.6 million vehicles abroad next year, according to a new report by Citi published by Reuters. That’s potentially 60% more than the roughly 1 million vehicles BYD is expected to sell outside China this year. That’s also the same number analysts polled by FactSet expect Tesla to sell in total in 2025.


Apple reportedly considers adding additional camera to iPhone Air and pushing next release to 2027

Apple is delaying its next iPhone Air to the spring of 2027, from the fall of 2026, as it potentially rejiggers the model to include a second camera lens, according to The Information. Consumers have largely overlooked Apple’s latest, thinnest phone, choosing instead to buy the standard and Pro models, thanks in part to the Air’s single camera and relatively weak battery life. The preference caused Apple to greatly scale back production for its Air model.

