
Google Search AI Overviews at 6 months: Is the feature getting better?


It’s been six months since Google started adding AI-generated text to the top of many Google Search queries by default, and this experiment — that’s what a disclaimer at the bottom of each AI Overview says it is — hasn’t entirely been a rip-roaring success, Google acknowledged to Mashable.

While “AI overviews on balance and at large are very compelling sets of things that are helpful to the users,” said Hema Budaraju, Google’s senior director of product management for Search, “we have work to do on the quality side of things, which is an ever growing need.”

AI Overviews launched with a slogan of sorts: “Let Google do the searching for you,” but after some controversy at the start — notably that couple of weeks where stories kept coming out about Google Search telling people to eat rocks and put glue on pizza — the company appears to have pulled back a bit. At launch, AI Overviews showed up in about 15 percent of Google Search results pages, but that number was reduced to about 7 percent by the end of June, according to Search Engine Land.

So has quality improved over the past six months?

Are AI Overviews getting better?

It would be hard to argue point-blank that there’s been a significant uptick in quality. Overviews materialize less often, and errors are still rampant, but I did find some very limited evidence of improvements: the AI Overviews for the queries I highlighted to Google for this article all improved while I was working on it.

For what it’s worth, Budaraju says, across all types of queries, from the everyday to the weird, AI Overviews work, “especially when there is no single answer and it’s multiple perspectives.” Or at least that’s what Google thinks based on internal data about quality, which comes from A/B testing, but not focus groups, Budaraju said.

Quotidian searches tend to get acceptable AI Overviews in my experience. “What do almonds taste like,” for instance, may produce a reasonable AI Overview like the one I got: “Almonds can taste sweet, slightly bitter, or bitter, depending on their chemical composition.” Fine.

But if you’re an information fiend who uses Google Search more expansively, there’s a good chance you still encounter bizarre errors. This November example from Bluesky user @coopercooperco is a decent summary of Google Gemini’s unfortunate lingering tendency to put the truth in a blender from time to time.

my friend @craigbased.bsky.social made a comment about Cole kissing Shelly so I googled “what episode does Gordon Cole kiss Shelly” and that’s what it gave me. Today it gives a slightly different, also wrong answer. We all know deathly serious Gordon Cole would never do something like kiss a woman.


— F♯A♯∞, fka ☕️ (@coopercooperco.bsky.social) November 26, 2024 at 8:59 AM

When queried for the Twin Peaks episode where Cole kisses Shelly, the AI Overview blurts out quite confidently and wrongly that there is no such scene. Without knowing with any certainty what went wrong, one can only assume the model’s training data includes at least fleeting mentions — if not the full script — of the famous Twin Peaks scene about (David Lynch shouting voice) “two adults sharing a tender moment!” in which Cole and Shelly are seemingly interrupted by Bobby Briggs, but then clearly and unambiguously do kiss. The model likely isn’t drawing from any faulty blogs or counterfeit scripts saying Cole never kisses Shelly (To what end would anyone write such a thing?). It’s just making this up and sticking it at the very top of the Google Search results page.

The Bluesky user above is clearly making what Google frequently calls an “uncommon query.” Hallucinations “tend to arise” when the query is uncommon, Budaraju said. “Even though the systems are trying to be helpful, there is some misinterpretation, some inherent lack of high-quality information on the web,” she explained while speaking to Mashable about AI Overviews in general, not this particular one. Plenty of prominent, high-quality information online confirms that Cole and Shelly kiss, so “misinterpretation” of Bobby Briggs’ unsuccessful interruption makes more sense as an explanation.

If you search based on faulty information, AI Overviews may make things significantly worse

According to Budaraju, improving AI Overviews involves “sentiment surveys” that are not exactly A/B tests. “We just give people an option to choose between one versus the other and get their expression of satisfaction,” she said.


But a nightmare scenario for AI Overviews is one in which a searcher starts out with less-than-perfect information, and the AI Overview makes it even less perfect.

If the basis for a search is wrong or flawed, and the AI Overview doesn’t catch the problem, then it stands to reason the user won’t notice it either. The result would be a satisfied user who is now even more ignorant than before. Admittedly, the problem of using Google Search to find misinformation is much older than AI Overviews, but AI Overviews could be a formula for supercharging this process.

For a vivid-but-fairly-benign example of what I mean, here’s the result for the query “How to use baking soda to thicken soup.” Someone might only have the fuzziest notion that one of the powders in the cabinet can give their chowder a heartier mouthfeel, but they might guess wrong. According to the AI Overview, “Baking soda can be used to thicken soup by making it silkier and smoother.”

A Google Search user seeking confirmation that baking soda can thicken soup, and receiving it from an AI Overview. Credit: Mashable screenshot via Google

This won’t work, and has the potential to make your soup taste weird.

When I showed this example to a Google representative, they told me Google would use it to improve their product.

But separating good and bad information becomes more of a muddle if you’re searching for the paranormal. For instance, I tried searching “how to teach a dog to communicate telepathically,” and the AI Overview began with the heading “Here are some tips for communicating with your dog telepathically,” and then provided a bulleted list cobbled together from the writings of believers in the paranormal, like “animal communicator” Pea Horsley.

A Google Search user seeking information about how to communicate with a dog telepathically, and receiving an AI Overview that begins “Here are some tips for communicating with your dog telepathically.” Credit: Mashable screenshot via Google

If you’re inclined to read them, it’s Google Search’s job to steer you to the writings of people like Horsley — in fact, I recommend them. They’re entertaining. But when the AI Overview at the top of a Google results page reads “Here are some tips for communicating with your dog telepathically,” it gives users the impression that this information is authoritative and trustworthy, rather than being “for entertainment purposes only.”

A Google representative pointed out that AI Overviews are dynamic. They showed me their AI Overview for the same search, and it didn’t say “Here are some tips for communicating with your dog telepathically,” but instead mentioned that there’s no scientific evidence that dogs can communicate telepathically, before transitioning into another Pea Horsley-influenced list of instructions. If I try this search today, I get a similarly improved result.

Finally, what if a user noticed that cow meat is called “beef” and pig meat is called “pork,” and wondered what dolphin meat is called? Stranger things have happened. When I used Google Search to find answers, the AI Overview seemingly let slip the dark truth about mahi-mahi:

A Google Search user searching “dolphin meat name” and receiving the answer “mahi-mahi.” Credit: Mashable screenshot via Google

The AI Overview begins “The name for dolphin meat depends on the region and the type of dolphin” and then provides a bulleted list. The first item on the list is “Mahi-mahi.”

If the user reads on, they’ll see that mahi-mahi is also known as “dolphinfish” (because, to be clear, mahi-mahi is not dolphin; it’s a fish). But the result is confusing, to say the least. When I showed it to a Google representative, they told me this was a reasonable interpretation of the search — in other words, that a user searching for “dolphin meat name” really might be looking for the fish known as a “dolphinfish.”

It’s a good idea to click the source

Since, as I mentioned above, every search featured here that produced a problematic AI Overview improved to some degree, I suspected Google was cleaning them up as I went along, but Budaraju claims otherwise. “We don’t fix queries one by one. That’s not how we operate. We actually think about it as what are the patterns of issues that we’re seeing, and how would we actually solve them at scale?”

But she also told me Google remains focused on steering users toward the sources of AI Overviews — y’know, the old-fashioned links on your Google Search results page? “To some extent,” she said, “I think we are also hoping that our users have the right links, links for them to also pursue.” She wonders if, in response to an AI Overview, the user would “actually pursue that path and look at the links that led to the overview that you’ve created.”

If AI Overviews aren’t going away, then until they stop hallucinating, it’s probably a good idea to take Budaraju up on this suggestion and cultivate a habit of clicking the links next to your AI Overviews whenever you see them.
