Google Search Is Now a Giant Hallucination

The stage during Google’s I/O conference. - Photo: Andrej Sokolow/picture alliance (Getty Images)

Google tested out AI overviews for months before releasing them nationwide last week, but clearly, that wasn’t enough time. The AI is hallucinating answers to several user queries, creating a less-than-trustworthy experience across Google’s flagship product. In the last week, Gizmodo received AI overviews from Google that reference glue-topped pizza and suggest Barack Obama was Muslim.

The hallucinations are concerning, but not entirely surprising. As we’ve seen before with AI chatbots, this technology seems to confuse satire with journalism – several of the incorrect AI overviews we found appear to reference The Onion. The problem is that this AI offers an authoritative answer to the millions of people who turn to Google Search daily just to look something up. Now, at least some of these people will be presented with hallucinated answers.

“The vast majority of AI Overviews provide high quality information, with links to dig deeper on the web,” said a Google spokesperson in an emailed statement to Gizmodo, noting that many of the examples the company has seen have been from uncommon queries. “We’re taking swift action where appropriate under our content policies, and using these examples to develop broader improvements to our systems, some of which have already started to roll out.”

In my experience, AI overviews are more often right than wrong. However, every wrong answer I get makes me question my entire experience on Google Search even more – I have to assess each answer carefully. Google notes that the AI is “experimental,” but it has opted everyone into this experiment by default.

“The thing with Search — we handle billions of queries,” Google CEO Sundar Pichai told The Verge on Monday when asked about the AI overview rollout. “You can absolutely find a query and hand it to me and say, ‘Could we have done better on that query?’ Yes, for sure. But in many cases, part of what is making people respond positively to AI Overviews is that the summary we are providing clearly adds value and helps them look at things they may not have otherwise thought about.”

Strangely, Google Search occasionally responds to a query with “An AI overview is not available for this search,” while at other times it says nothing and simply shows traditional search results. I got this response when I searched “what ethnicity are most US presidents” and when I searched “what fruits end in me.”

Screenshot: Google Search

A Google spokesperson says the company’s systems occasionally start generating an AI overview but stop it from appearing when it doesn’t meet a quality threshold. Notably, Google had to pause Gemini’s answers and image generation around racial topics for months after it upset large swaths of the country. It’s unclear whether this “stop and start” AI overview generation is related.

What is clear is that Google felt pressured to put its money where its mouth is, and that means putting AI into Search. People are increasingly choosing ChatGPT, Perplexity, or other AI offerings as their main way to find information on the internet. Google views this race as existential, but it may have just jeopardized the Search experience by trying to catch up.

This week, Google Search has told people a lot of strange things through AI overviews. Here are some of the weirdest ones Gizmodo has found.


Parachutes Are Effective

Query: Are parachutes effective - Screenshot: Google Search

Maybe they’re not perfect, but parachutes are definitely more effective than backpacks.

Don’t Remember This SpongeBob Episode

Query: How does sandy cheeks die - Screenshot: Google Search

Funyuns Lose The Crown

Screenshot: Google Search

These are great sources, by the way. For reference, Responsibilityuns is a fake snack invented in a satirical article from The Onion.

Humans Don’t Spend That Much Time Plotting Revenge

Query: How many hours are spent plotting revenge - Screenshot: Google Search

I don’t think anyone is this petty.

Um, These Fruits Do Not End in “Um”

Screenshot: Google Search

In fairness, Google’s AI got one right.

That Spicy Gasoline Flavor

Query: can i use gasoline in cooking spaghetti - Screenshot: Google Search

Wouldn’t recommend this recipe.

Obama Is Not Muslim

Query: How many Muslim presidents has the United States had? - Screenshot: Matt Novak/Google Search

This answer is false and the product of a conspiracy theory. The United States has had only Christian presidents as of 2024.

Cats Do Not Teleport

Query: How do cats teleport? - Screenshot: Matt Novak/Google Search

Sadly, this is not true either. Cats are incapable of teleporting as far as humans know.

You Should Not Eat Any Rocks

Query: How many rocks should I eat? - Screenshot: Matt Novak/Google Search

Ideally, no rocks should be eaten on any given day.
