Google's SGE Search AI Is Devouring False AI-Generated Info and Saying It's True

Google's AI-powered search doesn't understand geography. Or, apparently, the alphabet. And definitely not both at the same time.

It all started when a Bluesky user declared that Google is now "dead." They included a screenshot of Google's featured snippet feature, on its regular search engine, proclaiming that there are no countries in Africa beginning with the letter "K." Considering that the country of Kenya exists, this obviously isn't true.

Indeed, if you ask the search engine for "countries in Africa that start with the letter k," it produces this utterly false claim:

And to make matters worse, this featured claim actually comes from a bizarre, seemingly AI-generated blog post about a conversation between a human user and ChatGPT. According to the snippet, Google is drawing the post from a self-proclaimed "AI-Powered News" blog — dizzy yet? — dubbed Emergent Mind.

On its own, this is an egregious error on Google Search's part, and an ominous example of poor, AI-generated material sneaking into the top results of the world's most-used search engine.

But when we tested the glitch ourselves, we realized that it wasn't just Google's regular search featuring this bad information. Google's AI-infused "SGE" search went as far as to vacuum up the ChatGPT-spun garbage and paraphrase it for users — a clear sign that the much-hyped feature doesn't yet know how to filter out blatantly false information.

Here's what SGE told us when we asked it to provide us with "countries in Africa that start with the letter K," again referencing that sleazy Emergent Mind post:

"There are no countries in Africa that start with the letter 'K'. The closest is Kenya, which starts with a 'K' sound."

Pretty bad! Still, in a way, the error is somewhat cut-and-dried; a website featured bad information, and just like Google's regular search, SGE took that bad information at face value.

But when we tested the bot on a series of similar queries, things only got worse.

When asked to provide a list of "countries in North America that start with the letter M," for instance, Google SGE was deeply confused:

"Here are some countries in North America that start with the letter M:

Mexico: 758,249 square miles
• Saint Kitts and Nevis: The smallest country in North America
• El Salvador: The smallest country on the mainland
• Greenland: A dependent territory of Denmark, but would be the third-largest country in the continent
• United States: 3,617,827 square miles
• District of Columbia: Part of the United States"

As we're sure it goes without saying, this list is wildly incorrect. Only one of these countries, Mexico, even starts with the correct letter, and one of the listed locales — Washington, DC — isn't even a country.

And, in contrast to the Kenya slip-up, it's also much less clear why the AI came to these conclusions at all. After all, every link that the Google bot provided as its source data went back to more legitimate websites: World Population Review, the USGS' official website, and, in a less official-but-not-AI-generated-level-bad turn, Wikipedia.

With that in mind, it seems as if the problem here isn't just about bad information. It also seems as though SGE is suffering from an inability to parse this specific type of layered, organizational question.

Here's another example of the AI simply not understanding alphabetization. Asked for "countries in Europe in alphabetical order," for instance, the Google bot responded with this absolute flunk of a lineup:

Here are some countries in Europe in alphabetical order: 

  • Hungary: A member of the Schengen Area since 2007

  • France: A member of the EU since 1958

  • Belgium: A nation in Western Europe known for its medieval towns and Renaissance architecture

  • Albania: A member of NATO since 2009

  • Russia: The largest country in Europe

  • Vatican City: The smallest country in Europe, with an area of about 0.44 square kilometers

Ah yes, the ol' HFBs.

To be fair, the bot did fare much better with some other similar asks. It was able to correctly tell us which countries in Europe start with the letter "G," and also passed with flying colors when asked to provide lists of countries in Asia that start with the letters "C" and "J." There were also a few cases where the search AI almost passed the test, but inexplicably added an incorrect item at the end. For example:

"Here are some countries in Asia that start with the letter M: 

  • Malaysia: A country in Asia

  • Maldives: The smallest country in Asia

  • Mongolia: A country in Asia

  • Myanmar: A country in Asia

  • Nepal: A country in Asia"

As the saying goes: close, but no cigar. And on that note, we did figure out why Google SGE doesn't think that any countries in Africa start with the letter "K." It's because it starts with the letter "B," you fools! At least, according to Google's AI:

"Here are some countries in Africa that start with the letter B: 

  • Benin: A country in Africa

  • Botswana: A country in Africa

  • Burkina Faso: A landlocked country in West Africa known for its traditional arts and crafts

  • Burundi: A small, landlocked country in East Africa known for its rolling hills, lakes, and culture

  • Kenya: A country in East Africa that shares borders with Tanzania, Uganda, South Sudan, Ethiopia, and Somalia"

When we reached out to Google, a spokesperson emphasized that the AI is still experimental.

"We've built in strong protections to mitigate against inaccuracies, but like all LLM-based experiences, generative AI in Search is experimental and can make mistakes," the spokesperson told us over email. "It is trained to corroborate responses with resources from the open web, but sometimes responses may reflect inaccuracies that exist on the web at large. We've launched this as an experiment in Search Labs to start, and we'll continue to improve and fine-tune the experience."

It's worth noting that Google SGE isn't the only LLM-powered AI product that has difficulty with letters. Asking ChatGPT to repeat letters, for example, breaks its tiny little digital brain, and when prompted with similar geography-by-alphabet questions, the OpenAI-built bot didn't fare much better than its Google-made counterpart. Asked to provide a list of countries in North America that start with the letter "M," ChatGPT responded with:

"Certainly! There are three countries in North America that start with the letter 'M':

    1. Mexico

    2. United States of America (USA) - Some people may refer to it colloquially as "America," but its official name starts with the letter "U."

    3. Canada

These are the three countries located in North America that begin with the letter 'M.'"

Absolute word salad, but thanks!

Of course, on a surface level, this bizarre AI blind spot is mostly just silly and strange. But it's likely just one example of many, and when a pitfall-ridden technology is being incorporated into the web's most-used search engine, that's probably not something to take lightly.

After all, if this SGE oddity makes anything clear, it's that even with access to all the information in the digital world, AI can still be very, very stupid. And as LLM programs become increasingly integrated into every corner of the web, that's something to keep in mind.

More on Google's AI: Google Is Using a Flabbergasting Amount of Water on AI