Users Report Google’s AI Overview Thinks Tons of Fictional Characters Are Gay

Jakub Porzycki/NurPhoto via Getty Images

Listen, we’re big fans of underrated and lesser-known LGBTQ+ fictional characters here at Them, but we have to be the bearers of bad news: despite what Google’s new “artificial intelligence” bot might have told you, queer Star Wars icon Slurpy Faggi does not actually exist.

Google has spent the last few weeks rolling out a suite of new AI tools including “AI Overview,” which theoretically uses a generative large language model (LLM) to summarize top search results. Users immediately started reporting major problems with the tool, including results that encouraged them to eat rocks and glue, contained instructions to create mustard gas instead of cleaning solution, and speculated that people with dark skin could stare into the sun for up to 30 minutes.

It was only a matter of time before AI Overview’s wild assertions affected search results for pop culture, too. This week, X (formerly Twitter) user @computer_gay shared an image of his search results for the query “are there gay star wars characters,” which began with an AI Overview response informing him of Star Wars’ long-forgotten gay pioneer: Slurpy Faggi, alleged to be in a committed relationship with his boyfriend, Dr. Butto.

The screen capture is of dubious authenticity, with some on social media pointing out that because the phrase “Slurpy Faggi” has never appeared online before, it’s unlikely Google’s LLM could have pulled it from a website. Despite Slurpy Faggi and Dr. Butto absolutely sounding like monikers J.K. Rowling would give to gay Star Wars characters, neither of these two characters with parodically homophobic names actually exists (outside of the many, many memes that have already been created celebrating the Faggi-Butto union). As fans of the series may be aware, despite the films’ general lack of LGBTQ+ representation, official Star Wars video games and comic books have introduced several queer and nonbinary characters.

AI Overview has, however, apparently dabbled in some Pokémon and Mario Kart-themed shitposting. In results identifying LGBTQ+ characters in those series (which seem to be cribbed largely from humor articles originally published in queer publications like Autostraddle and Out), users reported that Overview tried to help by offering representation like Koopa Troopa (“a trans man who was dishonorably discharged from the military”) and Bulbasaur (“a plant-loving queer who often says ‘Mother Earth’”). (Once again, as fans may be aware, both the Mario series and Pokémon do contain canonically LGBTQ+ characters.)

These results aren’t always replicable, and some users have posted images of AI Overview results with different, more accurate findings using the same search terms. But the problem of Overview drawing false information from humorous material is far-ranging. My own brief test searches earlier this week showed that Overview will sometimes assert that “piss is a vegetable” with a “salty flavor,” based on a 2023 post on Cohost.org (which itself was lampooning the social media site’s disproportionate ranking in Google search results).

Tempting as it may seem, though, don’t jump straight to Google and try to generate even more wacky search results in hopes of becoming a meme lord. Overview’s “I definitely read the book for this report” statements lose a lot of their charm when compared against the technology’s massive environmental and ethical harms. One report last year estimated that even a five-query “conversation” with OpenAI’s ChatGPT can consume around 16 ounces of water, and tech companies including Google and Microsoft have reported dramatic increases in global water usage associated with their race to develop their own AI tools (such as Recall, a forthcoming feature on Microsoft’s Copilot+ computers that takes frequent unprompted screenshots of your activity). Meanwhile, information about the data used to train these programs remains unclear but concerning; an investigation by the Stanford Internet Observatory last year found that hundreds of pieces of child sexual abuse material were included in LAION-5B, a dataset that was later used to train popular generative tools like Stable Diffusion.

Thankfully, using Google’s Web search — rather than the default “All” results setting — seems to be a good way to avoid Overview altogether, at least for now. So long, Slurpy.


Originally Appeared on them.