Google and Bing put nonconsensual deepfake porn at the top of some search results

Nonconsensual deepfake pornography is just a click away on popular search engines like Google and Microsoft’s Bing.

Deepfake pornography often grafts a person’s face into a real pornographic scene — for example, a famous woman’s face will be “swapped” with an adult star’s face, making it appear that the famous woman is nude or engaged in a sexual act.

NBC News found that deepfake pornographic images featuring the likenesses of female celebrities were the first images Google and other top search engines surfaced in searches for many women’s names and the word “deepfakes,” as well as general terms like “deepfake porn” or “fake nudes.” The searches were conducted with safe-search tools turned off.

Legal experts, advocates and victims say nonconsensual deepfake porn has grown into a crisis, and they’re asking tech platforms to step up where laws and law enforcement have yet to take action. A growing number of states have enacted or introduced laws to govern the use of deepfakes, particularly in elections, but nonconsensual deepfake porn has only continued to spread.

NBC News searched the combination of a name and the word “deepfakes” with 36 popular female celebrities on Google and Bing. A review of the results found nonconsensual deepfake images and links to deepfake videos in the top Google results for 34 of those searches and the top Bing results for 35 of them. More than half of the top results were links to a popular deepfake website or a competitor. The popular deepfake website has cultivated a market for nonconsensual deepfake porn of celebrities and private figures.

Googling “fake nudes” returned links to multiple apps and programs to create and view nonconsensual deepfake porn in the first six results, followed by six articles about high school boys allegedly using the technology to create and share deepfake nude images of their female classmates. On Bing, searching “fake nudes” returned dozens of results for nonconsensual deepfake tools and websites before surfacing an article about the harms of the phenomenon.

Bing’s AI chatbot, Copilot, which appears as an option for search results, tells users that it can’t show them deepfake porn. Copilot says, “The use of deepfakes is unethical and can have serious consequences.” But dozens of links to and examples of nonconsensual deepfake porn are a click away on Bing.

The findings underscore the growing prevalence of nonconsensual deepfake pornography and call into question what action technology companies are taking to limit its spread, even as they pour resources into AI initiatives like Google’s Colab and Bard and the products from OpenAI, in which Microsoft is a major investor. Google’s lack of proactive patrolling for abuse has made it and other search engines useful platforms for people looking to engage in deepfake harassment campaigns, according to experts.

Google’s core web results don’t have policies about AI-generated content, but Google said its search features, like panels with selected information, don’t allow manipulated media or sexually explicit content. Google’s Play app store forbids “apps determined to promote or perpetuate demonstrably misleading or deceptive imagery, videos and/or text.” However, Google’s Play Store continues to host an app that has previously advertised the creation of pornographic deepfakes.

Google allows deepfake victims to request the removal of such content from search results through a form, but it isn’t proactively searching for and delisting deepfakes itself. The takedown request page says, “We only review the URLs that you or your authorized representative submit in the form.”

A Google spokesperson said in a statement: “We understand how distressing this content can be for people affected by it, and we’re actively working to bring more protections to Search. Like any search engine, Google indexes content that exists on the web, but we actively design our ranking systems to avoid shocking people with unexpected harmful or explicit content that they aren’t looking for.

“As this space evolves, we’re in the process of building more expansive safeguards, with a particular focus on removing the need for known victims to request content removals one-by-one,” the statement continued.

While Google dominates search engine traffic, alternative search engines like Microsoft’s Bing and the independent search engine DuckDuckGo also feature fake nude images of celebrity women in search results. The content featured in top image search results on Bing includes fake nude photos of former teen Disney Channel actors, and some of the images use pictures of their faces that appear to have been taken before they turned 18. (A reverse image search for some of the faces used in some of the photos returned results that were posted online before the actors turned 18.)

A spokesperson for Microsoft also pointed to a form where victims of nonconsensual intimate imagery (NCII) can report it appearing in Bing search results. In August 2023, Microsoft clarified that it considers sexually explicit deepfakes to fall under its NCII policy.

The Microsoft spokesperson said in a statement: “The distribution of non-consensual intimate imagery (NCII) is a gross violation of personal privacy and dignity with devastating effects for victims. Microsoft prohibits NCII on our platforms and services, including the soliciting of NCII or advocating for the production or redistribution of intimate imagery without a victim’s consent.”

In May, Microsoft President Brad Smith told a group of lawmakers, “We need to take steps to protect against the alteration of legitimate content with an intent to deceive or defraud people through the use of AI,” according to Reuters.

A spokesperson for DuckDuckGo responded to a request for comment about its policies and the images that appeared to feature underage faces and said its primary source for traditional web links and image results is Bing. DuckDuckGo also pointed to its contact information for privacy-related requests.

Adam Dodge, who founded the nonprofit educational organization Ending Tech-Enabled Abuse, has created resources for deepfake victims and victims’ service providers since 2018.

“In my opinion, we couldn’t be making it easier for unsophisticated people to get access to extremely sophisticated tools designed to sexualize and fetishize women and girls online,” he said. “Without the search engines, people would not be accessing the sites and these tools.”

Dodge said he believes the impact of nonconsensual deepfake porn is misunderstood and underestimated, leading big tech companies to fail to take proactive action to stop it. He criticized Google’s response as “shifting the burden to the victim” by making victims find the offending search results and fill out the complaint form.

“It is hard for people to get their minds around the damage that this form of abuse causes, that fake images cause real harm,” Dodge said. “This technology is explosive when it comes to image-based sexual abuse, because it makes victims of everyone. Before, there had to be an authentic image or video.”

Dodge and Carrie Goldberg, a New York lawyer who specializes in sexual privacy cases, both said they believe Google has the technical ability to remove such content from its results, based on its existing ability to remove harmful content like child sexual abuse material and nonconsensual intimate imagery.

“When I first opened my firm in 2014, there were tons of quote ‘revenge porn’ websites. The main way that victims were discovered on those websites was by people Googling their name,” Goldberg said. “It took years for Google to create a policy and a couple years after that required victims to go through this painstaking process of submitting these forms, URL by URL.”

The grueling process Goldberg described for her clients who were victimized by the nonconsensual creation and sharing of real sexually explicit material echoes the process Google has for victims of fake sexually explicit material.

A spokesperson for Google pointed to newer processes for victims of both nonconsensual intimate images and nonconsensual deepfakes. For victims of real nonconsensual images, Google said, there is a way to request that sexually explicit content doesn’t show up in search results for their names at all. For deepfakes, Google said, it has introduced deduplication measures that can remove reuploaded images that had already been flagged. Google said the same deduplication measures exist for other forms of image-based sexual abuse.

With deepfakes, Goldberg said, part of the problem is the ability to rapidly create new material, which Google doesn’t proactively detect and delist.

Goldberg said Google and other search engines could suppress websites that “broker in nonconsensual pornography and deepfakes” from search results, which Google and Bing have done around other issues in the past.

“Search engines are in a position of power. We don’t have a lot of search engines,” Goldberg said. “They can use that power for good or for bad. I want to encourage them to be the hero in this situation.”

This article was originally published on NBCNews.com