Opinion | OpenAI is selling its new chatbot as a flirty and obedient female companion

The artificial intelligence company OpenAI is rolling out the newest ChatGPT model, which has the capacity to use voice capabilities to hold conversations with users in real time. The voice technology is shockingly sophisticated — it responds to user speech by convincingly mimicking the speed, emotional inflections and idiomatic language of a human. The chatbot also has the capacity to recognize objects and images in real time, and, in demonstrations, OpenAI developers propped up their phone and prompted the chatbot to comment on the user’s surroundings as if it were video-chatting with a friend.

OpenAI’s unveiling of GPT-4o has also generated buzz — and raised eyebrows — because the company has marketed it as a flirty, female companion. OpenAI CEO Sam Altman posted the word “her” on X ahead of the unveiling, an apparent reference to the 2013 film “Her” directed by Spike Jonze. In that film, Joaquin Phoenix, who plays a lonely writer going through a divorce, falls in love with a charming, superintelligent AI personal assistant voiced by Scarlett Johansson. One can’t help but notice that GPT-4o’s voice sounds a bit like Johansson’s. During the demonstrations, GPT-4o was consistently playful, giggly and even complimented users’ appearance. In the middle of solving an algebra problem for a user, it said, “Wow, that’s quite the outfit you’ve got on.” The comment was so overtly provocative that news reports have described the interaction as “flirtatious-sounding” and as a “come-on.”

It’s all a little creepy, and it raises questions about whether the development of this kind of technology will prey on human vulnerabilities and reinforce some of our worst instincts as a society.

Altman is inviting the public to crave a world like the one depicted in “Her.” But it’s not exactly a happy story. “Her” is a spooky tale that illustrates how advanced AI is an inadequate salve for loneliness. Phoenix’s character has verbal sexual encounters with his AI but is unable to have a physical connection with it. He believes he has a unique romantic connection with the voice played by Johansson but discovers that “she” is in fact having conversations with thousands of other users simultaneously — and falling in love with many of them, too.

At the end of the film, the Johansson-voiced bot leaves Phoenix’s character to venture off with other AIs that can operate at its computational speed, and the human character is left bereft, forced to seek fulfillment in the real world with other humans. Viewers may differ on whether his detour away from humans was a net good, but the movie showcases the limitations and diversions of connecting with AI instead of real people.

GPT-4o is nowhere near as advanced as the AI character in “Her,” but it’s not hard to see how people who don’t understand how it works — particularly if they’re emotionally vulnerable — may be prone to projecting sentience onto the chatbot and seeking substantial companionship from it. (And if not now, then at some point in the not-too-distant future, given the breakneck pace of innovation.) Some may be hopeful about the idea of bots providing some kind of companionship to people, but we as a society have been flat-footed in educating people about how these tools work and the trade-offs they present.

What we do know is that it is in Big Tech’s interest to have people turn to its products instead of toward each other. And we do know that these companies’ profit margins benefit from convincing us to use bots as a Band-Aid solution for feeling lonely or abandoned, instead of working together to counteract the kinds of political, economic and social forces that corrode community, limit our freedom and incentivize addiction to screens.

GPT-4o’s coquettish female voice also raises questions about whether this technology is insidiously reinforcing patriarchal gender norms. We should pause and reflect on the mass production of what may be the most humanlike AI voice technology to date taking on the sonic qualities of a flirty woman whose job it is to meekly take commands, permit endless interruption of its speech without complaint and then reward the user with unfailing affection — and borderline sexualized attention. Those might have been the expectations male executives had of their personal assistants in the 1950s, but it’s not where we are as a society today. We should be wary of the kinds of fantasies OpenAI wants to nurture — and question whether they’re really taking us forward.

This article was originally published on MSNBC.com