Pay to AI
There's an incredible amount of hype surrounding the tech these days, and the investments keep pouring in. But as they currently stand, chatbots have an Achilles heel that could turn them into a major headache in the long run, as The Washington Post reports: they cost a huge amount of money to run.
Every time a user asks these chatbots something, it costs the companies running them money. In plain English, they're extremely expensive to operate, with no clear revenue model in sight.
In other words, the future of generative AI is as uncertain as ever, with companies racing to rein in soaring costs while attracting new users — who in turn cost even more to serve.
Even with subscription-based models like OpenAI's $20-per-month ChatGPT Plus plan, users are limited to just 25 messages every three hours, highlighting just how computing-intensive the process is.
It's a reality AI companies are painfully aware of, especially as supplies of much-needed computer chips — graphics processing units (GPUs) in particular — dwindle.
"We try to design systems that do not maximize for engagement," OpenAI CEO Sam Altman told members of Congress during a Senate hearing last month. "In fact, we’re so short on GPUs, the less people use our products, the better."
As costs soar, even some of the biggest players in tech, Google among them, are choosing to scale down their operations by focusing on large language models far smaller than others currently available, according to WaPo.
Where that leaves the future of the market remains to be seen.
"Eventually you come to the end of the hype curve, and the only thing your investors are going to look at, at that point, is your bottom line," Tom Goldstein, a computer science professor at the University of Maryland, told the WaPo.
Nonetheless, companies will likely keep trying.
"Even though it's expensive," Goldstein added, "it’s still far less expensive than human labor."
More on AI chatbots: OpenAI Competitor Says Its Chatbot Has a Rudimentary Conscience