A Dumb Reality of ChatGPT Could Spell Big Trouble for OpenAI

Pay to AI

The AI chatbot wars are well underway, with OpenAI and a host of would-be competitors, large and small, vying for the top spot in the industry. There's an incredible amount of hype surrounding the tech these days, and the investments keep pouring in. But as they currently stand, chatbots have an Achilles' heel that could turn them into a major headache in the long run, as The Washington Post reports: they cost a huge amount of money to run.

Every time a user queries one of these chatbots, it costs the company running it money. In plain English, they're extremely expensive to operate, with no clear revenue model in sight. In other words, the future of generative AI is as uncertain as ever, with companies racing to rein in soaring costs while attracting new users, who in turn cost even more to serve.

Chips Ahoy

Even with subscription offerings like OpenAI's $20-per-month ChatGPT Plus plan, users are limited to just 25 messages every three hours, highlighting just how computing-intensive the process is. It's a reality AI companies are painfully aware of, especially as supplies of much-needed computer chips, graphics processing units (GPUs) in particular, dwindle.

"We try to design systems that do not maximize for engagement," OpenAI CEO Sam Altman told members of Congress during a Senate hearing last month. "In fact, we're so short on GPUs, the less people use our products, the better."

"GPUs at this point are…"
