Generative AI chatbots may be cool to many, but the heat (greenhouse gas emissions) and the cost may deflate the hype – a reality check for the bottom line.
The cost of generative AI data center server infrastructure, plus ongoing operating costs, will challenge the business models and profitability of emergent services incorporating this tech.
• Washington Post > “AI chatbots lose money every time you use them. That’s a problem.” by Will Oremus (June 5, 2023) – The cost of operating the systems is so high that companies aren’t deploying their best versions to the public.
- Chatbots lose money on every chat.
- Better chatbot quality costs more. So, ads are probably coming to AI chatbots (but profitability will remain elusive, even with smaller, cheaper models).
- The world’s richest [tech] companies may turn chatbots into moneymakers sooner than the bots are ready for it.
- Companies that buy … AI tools [from companies building the leading AI language models] don’t realize they’re being locked into a heavily subsidized service …
- The intensive computing AI requires is why OpenAI has held back its powerful new language model, GPT-4, from the free version of ChatGPT, which is still running a weaker GPT-3.5 model.
- A single chat with ChatGPT could cost up to 1,000 times as much as a simple Google search.
- Computing requirements also help to explain why OpenAI is no longer the nonprofit it was founded to be.
- Tech giants are willing to lose money in a bid to win market share with their AI chatbots.
- Companies adopting generative AI tools (even with all their flaws) might trim human jobs.
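The per-chat economics above can be sketched with a back-of-envelope estimate. All numbers below are illustrative assumptions for the sake of the arithmetic – they are not figures from the article:

```python
# Back-of-envelope: estimated serving cost of one chatbot conversation
# vs. one conventional web search. All inputs are assumed, not sourced.

def chat_cost(tokens_per_chat: float, cost_per_1k_tokens: float) -> float:
    """Estimated inference cost of one chat, in dollars."""
    return tokens_per_chat / 1000 * cost_per_1k_tokens

SEARCH_COST = 0.0002  # assumed cost of one conventional search, in dollars

# Assume a 2,000-token conversation at $0.03 per 1,000 tokens served.
chat = chat_cost(tokens_per_chat=2000, cost_per_1k_tokens=0.03)

print(f"chat ≈ ${chat:.2f}, about {chat / SEARCH_COST:.0f}x a search")
```

With these assumed inputs the ratio lands in the hundreds; plausible changes to the token count or per-token cost push it toward the article’s “up to 1,000 times” figure. The point is that any per-query cost multiplied across millions of free users adds up fast.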
• Lords of AI – Tech giants and an International Agency > Comment (5/15/2023) – This article discusses a forecast for the industrial cost of AI services – a massive increase, despite ongoing improvements in hardware performance – “As demand for GenAI continues exponentially.”