Friday, October 4, 2024


OpenAI’s ChatGPT Costs Up to $700,000 a Day to Run, Microsoft May Develop AI Chips to Assist


According to a report by Dylan Patel, chief analyst at the research firm SemiAnalysis, running OpenAI’s ChatGPT costs roughly $700,000 per day, or about 36 cents per query. The popular AI language model requires massive amounts of computing power to generate responses to user prompts. That expense has fueled speculation that Microsoft, one of OpenAI’s primary partners and investors, may be developing proprietary AI chips to help keep ChatGPT running.
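A quick back-of-envelope check shows what those two reported figures imply together — dividing the daily cost by the per-query cost gives the daily query volume the estimate assumes (an illustrative calculation only, not a figure from the report):

```python
# Figures as reported by SemiAnalysis; the derived query volume is
# our own illustrative arithmetic, not a number from the report.
DAILY_COST_USD = 700_000       # reported daily compute cost
COST_PER_QUERY_USD = 0.36      # reported cost per query

implied_queries_per_day = DAILY_COST_USD / COST_PER_QUERY_USD
print(f"Implied queries per day: {implied_queries_per_day:,.0f}")
# → Implied queries per day: 1,944,444
```

At those rates, the estimate corresponds to just under two million queries a day.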

High Traffic and Capacity Issues

ChatGPT reached 100 million active users in January, a milestone that took other tech brands years to achieve. That rapid growth, however, brought heavy traffic and server-capacity problems, slowing the service and even causing outages. To address these issues, OpenAI introduced a $20-per-month ChatGPT Plus tier. The company currently runs the service on Nvidia GPUs and is expected to need an additional 30,000 GPUs in 2023 alone to sustain its commercial performance.
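Combining the $20 subscription with the reported 36-cent per-query cost gives a rough sense of when a Plus subscriber stops covering their own compute — a sketch using only the article’s figures, ignoring any per-query cost differences between free and paid tiers:

```python
# Rough breakeven for a ChatGPT Plus subscriber, assuming the
# reported flat 36-cent cost per query applies (a simplification).
SUBSCRIPTION_USD = 20.0        # monthly ChatGPT Plus price
COST_PER_QUERY_USD = 0.36      # reported cost per query

breakeven_queries = SUBSCRIPTION_USD / COST_PER_QUERY_USD
print(f"Breakeven: about {breakeven_queries:.0f} queries per month")
# → Breakeven: about 56 queries per month
```

Under that simplification, a subscriber making more than roughly 56 queries a month costs OpenAI more in compute than the subscription brings in.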

High Operational Costs for Companies Using OpenAI’s Language Models

Companies building on OpenAI’s language models have been paying steep prices for years. For instance, Latitude, the startup behind an AI dungeon game that generates storylines from player prompts, reportedly spent $200,000 a month in 2021 to run the model and cover its Amazon Web Services server bills. The expense prompted Latitude’s CEO, Nick Walton, to switch to language software backed by AI21 Labs, cutting his company’s AI costs in half, to $100,000 a month.

Microsoft’s Potential Role in Reducing ChatGPT’s Operational Costs

As a primary collaborator and investor, Microsoft may be developing hardware to reduce ChatGPT’s operational costs. The tech giant is reportedly working on an AI chip, code-named Athena, that is currently in internal testing. The chip, slated for introduction in Microsoft’s Azure AI services next year, could reduce ChatGPT’s reliance on Nvidia GPUs and lower costs. It’s unclear when, or whether, the chip will be deployed for OpenAI and ChatGPT, but the close ties between the two companies make it a real possibility.

Impact on the AI Industry

If Microsoft successfully develops and deploys AI chips to offset ChatGPT’s operational costs, the impact on the AI industry could be significant. Cheaper alternatives to expensive GPUs would lower the barrier for companies to invest in AI technology and services, spurring innovation and competition in the market and ultimately benefiting both businesses and consumers.
