According to an analysis by The Information based on internal financial data, OpenAI's revenue this year is expected to reach $1.3 billion, while its losses could be as high as $5 billion. Anthropic also faces losses running into the billions of dollars.
The Information reported that OpenAI's cost of training AI models and running inference could reach $7 billion, and inference costs are expected to rise further now that Apple has introduced ChatGPT integration. In addition, personnel costs could reach $1.5 billion.
OpenAI spent nearly $4 billion on renting Microsoft's servers alone, even though it received a discounted compute rate (about $1.30 per Nvidia A100 chip per hour). This supports the view that Microsoft's interest in AI investment is driven mainly by growth of its Azure cloud platform; by comparison, Microsoft's own AI products (such as Copilot or the Bing integration) have not performed as well.
OpenAI's AI training costs, including payments for data, could rise to $3 billion this year. The company currently employs about 1,500 people and plans to expand further; according to The Information, personnel expenses could reach $1.5 billion by the end of the year. In total, OpenAI's operating costs this year could reach $8.5 billion against revenue of $3.5 billion to $4.5 billion, depending on sales in the second half of the year, which implies a loss of roughly $4 billion to $5 billion.
Anthropic is in worse shape, despite being smaller. Sources familiar with the numbers say Anthropic expects to spend more than $2.7 billion this year, while generating revenues one-fifth to one-tenth of OpenAI’s. The startup’s computing costs alone are estimated at $2.5 billion.
By the end of the year, Anthropic expects to have annualized revenue of about $800 million, or $67 million per month. However, Anthropic must share this revenue with Amazon.
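The figures reported above are internally consistent, as a quick back-of-the-envelope check shows (all numbers are from The Information's reporting as cited in this article; the calculation itself is only illustrative):

```python
# Sanity check on the reported figures (all amounts in billions of USD).
openai_costs = 8.5                 # projected OpenAI operating costs
openai_revenue = (3.5, 4.5)        # projected OpenAI revenue range

# Implied loss range: costs minus each end of the revenue range.
openai_loss = tuple(openai_costs - r for r in openai_revenue)
print(openai_loss)                 # (5.0, 4.0) -> the reported "up to $5 billion" loss

anthropic_annualized = 0.8         # ~$800 million annualized revenue
anthropic_monthly = anthropic_annualized / 12
print(round(anthropic_monthly * 1000))  # ~67 (million USD per month)
```

This matches the loss of up to $5 billion cited for OpenAI and the roughly $67 million per month cited for Anthropic.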
With Meta backing open-source models, and smaller companies such as Mistral and Cohere gaining ground in Europe or in specific market segments (such as B2B chat over business data), providers bearing the high development and operating costs of AI models face fierce competition. Enterprises also have difficulty measuring the value generative AI adds to their processes when deploying chatbot systems as a "general-purpose technology", especially when products such as Microsoft's Copilot or OpenAI's ChatGPT Enterprise lack a clear use case for every employee.
Doubts are beginning to emerge about the economic viability of the current AI market. This does not negate the overall value of generative AI, but it does raise the question of whether the investment is proportional to the benefits.
Potential growth areas include OpenAI's new product SearchGPT, though whether it can replicate ChatGPT's success is uncertain. Competing products such as Google's Gemini subscription service have failed to make a significant impact; ChatGPT may prove to be an outlier.
More versatile multimodal models could create new use cases, leading to new applications, increased usage, and higher revenues. If efficiency is improved at the same time, profit margins could eventually improve. However, many questions remain about the ultimate quality of these features and the cost of generating multimodal content such as video.
To reach the next level, the AI market will likely need major breakthroughs in scaling general-purpose reasoning capabilities. This would open up new automation and business opportunities and could potentially solve fundamental problems with current AI systems, such as hallucinated, nonsensical output.
For OpenAI CEO Sam Altman and others, this may be the ultimate bet, which explains why major companies continue to invest billions in research and development. As Google CEO Sundar Pichai said on a recent earnings call: "Here, our risk of underinvesting is far greater than our risk of overinvesting."