July 24th, 2024

OpenAI is set to lose $5B this year

OpenAI's projected costs for 2024 are $7 billion, with a potential $5 billion loss. Revenue from ChatGPT is about $2 billion annually, indicating a significant financial shortfall.


OpenAI's training and inference costs are projected to reach $7 billion in 2024, with the company potentially facing a $5 billion loss. As of March, OpenAI was expected to spend nearly $4 billion on Microsoft Azure servers to run inference workloads for ChatGPT. The company operates approximately 350,000 servers equipped with Nvidia A100 chips, around 290,000 of them dedicated to ChatGPT and running at near full capacity. Training ChatGPT and new models could add another $3 billion this year. OpenAI benefits from discounted rates from Microsoft, paying about $1.30 per A100 server per hour.

The workforce, now around 1,500 employees, is estimated to cost $1.5 billion, far above the roughly $500 million initially projected for 2023.

On the revenue side, OpenAI generates about $2 billion annually from ChatGPT and anticipates nearly $1 billion from access to its large language models (LLMs). Recent figures show monthly revenue of $283 million, suggesting full-year sales between $3.5 billion and $4.5 billion. The resulting shortfall means OpenAI will likely need to raise additional funding within the next year to cover operating costs and losses.
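The figures above can be sanity-checked with back-of-the-envelope arithmetic. The sketch below simply multiplies the reported server count by the reported hourly rate; every input is an estimate quoted in the article, not a confirmed figure.

```python
# Back-of-the-envelope check of the reported OpenAI cost estimates.
# All inputs are the article's estimates, not confirmed figures.

A100_SERVERS = 350_000       # total A100 servers OpenAI reportedly runs
RATE_PER_HOUR = 1.30         # discounted Azure rate per A100 server per hour
HOURS_PER_YEAR = 24 * 365

inference_cost = A100_SERVERS * RATE_PER_HOUR * HOURS_PER_YEAR
print(f"Estimated annual inference spend: ${inference_cost / 1e9:.2f}B")
# ~$3.99B, consistent with the "nearly $4 billion on Azure" figure

training_cost = 3.0e9        # reported training cost estimate
payroll_cost = 1.5e9         # reported workforce cost estimate
total_cost = inference_cost + training_cost + payroll_cost

revenue_low, revenue_high = 3.5e9, 4.5e9
print(f"Implied loss: ${(total_cost - revenue_high) / 1e9:.2f}B "
      f"to ${(total_cost - revenue_low) / 1e9:.2f}B")
```

The high end of the implied loss range lands right around the $5 billion figure in the headline.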

Related

AI industry needs to earn $600B per year to pay for hardware spend

The AI industry faces challenges to meet $600 billion annual revenue target to offset massive hardware investments. Concerns arise over profitability gap, urging companies to innovate and create value for sustainability.

AI models that cost $1B to train are underway, $100B models coming

AI training costs are rising exponentially, with models now reaching $1 billion to train. Companies are developing more powerful hardware to meet demand, but concerns about societal impact persist.

OpenAI slashes the cost of using its AI with a "mini" model

OpenAI launches GPT-4o mini, a cheaper model enhancing AI accessibility. Meta to release Llama 3. Market sees a mix of small and large models for cost-effective AI solutions.

AI paid for by Ads – the GPT-4o mini inflection point

OpenAI released the gpt-4o mini model at $0.15 per 1M input tokens and $0.60 per 1M output tokens, enabling cost-effective AI content creation. Despite low costs, profitability per page view remains minimal. Future AI-generated blogs prompt discussions on the internet's evolution.

XAI's Memphis Supercluster has gone live, with up to 100,000 Nvidia H100 GPUs

Elon Musk has launched xAI's Memphis Supercluster, billed at up to 100,000 Nvidia H100 GPUs for AI training, with advancements targeted by December. Its actual online status is unclear; SemiAnalysis estimates about 32,000 GPUs are operational. Plans for a 150MW data center expansion are pending utility agreements. xAI is partnering with Dell and Supermicro, targeting full operation by fall 2025. Musk's tongue-in-cheek choice of launch time was also noted.
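The gpt-4o mini prices quoted above ($0.15 per 1M input tokens, $0.60 per 1M output tokens) make the per-page economics easy to sketch. The token counts below are illustrative assumptions, not figures from the article.

```python
# Illustrative cost of generating one blog page with gpt-4o mini,
# at the per-token prices quoted above. Token counts are assumed.

INPUT_PRICE = 0.15 / 1_000_000    # dollars per input token
OUTPUT_PRICE = 0.60 / 1_000_000   # dollars per output token

prompt_tokens = 500               # assumed prompt size
page_tokens = 1_000               # assumed length of one generated page

cost_per_page = prompt_tokens * INPUT_PRICE + page_tokens * OUTPUT_PRICE
print(f"Cost per generated page: ${cost_per_page:.6f}")
```

At these assumed sizes a page costs well under a tenth of a cent to generate, which is why ad-supported AI content becomes plausible at this price point.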

8 comments
By @JCM9 - 9 months
The business model for GenAI companies is still very unclear. The tech is super interesting and can be quite useful, but this space has turned into an arms race that’s led to mass commoditization of said arms. The tech world benefits from the very interesting advancements of the last few years, but commercially this all seems to be setting up a big train wreck.
By @jsheard - 9 months
If it's true that Apple isn't going to pay OpenAI a dime for the Siri queries they hand off to ChatGPT then things are surely only going to get worse in the short term. That's potentially a huge strain on their systems which they will have to subsidise out of pocket.

https://fortune.com/2024/06/13/apple-not-paying-openai-chatg...

By @xnx - 9 months
More discussion here (70 comments): https://news.ycombinator.com/item?id=41058616
By @ChrisArchitect - 9 months
By @ffhhj - 9 months
Are you telling me the alt man crying in front of Congress didn't make them build that moat?
By @Legend2440 - 9 months
This article is just a (free) rephrasing of the (paywalled) article by theinformation.

This is exactly what the news industry is super concerned that AI will do, except done by humans and the news industry themselves.

By @andrewstuart - 9 months
5000 employees paid $1,000,000 each.