October 31st, 2024

Sam Altman says lack of compute is delaying the company's products

OpenAI CEO Sam Altman said the company's product development is being held back by limited compute capacity, delaying features for ChatGPT and DALL-E; OpenAI is working with Broadcom on an AI chip expected by 2026.


OpenAI CEO Sam Altman has acknowledged that the company's product development is being hindered by a lack of compute capacity. In a recent Reddit AMA, he explained that the complexity of their models and the need to make difficult decisions regarding compute allocation are significant challenges. Reports indicate that OpenAI has struggled to secure sufficient infrastructure for training its generative models. The company is collaborating with Broadcom to develop an AI chip, which may be available by 2026.

As a result of these limitations, features like the vision capabilities for ChatGPT's Advanced Voice Mode have been delayed. Altman also mentioned that there is no set timeline for the next release of the DALL-E image generator, and the video-generating tool Sora is facing technical setbacks. Additionally, OpenAI is contemplating the inclusion of NSFW content in ChatGPT in the future. Altman emphasized that improving the reasoning models remains a top priority, with several new features expected to be announced soon, although he clarified that there will be no release labeled as GPT-5.

- OpenAI's product delays are primarily due to insufficient compute capacity.

- The company is working with Broadcom on an AI chip expected by 2026.

- New capabilities for ChatGPT's Advanced Voice Mode, DALL-E, and Sora are delayed by compute constraints and technical setbacks.

- OpenAI is considering allowing NSFW content in ChatGPT in the future.

- The focus remains on enhancing reasoning models and upcoming features.

6 comments
By @kibwen - 6 months
"Elizabeth Holmes says lack of working blood tests is delaying the company's products."

If you're spending (and losing) tens of billions a year and still can't afford a fundamental input to your product, then that's a sign that your company might not be viable given the present state of technology.

By @fxtentacle - 6 months
My reading on this is that they refuse to allow public access to newer models because inference is much more expensive than the $20 per month that the market is willing to pay. That's why they are working on dedicated inference hardware, and that's why they are still losing money: they are selling access to their AI for less than what it costs to operate. Releasing more resource-intensive products (without raising prices) would just increase their losses, and with investor money being limited, they are forced to hold back new products.
By @_fat_santa - 6 months
What a convenient and self-flattering excuse.
By @JSDevOps - 6 months
They literally have access to anything they want; all they need to do is pick up the phone and people would be tripping over themselves to provide compute services … for a price, of course.
By @bokohut - 6 months
And if he owned all that compute at this very moment, his statement would instead read: "Sam Altman says lack of grid power generation is delaying the company's products."