August 4th, 2024

Nvidia reportedly delays its next AI chip due to a design flaw

Nvidia has delayed production of its "Blackwell" B200 AI chips due to a design flaw, pushing shipments to early next year. Major clients have placed substantial orders, totaling billions.

Nvidia has announced a delay in the production of its upcoming "Blackwell" B200 AI chips due to a design flaw identified late in the production process. This delay is expected to extend the timeline for large-scale shipments by at least three months, pushing the availability of the chips into the first quarter of next year. The B200 chips are intended to follow the highly sought-after H100 chips, which have significantly contributed to Nvidia's market value. Reports indicate that major clients, including Microsoft, Google, and Meta, have placed substantial orders for these chips, amounting to tens of billions of dollars.

Nvidia is currently conducting additional test runs with Taiwan Semiconductor Manufacturing Company to address the design issues. The company had previously stated that Blackwell-based products would be available from partners starting in 2024, and it aims to establish a yearly release cycle for new AI chips. Despite the setback, Nvidia remains optimistic about ramping up production in the second half of the year.

AI: What people are saying
The comments reflect a mix of concerns and speculations regarding Nvidia's delayed AI chip production.
  • Some commenters express hope for innovation from smaller systems rather than large clusters.
  • There are questions about the implications of design flaws, with some suggesting they could be a cover for market dynamics or competition issues.
  • Concerns about export restrictions and the potential impact on Nvidia's market strategy are raised.
  • Several users speculate on the future of AI technology and the demand for Nvidia chips, particularly in relation to competitors.
  • Some comments hint at a broader trend of delays and defects in the tech industry, suggesting a potential "AI bust."
17 comments
By @TheAlchemist - 9 months
I like to think (hope) that the next breakthrough will come not from these huge clusters, but from somebody tinkering with new ideas on a small local system.

I also wonder: is compute the main limiting factor today? Let's imagine an unlimited number of Nvidia chips were available right now and energy were cheap. Would a cluster 100x the size of the current biggest one yield a significant improvement? My naive intuition is that it would not.

By @jsemrau - 9 months
Just make smaller chips with more VRAM. Then consumer PCs could run local models much more easily, on top of their server GPU market.
By @breadwinner - 9 months
The next big thing that will drive demand for Nvidia chips is AI search (https://openai.com/index/searchgpt-prototype/). To avoid being made obsolete, Google and Microsoft Bing have to spend big on Nvidia hardware, and when Nvidia releases newer chips with lower power consumption, OpenAI, Google, and Microsoft will be forced to lap them up.
By @nextworddev - 9 months
The real risk is that a 3-month delay easily turns into 6 months. That said, there will be some relief rally at some point on Fed cut hopes.
By @2OEH8eoCRo0 - 9 months
I'm interested in how they handle export restrictions. The RTX 4090 is already banned for export to China (Beijing), but what happens when the mid-range is banned for export because its performance is above the 4090's?

Gina is going to be busy.

By @linotype - 9 months
Does anyone know when the next GPU (RTX) is expected? Seems like it’s also delayed.
By @seydor - 9 months
Is the design flaw that the previous ones are selling too well?
By @lowbloodsugar - 9 months
Translation: “There’s no competition. You sops can keep paying ridiculous amounts for our old tech, and we’ll just sit on this new one until it’s needed to compete.”
By @Avisan - 8 months
Could this impact AI development timelines for major tech players?
By @bravetraveler - 9 months
Ah, yes, competition in the market. One company - Intel - drops the ball so badly that the other two can relax, releasing more of the same.
By @meroes - 9 months
Cover for AI winter?
By @paulproteus - 9 months
What if the design flaw makes AIs unsafe? This would be a fun sci-fi piece.
By @xyst - 9 months
Intel chips are defective. AMD has a delay due to late-game QA concerns (or is playing release games). Now NVDA "AI chips" are faulty.

AI bust is coming

By @bzmrgonz - 9 months
I think they're calling the bug or design flaw "intel sabo".
By @notarealllama - 9 months
I am looking forward to consumer / producer grade Tensor Processing Units and holding off on a desktop / server purchase until we maybe see something like this.

With inference tasks it'd be nice to have something not as performance-heavy, with 24 GB of VRAM or more.

I know there are tricks in CUDA to use system RAM as a proxy for VRAM, but I've had limited success duplicating this across various setups.

By @bottlepalm - 9 months
Given how spooky AI is getting I'm kind of relieved. Hopefully there really is a flaw, if not then extra spooky.

AI-Made Bioweapons Are Washington’s Latest Security Obsession (https://archive.ph/oROPO)