AI chipmaker Cerebras files for IPO to take on Nvidia
Cerebras Systems has filed for an IPO under the ticker "CBRS" on Nasdaq, reporting a $66.6 million net loss in H1 2024, while competing with Nvidia and others in AI chips.
Artificial intelligence chipmaker Cerebras Systems has filed for an initial public offering (IPO) and plans to trade under the ticker symbol "CBRS" on the Nasdaq. The company reported a net loss of $66.6 million in the first half of 2024, with sales of $136.4 million, compared to a net loss of $77.8 million and $8.7 million in sales during the same period in 2023. Cerebras, which competes with Nvidia and other tech giants like AMD and Intel, offers its WSE-3 chip, which boasts more cores and memory than Nvidia's H100. The company also provides cloud-based services utilizing its computing clusters. In 2023, Cerebras had a total net loss of $127.2 million on revenues of $78.7 million. A significant portion of its revenue comes from Group 42, a UAE-based AI firm. The IPO comes amid a challenging technology market, with higher interest rates affecting investor sentiment. Citigroup and Barclays are leading the offering, while major investors include Foundation Capital and notable individuals like OpenAI CEO Sam Altman. Cerebras was founded in 2016 and has previously been valued at over $4 billion.
- Cerebras Systems has filed for an IPO, aiming to trade as "CBRS" on Nasdaq.
- The company reported a net loss of $66.6 million in the first half of 2024.
- Cerebras competes with Nvidia and other major tech firms in the AI chip market.
- A significant portion of its revenue comes from a partnership with UAE-based Group 42.
- Citigroup and Barclays are leading the IPO offering.
Related
UK chip giant ARM developing GPU in Israel
UK chip company ARM is developing a GPU in Israel to compete with Nvidia and Intel, focusing on licensing technology and collaborating with startups to enhance AI capabilities while exploring new growth avenues.
Cerebras Inference: AI at Instant Speed
Cerebras launched its AI inference solution, claiming to process 1,800 tokens per second, outperforming NVIDIA by 20 times, with competitive pricing and plans for future model support.
Cerebras reaches 1800 tokens/s for 8B Llama3.1
Cerebras Systems is deploying Meta's LLaMA 3.1 model on its wafer-scale chip, achieving faster processing speeds and lower costs, while aiming to simplify developer integration through an API.
Cerebras Launches the Fastest AI Inference
Cerebras Systems launched Cerebras Inference, the fastest AI inference solution, outperforming NVIDIA GPUs by 20 times, processing up to 1,800 tokens per second, with significant cost advantages and multiple service tiers.
The thrill of AI is fading – and Wall Street is getting clear-eyed about value
Nvidia reported a 122% sales increase and doubled profits, but its stock fell 7% as investors reassess AI's revenue potential amid growing competition and waning excitement around the technology.
- Many commenters question the viability of Cerebras competing against established players like Nvidia, Intel, and AMD, especially given its significant net loss.
- There are discussions about the technical advantages of Cerebras' wafer-scale chips, but skepticism remains about their performance and market adoption.
- Concerns are raised about the timing of the IPO, with some suggesting that the company's financials do not justify going public.
- Commenters highlight the need for Cerebras to improve its software and developer relations to succeed.
- Some express doubt about the company's long-term prospects, citing a lack of a competitive moat and the challenges of wafer-scale integration.
On the other hand, Nvidia is worth $3 trillion, so they can sell investors a pretty good dream of what success looks like.
Personally I would expect them to get a valuation well above the $4 billion from the 2021 round, despite the financials not coming close to justifying it.
"The documents also say that a single customer, G42, accounted for 83% of revenue in 2023 and 87% in the first half of 2024."
https://www.eetimes.com/cerebras-ipo-paperwork-sheds-light-o...
Their idea is to have 44 GB of SRAM per chip. SRAM is _very_ expensive compared to DRAM (about two orders of magnitude).
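To put that "two orders of magnitude" in perspective, a back-of-the-envelope calculation (the DRAM price per GB below is a made-up illustrative figure, not market data; only the ~100x ratio comes from the comment):

```python
# Rough cost comparison for 44 GB of on-chip memory.
# Assumed numbers -- illustrative only, not actual market prices.
DRAM_COST_PER_GB = 3.0   # hypothetical $/GB for commodity DRAM
SRAM_MULTIPLIER = 100    # "about two orders of magnitude"

capacity_gb = 44
dram_cost = capacity_gb * DRAM_COST_PER_GB
sram_cost = dram_cost * SRAM_MULTIPLIER
print(f"44 GB as DRAM: ${dram_cost:,.0f}")   # -> 44 GB as DRAM: $132
print(f"44 GB as SRAM: ${sram_cost:,.0f}")   # -> 44 GB as SRAM: $13,200
```

Even with a generously low DRAM baseline, the same capacity in SRAM lands four figures higher, which is the crux of the commenter's objection.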
It's easy to design a larger chip. What determines the price/performance ratio are things like:
- performance per chip area.
- yield per chip area.
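The yield point can be made concrete with the classic Poisson defect model, a rough sketch only; the defect density and die areas below are illustrative assumptions, not foundry figures:

```python
import math

def poisson_yield(area_cm2: float, defects_per_cm2: float) -> float:
    """Expected fraction of defect-free dies: Y = exp(-D * A)."""
    return math.exp(-defects_per_cm2 * area_cm2)

D = 0.1  # assumed defect density (defects/cm^2) -- illustrative only

# A reticle-limited ~8 cm^2 die vs. a wafer-scale ~460 cm^2 die
small = poisson_yield(8.0, D)
large = poisson_yield(460.0, D)
print(f"~8 cm^2 die yield:   {small:.1%}")
print(f"~460 cm^2 die yield: {large:.2e}")
```

The naive model predicts essentially zero perfect wafer-scale dies; Cerebras works around this by building in redundant cores and routing around defects, so the sketch overstates the penalty, but it shows why yield per chip area dominates the economics of going larger.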
I have never heard of any models trained on this hardware. How does a company IPO on the basis of having the "best tech" in this industry when all the top models are trained on other hardware?
It just doesn't add up.
From the article
>Cerebras had a net loss of $66.6 million in the first six months of 2024 on $136.4 million in sales, according to the filing.
That doesn't sound very good.
What makes them think they can compete with Nvidia, and why IPO right now?
Are they trying to get government money to make chip fabs like Intel or something?