Key Takeaways
- Cerebras (CBRS) shares dropped approximately 10% in their second trading session, after a 68% jump in Thursday’s IPO debut, which priced shares at $185 and closed at $311.07.
- The AI chipmaker secured $5.55 billion through its initial public offering, issuing 30 million shares priced at $185 apiece with an opening price of $350.
- Trading at almost 200x trailing price-to-sales based on $510 million in 2025 revenue, CBRS’s valuation exceeds Nvidia’s multiple by more than 7 times.
- The company’s $24.6 billion order backlog shows significant concentration risk, with $20 billion stemming from a single OpenAI cloud agreement that features exclusivity terms and a termination clause for performance delays.
- D.A. Davidson analyst Gil Luria estimates the company’s fair value at approximately $115 per share — roughly $25 billion — aligning closely with its backlog valuation.
Cerebras Systems (CBRS) shares tumbled roughly 10% on Friday, just 24 hours after one of the most remarkable AI IPO launches in recent history, as market participants began scrutinizing the company’s extreme valuation alongside concerns regarding its technological limitations and customer diversity.
Shares of CBRS were offered at $185 each on Thursday, opened trading at $350, and briefly touched $385 before a trading halt. The debut session closed at $311.07, a 68% gain. In Friday’s session, shares retreated to approximately $279.99.
The public offering raised $5.55 billion, making it one of the most substantial AI-sector IPOs in recent years. On a fully diluted basis, the company’s market capitalization exceeded $100 billion.
That represents an enormous valuation for an enterprise generating $510 million in 2025 revenue.
At present trading levels, CBRS commands a trailing price-to-sales multiple approaching 200x. For context, Nvidia trades at approximately 27x. Even when factoring in Cerebras’ anticipated growth rates, this valuation disparity raises eyebrows.
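The multiples above follow directly from the figures in the article; a quick back-of-the-envelope check (using the ~$100 billion diluted market cap, $510 million in revenue, and Nvidia’s approximate 27x multiple as stated, none independently verified) confirms the arithmetic:

```python
# Sanity-check of the valuation multiples cited in the article.
cbrs_market_cap = 100e9   # fully diluted market cap at debut (~$100B, per article)
cbrs_revenue = 510e6      # 2025 revenue (per article)
nvda_ps = 27              # Nvidia trailing price-to-sales (approx., per article)

cbrs_ps = cbrs_market_cap / cbrs_revenue
print(f"CBRS trailing P/S: {cbrs_ps:.0f}x")              # ~196x, "approaching 200x"
print(f"Multiple vs. Nvidia: {cbrs_ps / nvda_ps:.1f}x")  # ~7.3x, "more than 7 times"
```

The numbers line up: roughly 196x trailing sales, about 7.3 times Nvidia’s multiple.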
The Technology Debate
Cerebras manufactures exceptionally large AI chips. Its primary product, the Wafer-Scale Engine 3 (WSE-3), is 58 times larger than leading GPU alternatives, according to company specifications, and delivers inference performance up to 15 times faster than Nvidia-powered systems.
The performance benefits are demonstrable, but industry observers point to trade-offs.
Gil Luria from D.A. Davidson observed that the chip’s unconventional dimensions have thus far restricted its use to smaller, less sophisticated AI models. Additionally, the unique size presents production complexities related to manufacturing yield — the company hasn’t yet proven its ability to produce sufficient quantities of functional chips for large-scale implementations.
“While the system may provide superior speed for certain use cases, it also offers less versatility compared to existing AI compute deployments,” Luria explained.
CEO Andrew Feldman responded to these concerns, informing Barron’s that Cerebras is currently supporting larger models in private deployments and plans to demonstrate this capability publicly in the coming weeks.
The Customer Concentration Issue
Cerebras disclosed a $24.6 billion order backlog as of year-end 2024. Management anticipates converting approximately $3.7 billion of this backlog into recognized revenue during 2026 and 2027.
However, there’s a significant caveat. Roughly $20 billion of the total backlog — exceeding 80% — originates from a solitary cloud services agreement with OpenAI.
This arrangement contains exclusivity provisions that may restrict Cerebras’ capacity to engage with competing frontier AI laboratories. The contract also incorporates termination rights tied to delivery delays, potentially reducing the backlog substantially.
Luria assigned an approximate valuation of $25 billion to the enterprise — translating to roughly $115 per share — derived from the backlog assessment.
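The concentration and fair-value figures can be checked the same way. A minimal sketch, using only the numbers reported above (the ~217 million implied share count is derived from the $25 billion / $115-per-share estimate, not a disclosed figure):

```python
# Backlog concentration and implied share count from the article's figures.
backlog_total = 24.6e9    # total order backlog (per article)
openai_portion = 20e9     # OpenAI cloud agreement (per article)
fair_value = 25e9         # Luria's estimated enterprise value
price_per_share = 115     # Luria's estimated fair value per share

concentration = openai_portion / backlog_total
implied_shares = fair_value / price_per_share
print(f"OpenAI share of backlog: {concentration:.0%}")          # ~81%, "exceeding 80%"
print(f"Implied diluted shares: {implied_shares / 1e6:.0f}M")   # ~217M (derived, not disclosed)
```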
The company does maintain relationships with prominent AI industry players, including OpenAI, Amazon Web Services, Meta, and IBM.
Revenue expanded 96% year-over-year to $171 million in the latest reporting period. By comparison, Nvidia’s data center segment generated $62.13 billion in its most recent quarter, representing 75% growth.