Cerebras Files for $23B IPO Backed by a $20B OpenAI Deal


The biggest AI chip IPO in years just became real. On April 17, 2026, Cerebras Systems filed its S-1 registration statement with the SEC, targeting a mid-May debut on the Nasdaq Global Select Market under the ticker CBRS. The company is seeking a valuation of $22 billion to $25 billion – and it arrives with something most IPO candidates can only dream of: a $20 billion customer contract, signed and waiting.

That customer is OpenAI. The ChatGPT maker agreed to purchase more than $20 billion worth of AI compute from Cerebras – specifically, 750 megawatts of inference capacity through 2028, with options to add a further 1.25 gigawatts by 2030. OpenAI was so committed to the deal that it also lent Cerebras $1 billion at 6% interest to help fund the data centers needed to deliver on the contract. Amazon Web Services added a separate $1 billion partnership on top of that.

This is not hype. The numbers behind Cerebras are real and they are moving fast.

The Numbers That Matter

Metric                               2024      2025           Change
Revenue                              ~$290M    $510M          +76% YoY
Net Income / (Loss)                  -$485M    +$87.9M        Swing to profit
Remaining Performance Obligations    —         $24.6B         Backlog locked in
Target IPO Valuation                 —         $22B – $25B    ~46x 2025 revenue

Revenue jumped 76% year over year to $510 million. More strikingly, the company swung from a $485 million net loss in 2024 to an $87.9 million net profit in 2025. And its remaining performance obligations – the value of contracts already signed but not yet recognized as revenue – stand at $24.6 billion. In plain English, Cerebras has more revenue already contracted than it has earned in its entire history.
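Those headline figures are easy to sanity-check. A quick back-of-envelope pass, using only the numbers from the table above:

```python
# Back-of-envelope check on the headline figures cited above.
# All dollar amounts are in millions, taken from the article's summary table.
revenue_2024 = 290
revenue_2025 = 510
net_income_2025 = 87.9
backlog = 24_600  # remaining performance obligations ($24.6B)

growth = (revenue_2025 - revenue_2024) / revenue_2024
print(f"YoY revenue growth: {growth:.0%}")  # ~76%

# Contracted-but-unrecognized revenue is roughly 48x what the
# company recognized in all of 2025.
print(f"Backlog / 2025 revenue: {backlog / revenue_2025:.0f}x")
```

The backlog-to-revenue ratio is the striking one: signed contracts are worth about 48 years of 2025-level revenue, which is why the recognition schedule (and OpenAI's continued commitment) matters more than any single quarter.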


What Makes Cerebras Different from NVIDIA

Cerebras builds AI chips using a fundamentally different architecture called wafer-scale engine (WSE) technology. Instead of cutting a silicon wafer into dozens of small chips, Cerebras uses the entire wafer as a single chip. The result is a processor that is nearly 30 times the physical size of NVIDIA’s Blackwell B200 and packs in 19 times as many transistors per chip. Compare it to the older H100 and the size difference balloons to roughly 57 times.
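The ratios quoted above can be roughly reproduced from published spec-sheet figures. Note that the die areas and transistor counts below come from public vendor specifications, not from this article or the S-1, so treat them as approximate assumptions:

```python
# Approximate published figures (assumptions, not from the S-1).
wse3_area_mm2 = 46_225      # Cerebras WSE-3: essentially a full 300mm wafer
h100_area_mm2 = 814         # NVIDIA H100 die
b200_area_mm2 = 2 * 800     # Blackwell B200: two ~800 mm^2 dies (approx.)

wse3_transistors = 4.0e12   # ~4 trillion
b200_transistors = 208e9    # ~208 billion

print(f"WSE-3 vs B200 area: ~{wse3_area_mm2 / b200_area_mm2:.0f}x")        # ~29x
print(f"WSE-3 vs H100 area: ~{wse3_area_mm2 / h100_area_mm2:.0f}x")        # ~57x
print(f"WSE-3 vs B200 transistors: ~{wse3_transistors / b200_transistors:.0f}x")  # ~19x
```

Under these assumed figures, the "nearly 30 times," "19 times," and "roughly 57 times" claims all line up.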

The practical advantage is speed and efficiency. Wafer-scale chips move data internally without needing the networking hardware that large GPU clusters require. For AI inference workloads – running a finished model rather than training a new one – Cerebras claims this translates to dramatically faster response times and lower power draw per request.

That inference advantage is exactly what OpenAI needs. As ChatGPT handles hundreds of millions of queries per day, shaving milliseconds off each response and cutting power costs per token directly impacts OpenAI’s margins.


The Risks Nobody Should Ignore

The Cerebras story has real weaknesses, and investors should look at them clearly before the IPO hype machine runs full speed.

Customer concentration risk is extreme. OpenAI is not just a big customer – it is effectively the entire business. The $20 billion OpenAI deal and the $24.6 billion in remaining obligations are so dominated by one counterparty that if OpenAI changes strategy, delays deployments, or develops competing in-house chips (which it is reportedly pursuing), Cerebras revenue could crater overnight. No amount of wafer-scale technology changes what a single-customer dependency looks like on a balance sheet.

NVIDIA is not sleeping. The Vera Rubin platform – NVIDIA’s next generation after Blackwell – is already in advanced development and targets many of the same inference workloads where Cerebras competes. NVIDIA has a manufacturing ecosystem, software stack (CUDA), and customer relationships that Cerebras cannot match. Being technically impressive is not the same as being competitively safe.

The valuation asks a lot. At $22 billion to $25 billion, Cerebras is pricing at roughly 46 times its 2025 revenue. The business generated $87.9 million in net income – meaning the valuation represents roughly 280 times earnings. That is a rich multiple even in a hot AI market, and it assumes the $24.6 billion backlog converts to revenue without meaningful slippage.
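The multiple arithmetic in that paragraph follows directly from the figures already cited; a quick sketch:

```python
# Implied price-to-sales and price-to-earnings for the target range.
# Dollar amounts in millions, from the figures cited in the article.
valuation_low, valuation_high = 22_000, 25_000
revenue_2025 = 510
net_income_2025 = 87.9

ps_low = valuation_low / revenue_2025        # ~43x
ps_mid = (valuation_low + valuation_high) / 2 / revenue_2025  # ~46x
ps_high = valuation_high / revenue_2025      # ~49x
pe_high = valuation_high / net_income_2025   # ~284x

print(f"P/S range: {ps_low:.0f}x – {ps_high:.0f}x (midpoint ~{ps_mid:.0f}x)")
print(f"P/E at top of range: {pe_high:.0f}x")
```

The "roughly 280 times earnings" figure sits near the top of the range; at the low end it is closer to 250x. Either way, the price assumes near-flawless backlog conversion.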

Manufacturing is not simple to scale. Wafer-scale chips are harder to produce than standard chips because any defect anywhere on the wafer is more likely to affect the entire chip. Yield rates and supply chain constraints at that scale remain a material risk that the S-1 acknowledges but cannot fully quantify.
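A standard Poisson yield model illustrates why wafer-scale manufacturing is hard. The defect density and die areas below are illustrative assumptions (none of these figures appear in the S-1); the point is the shape of the math, not the exact values:

```python
import math

# Illustrative Poisson yield model: Y = exp(-A * D), where A is die area
# and D is defect density. All inputs are assumed values for illustration.
defect_density = 0.1     # defects per cm^2 (assumed)
gpu_die_area = 8.0       # cm^2, roughly an H100-class die (assumed)
wafer_die_area = 462.0   # cm^2, roughly a full-wafer engine (assumed)

gpu_yield = math.exp(-gpu_die_area * defect_density)
wafer_yield = math.exp(-wafer_die_area * defect_density)

print(f"Conventional die yield: {gpu_yield:.0%}")    # ~45%
print(f"Monolithic wafer yield: {wafer_yield:.2e}")  # effectively zero
```

Without mitigation, a monolithic wafer-sized chip would essentially never yield, which is why Cerebras publicly describes building in redundant cores and routing around defects. The residual question the S-1 cannot fully answer is what that redundancy costs in usable silicon at volume.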

How It Compares to the Arm IPO

                              Arm (2023)         Cerebras (2026 target)
IPO Valuation                 ~$54B              $22B – $25B
Revenue at IPO                ~$2.7B (FY2023)    $510M (2025)
Revenue Multiple              ~20x               ~46x
Customer Concentration        Diversified        Heavy (OpenAI)
Post-IPO Performance (1yr)    +100%+             Unknown

Arm’s 2023 IPO debuted at $54 billion and was considered richly priced at the time. It went on to more than double over the following year. Cerebras is pricing at a fraction of Arm’s scale but at a higher revenue multiple, with far more customer concentration. The comparison is not a blueprint – it is a reminder that AI hardware plays can move unpredictably in both directions after listing.

What to Do With This Information

If you are a retail investor watching the CBRS IPO: the business fundamentals are genuinely strong. The swing from a nearly half-billion-dollar loss to profitability in a single year is not an accounting trick – it reflects a real contract converting to real revenue. The $24.6 billion backlog is locked in on paper.

But the risks are equally real. Buying at the IPO price on day one means paying a premium before the lock-up period expires and before it becomes clear whether OpenAI continues to scale its Cerebras deployment or starts routing workloads elsewhere. Historically, waiting 90 to 180 days after a high-profile IPO has resulted in better entry prices more often than not – the Arm IPO was a notable exception, not the rule.

If you work in AI or enterprise tech: the Cerebras architecture is worth understanding regardless of the stock. If wafer-scale inference proves out at the scale OpenAI is deploying, it represents a real alternative to stacking GPU clusters for certain workloads – one that could eventually reshape how companies budget for AI inference costs.

BetOnAI Verdict

Story: Real. Risks: Also Real. Timing: Watch carefully.

Cerebras is not a paper tiger. The $20 billion OpenAI deal, the 76% revenue growth, and the profitability turnaround are legitimate data points. The wafer-scale chip technology solves a real problem in AI inference. This is the kind of company that could look obvious in hindsight – or could be a cautionary tale about customer concentration and NVIDIA’s competitive moat.

At a $22-25 billion valuation with $510 million in 2025 revenue and one customer representing the vast majority of that backlog, the price already reflects optimism. BetOnAI rates the technology as credible, the business model as fragile until customer diversification improves, and the IPO valuation as rich. The mid-May listing will be worth watching closely – but buying on day one requires a high tolerance for headline risk if OpenAI makes any changes to its compute strategy.

Watch for the final S-1 pricing and lock-up expiry dates before making a decision. The business could earn a better entry point after the initial hype settles.


