Cerebras
The wafer-scale chip company pricing its IPO at $23–35B after an OpenAI $20B+ deal.
cerebras.net ↗

Cerebras is the only credible commercial alternative to NVIDIA at the scale where buyers (like OpenAI) are willing to write 10-figure checks for inference capacity. The WSE-3 chip architecture is genuinely different — not a small GPU, but a wafer-sized single die — and it wins specifically on inference latency and per-query throughput, which matters most in the agentic-AI workloads driving compute demand in 2026. The IPO, whether it prices at $23B or $35B, will be the first time public-market investors get clean exposure to non-NVIDIA AI silicon at scale. That's a category the market doesn't currently have.
What's going for them
- Cerebras WSE-3 — 4 trillion transistors, 900,000 cores, 58× larger than NVIDIA's B200 — is the most architecturally distinct AI accelerator shipping at commercial scale. Claimed 21× performance over NVIDIA DGX B200 at 1/3 the cost and power on inference workloads.
- OpenAI's $20B+ Master Relationship Agreement — 750 MW of inference capacity expandable to 2 GW — is the single largest chip-procurement commitment any AI company has made outside of NVIDIA, and it's going to Cerebras specifically because of latency-and-throughput advantages that GPUs can't match.
- Profitable at scale: $510M revenue in 2025 with $87.9M net income, 76% YoY growth. Most AI infrastructure companies are operating at heavy losses; Cerebras has built a genuinely profitable hardware business.
- S-1 filed April 17, 2026 for a Nasdaq IPO targeting mid-May completion, raising ~$3B at a $35B valuation (alternate reporting: $23B on a $2B raise with Morgan Stanley as lead). This is the second IPO attempt after the scrapped 2024 effort; this one is structured to close.
- The G42 (Abu Dhabi sovereign-AI entity) strategic partnership makes Cerebras a core component of Middle East AI infrastructure — the non-NVIDIA alternative for sovereign compute buildouts that aren't constrained by US export policy in the same way.
What they built
Cerebras builds and operates wafer-scale AI compute systems. The flagship product — the WSE-3 (Wafer Scale Engine, generation 3) — is an AI accelerator that takes up an entire silicon wafer: 4 trillion transistors, 900,000 cores, and 2,625× the memory bandwidth of NVIDIA’s B200. The business runs in two modes: selling CS-3 systems (the wafer built into a usable appliance) and operating Cerebras Cloud (managed inference on Cerebras hardware, including for OpenAI’s ChatGPT-scale workloads). Customers include frontier AI labs, national-lab supercomputing programs, pharma companies running biological simulation, and sovereign customers in the Middle East.
How they got here
Andrew Feldman and four co-founders launched Cerebras in 2016 with a single architectural bet: the industry’s assumption that AI chips had to fit standard manufacturing yields (i.e., small dies, grouped into boards) was wrong, and a single-wafer design would unlock dramatically better performance for AI workloads. The first decade was a slow accumulation of proof points — WSE-1 in 2019, WSE-2 in 2021, WSE-3 in 2024, and major customer wins with Mayo Clinic, GlaxoSmithKline, and the Department of Energy. The 2024 IPO attempt was withdrawn amid market conditions and an unresolved CFIUS review of G42’s investment stake.
2025 was the inflection year. G42 (Abu Dhabi) committed to becoming a major Cerebras customer. Revenue hit $510M (+76% YoY) with $87.9M net income. The OpenAI deal — a $20B+ Master Relationship Agreement for 750 MW of inference capacity, expandable to 2 GW — was the defining commercial validation, establishing Cerebras as OpenAI’s primary non-NVIDIA infrastructure partner. The S-1 filing in April 2026 targeting a $23–35B valuation reopens the IPO path with dramatically stronger underlying metrics.
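As a quick sanity check on the figures above, the implied 2024 revenue and the 2025 net margin follow directly from the stated numbers. This is a back-of-envelope sketch: the 2024 revenue is inferred from the growth rate, not a reported figure.

```python
# Back-of-envelope check on Cerebras's stated 2025 financials.
# Stated in the profile: $510M revenue (+76% YoY), $87.9M net income.
revenue_2025_m = 510.0
growth_yoy = 0.76
net_income_m = 87.9

# Implied 2024 revenue (inferred from the growth rate, not reported).
implied_revenue_2024_m = revenue_2025_m / (1 + growth_yoy)

# 2025 net margin on the stated figures.
net_margin = net_income_m / revenue_2025_m

print(f"implied 2024 revenue: ~${implied_revenue_2024_m:.0f}M")  # ~$290M
print(f"2025 net margin: {net_margin:.1%}")  # ~17.2%
```

In other words, the stated growth rate implies roughly $290M of 2024 revenue, and the stated profit implies a net margin of about 17% — unusually healthy for an AI hardware business.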
What’s ahead
Three things define Cerebras’s trajectory through 2026 and beyond. First, the IPO: pricing above the $23B floor requires investors to value Cerebras as a structural NVIDIA alternative, not just a specialty chip company. The OpenAI deal and profitable 2025 financials support that framing. Second, capacity buildout: delivering on the 750 MW → 2 GW OpenAI commitment requires massive manufacturing and data-center investment, and execution risk on a hardware ramp of this scale is material. Third, product cadence: WSE-4 and beyond need to maintain the architectural lead over NVIDIA’s Blackwell, Rubin, and subsequent generations — a continuous R&D race where NVIDIA has more capital.
Why it matters
Cerebras is the one commercially meaningful non-NVIDIA AI accelerator company, and its IPO will be the first real public-market pricing of that thesis. For AI hardware founders, Cerebras is proof that patient architectural innovation can break into a market everyone assumed was locked up. For investors, Cerebras’s IPO is one of the two or three most important tech listings of 2026 — a rare chance to buy AI infrastructure exposure without taking NVIDIA-sized single-company concentration risk.
Founder interview coming soon.
We'll be sitting down with the founders and operators of the companies we profile — on fundraising, product decisions, and what they're building next. If you're part of the Cerebras team and want to share a perspective, get in touch.
Thinking about fundraising or M&A?
Amafi Advisory works with AI companies on strategy, fundraising, M&A, and technical advisory. Even if you're just exploring.