Most AI companies run their models on Nvidia GPUs. Cerebras took a fundamentally different path — they built a chip the size of a dinner plate. Not a piece cut from a wafer. The entire wafer. And now they’re filing to go public on Nasdaq under the ticker CBRS.
Here’s everything worth knowing.
The Origin Story
Cerebras was founded in 2016 by five people who had all worked together before. Andrew Feldman (CEO), Gary Lauterbach (CTO, now retired), Michael James (Chief Software Architect), Sean Lie (Chief Hardware Architect and current CTO), and Jean-Philippe Fricker (Chief System Architect) — all former colleagues from SeaMicro, the server startup Feldman and Lauterbach built and sold to AMD in 2012 for around $334 million.
This wasn’t a team that needed to prove something small. When they got together, they wrote on a whiteboard that they wanted to do something important enough to land in the Computer History Museum. “We weren’t doing this to make money,” Feldman later said. They wanted to move an industry.
The idea: build a completely new class of chip purpose-built for AI — one that eliminated the fundamental bottlenecks of GPU-based computing. Many people told them it couldn’t be done. They did it anyway.
Andrew Feldman is a Stanford MBA, serial entrepreneur, and one of the more interesting founders in Silicon Valley. Before SeaMicro he held VP roles at Force10 Networks (sold to Dell for ~$800M) and Riverstone Networks (IPO’d on Nasdaq). His first company, a Gigabit Ethernet startup, was sold for $280 million while he was still finishing his MBA. Elon Musk reportedly tried to buy Cerebras in 2018. They declined.
The Funding Journey
Series A — May 2016: $27 million. Led by Benchmark, Foundation Capital, and Eclipse Ventures. Valuation: ~$67 million. Nobody outside the team knew what they were building yet.
Series B — December 2016: Led by Coatue Management. The company was still in stealth.
Series C — January 2017: Led by VY Capital.
Series D — November 2018: $88 million. Investors included Altimeter, VY Capital, and Coatue. This round pushed Cerebras into unicorn status at a ~$1.7 billion valuation. Still hadn’t shown the product publicly.
Series E — Late 2019: $272 million. This is when they finally unveiled what they’d been building for four years — the Wafer Scale Engine (WSE), the largest chip ever made.
Series F — November 2021: $250 million. Led by Alpha Wave Ventures and Abu Dhabi Growth Fund. Valuation exceeded $4 billion. Total raised to date: ~$720 million.
Series G — September 2025: $1.1 billion. Led by Fidelity Management & Research and Atreides Management. Tiger Global, Valor Equity Partners, 1789 Capital, Altimeter, Alpha Wave, and Benchmark also participated. Valuation: $8.1 billion.
Series H — February 2026: $1 billion. Led by Tiger Global at a $23 billion valuation — nearly tripling in five months. Benchmark raised a dedicated $225 million SPV to increase its position. AMD, notably a competitor, invested too. Total raised across all rounds: approximately $2.8 billion.
What Cerebras Actually Builds
A standard Nvidia GPU die is roughly the size of a postage stamp. The Cerebras Wafer Scale Engine is the entire silicon wafer — about 46,000 square millimeters. The WSE-3, their current-generation chip, contains 4 trillion transistors and 900,000 AI-optimized cores.
The architectural difference matters. Nvidia’s systems work by connecting many small chips together through networking fabric — which creates latency, energy overhead, and coordination complexity. Cerebras eliminates all of that by doing everything on one massive chip. Lower latency. Predictable performance. Less power consumption per token generated.
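For a rough sense of that scale gap, here is a back-of-envelope comparison using the WSE-3 figures above. The conventional-GPU numbers (roughly 800 square millimeters of die area and about 80 billion transistors for an H100-class part) are ballpark assumptions of mine, not figures from the filing.

```python
# Back-of-envelope scale comparison: Cerebras WSE-3 vs. a conventional GPU die.
# The WSE-3 figures come from the article; the GPU-die figures are ballpark
# assumptions for an H100-class part, not numbers from the S-1.

wse3_area_mm2 = 46_000       # roughly the entire wafer, per the article
wse3_transistors = 4e12      # 4 trillion
wse3_cores = 900_000         # AI-optimized cores

gpu_area_mm2 = 800           # assumed die area for a flagship GPU (approximate)
gpu_transistors = 80e9       # assumed transistor count (approximate)

print(f"Area ratio:       {wse3_area_mm2 / gpu_area_mm2:.0f}x")        # ~58x
print(f"Transistor ratio: {wse3_transistors / gpu_transistors:.0f}x")  # ~50x
print(f"Cores per mm^2:   {wse3_cores / wse3_area_mm2:.1f}")           # ~19.6
```

Even with generous assumptions on the GPU side, the wafer comes out well over an order of magnitude larger in both area and transistor count, which is the whole point of the architecture.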
Cerebras claims the CS-3 system costs 32% less than Nvidia’s flagship Blackwell B200 GPU and delivers results 21x faster, accounting for both hardware capex and ongoing energy costs.
For years the company sold complete systems built around its chips directly to customers. More recently, it pivoted to running that hardware itself and selling access as a cloud service: customers pay for inference capacity rather than buying systems. That’s a more scalable, recurring-revenue model, and it’s why Amazon, Microsoft, Google, Oracle, and CoreWeave are now listed as competitors in its S-1. Cerebras isn’t just a chipmaker anymore. It’s a cloud compute provider.
Headcount and Operations
708 employees as of December 31, 2025. Offices in Sunnyvale (HQ), San Diego, Toronto, and Bangalore. The company does not own its data centers — it leases infrastructure and runs its chips inside those facilities on behalf of clients.
The OpenAI Deal
This is the centerpiece of the IPO story and the deal that changed the entire narrative.
In January 2026, Cerebras announced it would provide up to 750 megawatts of computing power to OpenAI through 2028 — 250 megawatts per year. The deal is valued at over $20 billion. OpenAI also has an option to purchase an additional 1.25 gigawatts through 2030.
To fund the infrastructure needed, OpenAI loaned Cerebras $1 billion at 6% annual interest. The loan can be repaid in cash, products, or services. OpenAI also received warrants to purchase up to 33.4 million shares of Cerebras stock — but those warrants only vest in full if OpenAI actually buys 2 gigawatts of compute.
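A quick sketch of the deal’s arithmetic, using only the figures above. It assumes a three-year ramp (implied by 750 MW at 250 MW per year) and plain annual interest on the loan; the actual terms in the filing may differ.

```python
# Rough arithmetic on the OpenAI deal terms described above.
# Assumes a three-year ramp (implied by 750 MW at 250 MW per year) and simple
# annual interest on the loan; the actual repayment terms may differ.

base_commitment_mw = 750                      # committed through 2028
ramp_years = 3                                # implied: 250 MW per year
option_gw = 1.25                              # additional option through 2030

print(f"Base ramp:        {base_commitment_mw / ramp_years:.0f} MW per year")
print(f"Full-vest target: {base_commitment_mw / 1000 + option_gw:.2f} GW")   # 2.00 GW

loan_usd = 1_000_000_000
annual_rate = 0.06
print(f"Annual interest:  ${loan_usd * annual_rate / 1e6:.0f}M")             # ~$60M
```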
Interestingly, OpenAI CEO Sam Altman was an early personal investor in Cerebras, and the two companies had been talking since 2017. The deal came together after Cerebras demonstrated its hardware’s efficiency at production scale. For OpenAI, it reduces Nvidia dependency. For Cerebras, it provides multi-year revenue visibility and shifts the customer concentration story away from its previous problem: the UAE.
In March 2026, Cerebras also signed a deal with Amazon that enables cloud services on top of Cerebras chips and allows Amazon to buy about $270 million in Cerebras stock.
The Competition
Cerebras is no longer fighting a startup battle. The competitors listed in its S-1 filing are Amazon, Microsoft, Alphabet, Oracle, and CoreWeave — every major cloud provider running AI workloads at scale.
The dominant force remains Nvidia. Its CUDA software platform has a decade-long head start and an enormous developer ecosystem.
Cerebras’s software tools are years behind CUDA, and that matters: chips without software adoption don’t win markets. AMD has made inroads in AI infrastructure as well. Groq is a direct competitor in inference, also built around speed.
The honest assessment: Cerebras wins on hardware performance benchmarks. Nvidia wins on ecosystem and developer inertia. The question is whether performance advantage is durable enough to pull enterprise customers through the switching cost.
The Moat
Cerebras’s moat is architectural and manufacturing-based. You cannot replicate a wafer-scale chip quickly. The yield challenges (normally a single defect anywhere on the wafer would kill the whole chip; Cerebras designs in redundant cores and routes around defects), the thermal management required, and the systems engineering involved took nearly a decade to get right. Eclipse, their very first investor, was told by many people that wafer-scale computing simply couldn’t be done.
The WSE-3’s on-chip memory bandwidth and ultra-low inter-core latency give it a structural advantage for inference workloads — specifically where response speed to end users matters. That is the primary AI product battleground right now.
The moat’s limits: hardware advantages erode if Nvidia closes the performance gap with future GPU generations, and software ecosystem depth remains Cerebras’s structural weakness. A hardware moat without software lock-in is a moat with a known vulnerability.
The Financials
2025 revenue: $510 million — up 76% from 2024’s $290 million. Net income in 2025: $87.9 million. That’s a dramatic swing from a $485 million net loss in 2024.
Remaining performance obligations as of December 31, 2025: $24.6 billion in contracted future revenue. The company expects to recognize 15% of that across 2026 and 2027.
In 2025, 62% of revenue came from one customer: Mohamed bin Zayed University of Artificial Intelligence, a public institution in the UAE. G42 accounted for 24% — down from 87% of revenue in H1 2024. Progress on concentration, but it’s still concentrated.
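These figures are easy to sanity-check. A short sketch using only the numbers quoted in this article (the revenue multiple borrows the $23 billion Series H valuation from the funding section):

```python
# Sanity-checking the financial figures quoted in this section.
# All inputs are the article's own numbers; the revenue multiple uses the
# $23 billion Series H valuation from the funding section.

rev_2025, rev_2024 = 510e6, 290e6
print(f"Revenue growth:    {(rev_2025 / rev_2024 - 1) * 100:.0f}%")          # ~76%

rpo = 24.6e9                    # remaining performance obligations
near_term_share = 0.15          # expected to be recognized across 2026-2027
print(f"Near-term RPO:     ${rpo * near_term_share / 1e9:.1f}B")             # ~$3.7B

mbzuai_share, g42_share = 0.62, 0.24
print(f"Top two customers: {(mbzuai_share + g42_share) * 100:.0f}% of 2025 revenue")  # 86%

valuation = 23e9
print(f"Revenue multiple:  {valuation / rev_2025:.0f}x")                     # ~45x
```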
Will It Be a Good IPO?
The bull case is genuine. The technology works. Revenue is growing at 76% year-over-year. The company turned profitable in 2025. The OpenAI deal provides $20+ billion in multi-year revenue visibility. AMD investing in a competitor is a credibility signal that’s hard to fake. The $24.6 billion in remaining performance obligations gives public market investors something to underwrite.
The bear case is also real. Customer concentration is still the central issue — G42 plus one UAE university accounted for 86% of 2025 revenue. The OpenAI deal doesn’t eliminate concentration risk, it just shifts it. If OpenAI decides to ramp down or renegotiate, Cerebras is in trouble. The software ecosystem gap vs. Nvidia is real and won’t close fast. The company doesn’t own its data centers, which creates infrastructure dependency risk.
The valuation math is aggressive. At $23 billion on $510 million in revenue, you’re paying roughly 45x revenue. Even by AI standards, that assumes perfect execution on the OpenAI ramp, continued enterprise customer wins, and no Nvidia counter-move that erodes the performance gap.
The CFIUS/national security overhang from G42 has been largely resolved — CFIUS cleared the review, G42 is being removed from the investor list in the new filing, and the regulatory path to listing is open.
For retail investors, the honest take: this is a high-conviction, high-risk bet on a genuinely novel technology with real customers, real revenue, and real competition from the biggest companies in the world. The technology story is compelling. The customer concentration story is not fully resolved. The valuation leaves limited margin for error.