Cerebras Systems, the AI chipmaker behind the world’s largest single-die semiconductor processor, filed publicly for a U.S. initial public offering on Friday, April 17, targeting a Nasdaq listing under the ticker CBRS in May 2026. The filing arrived exactly one day after OpenAI committed more than $20 billion over three years for servers powered by Cerebras chips.
The Filing
According to CNBC, Cerebras reported $87.9 million in net income on $510 million in revenue during 2025, a dramatic turnaround from a $485 million net loss the prior year. Revenue grew nearly 76% year-over-year. The company disclosed $24.6 billion in remaining performance obligations as of December 31, 2025, with 15% expected to be recognized across 2026 and 2027, per the S-1 filing on SEC EDGAR.
Morgan Stanley, Citigroup, Barclays, and UBS are among the top underwriters. Techi reports that shares recently traded at $102-$107 on the secondary markets Forge and Hiive, implying a $26-$28 billion private valuation ahead of the public debut.
This is Cerebras’ second attempt. The company filed in September 2024 but withdrew the paperwork in October 2025 to add information on financial performance and strategy.
The OpenAI Contract
The timing is strategic. On April 16, one day before the IPO filing, OpenAI’s expanded relationship with Cerebras became public: a deal worth more than $20 billion covering up to 750 megawatts of AI computing power through 2028, according to CNBC. The contract calls for Cerebras to make 250 megawatts available each year between 2026 and 2028, with an option for OpenAI to purchase an additional 1.25 gigawatts through 2030.
The deal includes warrants: in December, Cerebras issued OpenAI warrants to purchase up to 33.4 million shares of non-voting Class N stock, vesting in full only if OpenAI buys 2 gigawatts of computing power. In January, Cerebras received a $1 billion loan from OpenAI at 6% annual interest to build data center infrastructure.
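The reported figures are internally consistent, and a quick back-of-the-envelope check makes the connection between the contract and the warrant terms explicit (the numbers are from the article; the variable names are ours):

```python
# Sanity-check the reported OpenAI contract figures.
mw_per_year = 250            # megawatts Cerebras must make available annually
years = 3                    # 2026 through 2028
base_mw = mw_per_year * years
print(base_mw)               # 750 MW, the reported contract size

option_mw = 1250             # optional additional 1.25 gigawatts through 2030
total_mw = base_mw + option_mw
print(total_mw)              # 2,000 MW, i.e. the 2-gigawatt threshold at which
                             # the Class N warrants vest in full
```

In other words, the 2-gigawatt vesting condition on the warrants corresponds to OpenAI exercising its full optional capacity on top of the base contract.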
Cerebras acknowledged the concentration risk in the filing: the OpenAI alliance “represents a substantial portion of our projected revenues over the next several years,” and OpenAI retains the right to terminate part or all of the agreement if Cerebras fails to deliver computing power on time.
Customer Concentration Shifting
The revenue concentration story is evolving. When Cerebras first filed in 2024, one customer, the Microsoft-backed G42, based in the United Arab Emirates, contributed 87% of revenue for the first half of that year. In 2025, G42 dropped to 24% of revenue, but Mohamed bin Zayed University of Artificial Intelligence, also UAE-based, provided 62%, according to the filing.
The company has also signed a deal with Amazon enabling cloud services on Cerebras chips, with Amazon purchasing approximately $270 million in Class N stock. Cerebras now counts Amazon, Microsoft, Alphabet, Oracle, and CoreWeave among its competitors as it operates chips inside its own data centers as a cloud service.
The Chip
Cerebras builds the Wafer Scale Engine 3 (WSE-3), a processor that spans nearly an entire 300mm silicon wafer at 46,255 square millimeters. It contains 4 trillion transistors across 900,000 AI-optimized cores, according to Techi. The company claims it can train models of up to 24 trillion parameters without the complex parallelization software that GPU clusters require.
The IPO Landscape
The New York Times and Reuters noted the filing arrives as SpaceX, Anthropic, and OpenAI prepare for their own listings, part of a broader wave of AI company public offerings in 2026. Cerebras was founded in 2016 by Andrew Feldman and four other veterans of SeaMicro, a server startup AMD acquired for $355 million in 2012. The company has 708 employees and is based in Sunnyvale, California. Its investor list includes Alpha Wave, Benchmark, Eclipse, Fidelity, Foundation Capital, and OpenAI CEO Sam Altman.
The Compute Supply Chain Question
For teams building and deploying AI agents, Cerebras going public introduces a second publicly traded option for dedicated AI compute infrastructure beyond Nvidia. The OpenAI contract demonstrates that at least one frontier lab is actively hedging its Nvidia dependence. Whether Cerebras can deliver competitive cost-per-token at production scale, and whether its software ecosystem can close the gap with Nvidia’s CUDA platform, will determine how much the agent infrastructure supply chain actually diversifies.