
According to a report by The Motley Fool on April 27, AI chipmaker Cerebras Systems’ S-1 filing shows that the company’s remaining performance obligations (RPO) total $24.6 billion, with full-year 2025 revenue of $510 million. The report states that the company’s targeted IPO valuation is about $35 billion, implying a price-to-sales ratio of roughly 70x.
According to the S-1 filing, Cerebras’ Wafer-Scale Engine (WSE) architecture uses an entire silicon wafer as a single chip. The chip is about 30 times the size of Nvidia’s Blackwell B200 package and carries 19 times as many transistors. Per the company’s technical statements, its wafer-scale chips deliver inference speeds 15 times faster than existing leading GPU solutions.
According to Cerebras’ S-1 filing, OpenAI has signed a $20 billion agreement with Cerebras, under which it will purchase 750 megawatts (MW) of AI inference capacity between 2026 and 2028, with an option to add another 1.25 gigawatts (GW) of capacity by the end of 2030. In addition, Amazon has signed an agreement to bring Cerebras’ CS-3 systems into Amazon Web Services (AWS), providing inference capabilities integrated with Amazon’s Trainium3 chips.
With these two transactions included, Cerebras’ total remaining performance obligations stand at $24.6 billion, against full-year 2025 revenue of $510 million.
According to The Motley Fool’s report, Cerebras’ S-1 filing does not disclose an official valuation; the reported target of about $35 billion implies a price-to-sales ratio of roughly 70x. For comparison, Nvidia trades at a price-to-sales ratio of 23x over the same period and at a price-to-earnings multiple of 13x on expected 2026 profits.
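As a quick sanity check, the roughly 70x multiple follows directly from the two figures cited in the report (target valuation and 2025 revenue); the snippet below is just that arithmetic, not anything from the filing itself:

```python
# Back-of-the-envelope check of the implied price-to-sales multiple,
# using the figures reported by The Motley Fool.
target_valuation = 35_000_000_000  # reported target IPO valuation, USD
revenue_2025 = 510_000_000         # reported full-year 2025 revenue, USD

price_to_sales = target_valuation / revenue_2025
print(f"Implied P/S: {price_to_sales:.1f}x")  # about 68.6x, i.e. roughly 70x
```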
Per the S-1 filing, Cerebras’ wafer-scale chips are manufactured exclusively by Taiwan Semiconductor Manufacturing Company (TSMC), currently on the 5-nanometer process, with TSMC upgrading the relevant process to 3 nanometers. The filing confirms that no other foundry can currently produce Cerebras’ wafer-scale chips. Because yields for wafer-scale chips are lower than for smaller chips, scaling up production presents both cost and technical challenges.