AI hardware startup Cerebras Systems has filed for an initial public offering (IPO), marking a significant step for a company that has positioned itself as a challenger to Nvidia in the high-stakes market for artificial intelligence chips. The filing comes on the heels of major commercial agreements with Amazon Web Services (AWS) and OpenAI, the latter reportedly worth over $10 billion.

According to the filing, Cerebras generated $510 million in revenue in 2025. The company reported net income of $237.8 million, a figure boosted by certain one-time items; on a non-GAAP basis, which excludes those items, it recorded a net loss of $75.7 million. A company spokesperson indicated the IPO is planned for mid-May, though the amount the company hopes to raise has not been disclosed.

From Delayed Offering to $23 Billion Valuation

This is not Cerebras's first attempt to go public. The company, led by CEO Andrew Feldman, initially filed for an IPO in 2024. That process was delayed due to a federal review of an investment from Abu Dhabi-based technology conglomerate G42 and was ultimately withdrawn. The path cleared for the current filing after Cerebras secured substantial private funding, including a $1.1 billion Series G round last year and a $1 billion Series H in February 2026, which valued the company at $23 billion.

In a recent interview with the Wall Street Journal, Feldman framed the company's success in competitive terms. "Obviously, [Nvidia] didn’t want to lose the fast inference business at OpenAI, and we took that from them," he stated, referring to the lucrative deal with the AI research lab.

Strategic Partnerships Underpin Growth

The IPO filing is underpinned by two landmark commercial agreements announced in recent months. The first is a deal with Amazon Web Services to integrate Cerebras's specialised chips into Amazon's data centres, significantly expanding the startup's potential market reach. The second, and more headline-grabbing, is the multi-year agreement with OpenAI to supply chips for AI model training and inference.

Box: What are AI Training and Inference Chips?
AI chips are specialised processors designed to handle the immense computational workloads required for artificial intelligence. Training involves teaching an AI model by processing vast datasets, a process that can take weeks or months on thousands of chips. Inference is the subsequent phase where the trained model makes predictions or generates content in response to user queries, requiring fast, efficient processing.
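The two phases can be illustrated with a toy model. The sketch below, which assumes nothing about Cerebras's actual software or hardware, "trains" a one-parameter linear model by gradient descent and then runs "inference" with the learned weight:

```python
# Toy illustration of training vs inference (illustrative only; real AI
# training runs across thousands of specialised chips for weeks or months).

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # data generated by the true relationship y = 2x

# Training phase: repeatedly adjust the weight w to reduce squared error
# on the dataset. This is the slow, compute-heavy part.
w = 0.0
lr = 0.01
for _ in range(1000):
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad

# Inference phase: the trained model answers new queries with a single
# cheap computation. This is the fast, latency-sensitive part.
def predict(x):
    return w * x

print(round(w, 3))             # learned weight, close to 2.0
print(round(predict(5.0), 3)) # prediction for an unseen input, close to 10.0
```

The asymmetry the sketch shows at miniature scale, expensive iterative training followed by cheap per-query inference, is why the two workloads are often served by different chips or deployments.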

Context and Competitive Landscape

Cerebras's journey to an IPO reflects the intense competition and soaring valuations in the AI hardware sector, long dominated by Nvidia. The company's flagship product, the Wafer Scale Engine, is among the largest chips ever built and is designed specifically to accelerate AI workloads. Securing OpenAI as a client represents a major coup: the lab's computing needs are among the most demanding in the world, and its vendor choices heavily influence the broader industry.

The filing reveals a company transitioning from a research and development focus to a commercial growth phase, with its recent partnerships serving as validation of its technology. The planned mid-May offering will test investor appetite for a pure-play AI hardware company at a time of heightened market interest in artificial intelligence infrastructure.