Most computer chips are the size of a fingernail, but Cerebras Systems builds a chip the size of a dinner plate. It is a piece of engineering that ignores standard manufacturing rules, yet investors just bet another billion dollars that this giant square of silicon is the only real threat to Nvidia. The question is whether the physics or the politics will break first.
Key Takeaways
- Cerebras raised $1 billion in capital, reaching a $23 billion valuation.
- Benchmark Capital invested $225 million through two dedicated Benchmark Infrastructure funding vehicles.
- Cerebras signed a $10 billion deal to provide 750 megawatts of computing power to OpenAI.
Cerebras Systems has raised $1 billion in new capital. This pushes the company’s valuation to $23 billion, which is nearly triple what it was worth six months ago. The round was led by Tiger Global, but the most interesting check came from Benchmark Capital.
Benchmark usually caps its funds at $450 million. To make this work, the firm created special investment vehicles just to funnel $225 million into Cerebras. That is a notable show of confidence from one of Silicon Valley’s most disciplined firms.
The big deal
Nvidia currently controls the market for AI hardware. Tech companies are desperate for a viable alternative to keep costs down and supply up. Cerebras is positioning itself as that alternative. The company claims its systems are faster than Nvidia’s chips for specific AI tasks.
This is not just talk. OpenAI has signed a multi-year agreement worth roughly $10 billion with Cerebras. The deal secures 750 megawatts of computing power through 2028. OpenAI CEO Sam Altman is also an investor in the company. This partnership validates the technology in a way that press releases cannot.
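The reported figures allow a rough sanity check on the deal's scale. This is back-of-envelope arithmetic on the numbers above; the per-megawatt figure is a derived estimate, not a disclosed contract term.

```python
# Rough arithmetic on the reported OpenAI deal terms.
# Inputs are the figures reported above; the per-megawatt
# cost is derived, not a disclosed contract term.
deal_value_usd = 10e9   # roughly $10 billion
capacity_mw = 750       # 750 megawatts of computing power

cost_per_mw = deal_value_usd / capacity_mw
print(f"Implied cost: ${cost_per_mw / 1e6:.1f} million per megawatt")
```

That works out to roughly $13 million per megawatt over the life of the agreement, a useful yardstick when comparing against other AI compute contracts.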
How it works
Standard chip manufacturing takes a 300-millimeter silicon wafer and cuts it into hundreds of tiny, separate chips. Cerebras does the opposite. It uses the entire wafer to make a single, massive processor called the Wafer Scale Engine. It is about 8.5 inches wide and holds 4 trillion transistors.
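Those two numbers are consistent with basic geometry. A quick check, assuming only that the chip is a square cut from a circular 300-millimeter wafer:

```python
import math

# Sanity-check the chip size against the wafer size.
# A square cut from a 300 mm circular wafer can have a side of at
# most diameter / sqrt(2); real dies trim a bit more at the edges.
wafer_diameter_mm = 300
max_square_side_mm = wafer_diameter_mm / math.sqrt(2)  # ~212 mm
max_square_side_in = max_square_side_mm / 25.4         # ~8.4 in

print(f"Largest inscribed square: {max_square_side_in:.1f} inches per side")
```

So a chip "about 8.5 inches wide" is essentially the largest square a 300-millimeter wafer can physically yield.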
Think of a standard computer cluster like a fleet of delivery trucks. The trucks are fast, but they spend a lot of time stuck in traffic moving packages between different warehouses. Cerebras is like building one giant warehouse where all the workers are in the same room. No trucks are needed.
Because the data does not have to travel between separate chips, the system avoids the communication bottlenecks that slow down traditional GPU clusters. Cerebras says this design makes AI inference more than 20 times faster.
Inference: The process where a trained AI model answers a user’s question or performs a task.
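The bottleneck argument can be sketched in a few lines. This is a toy latency model, not a benchmark: every bandwidth and latency number below is an illustrative assumption, not a Cerebras or Nvidia specification.

```python
# Toy model of why on-wafer communication helps: moving data between
# separate chips adds link latency and serialization time that a
# single wafer largely avoids. All numbers are illustrative
# assumptions, not vendor specifications.

def transfer_time_us(bytes_moved, bandwidth_gb_s, latency_us):
    """Time to move data over one link: fixed latency plus serialization."""
    return latency_us + bytes_moved / (bandwidth_gb_s * 1e3)  # GB/s -> bytes/us

activations = 10e6  # 10 MB of activations passed between model layers

# Hypothetical chip-to-chip link vs. hypothetical on-wafer fabric
inter_chip = transfer_time_us(activations, bandwidth_gb_s=100, latency_us=2.0)
on_wafer = transfer_time_us(activations, bandwidth_gb_s=20000, latency_us=0.1)

print(f"inter-chip: {inter_chip:.1f} us, on-wafer: {on_wafer:.2f} us")
```

Even with made-up numbers, the shape of the result holds: when data crosses chip boundaries thousands of times per query, the per-hop overhead dominates, which is the gap the wafer-scale design is built to close.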
The catch
The technology works, but the business side has baggage. Until recently, a single client in the UAE called G42 provided 87% of Cerebras’ revenue. This heavy reliance on one customer is risky.
It also attracted government attention. G42 has historical ties to Chinese technology companies, which triggered a national security review by the Committee on Foreign Investment in the United States. This scrutiny forced Cerebras to delay its initial plans for an IPO and withdraw a filing earlier in 2025. G42 has since been removed from the investor list to clear the path forward.
What now?
Cerebras is preparing to go public in the second quarter of 2026. The company must now prove it can diversify its customer base beyond the UAE and OpenAI.
If you use ChatGPT, this deal is meant to make the system answer complex queries faster in the coming years. Watch to see whether other major AI labs sign similar deals, or whether OpenAI remains the primary user of this giant chip.
