OpenAI Secures $10 Billion Deal with Cerebras

In a deal worth over $10 billion, OpenAI has committed to a multi-year partnership with Cerebras Systems, a company that specializes in AI hardware. The move represents a large-scale investment in computing infrastructure to meet the rising demand for generative AI.

Under the deal, Cerebras will provide OpenAI with up to 750 megawatts of supercomputing capacity, delivered in phases over the next few years, with the first deployment planned for 2028. That power commitment, roughly equal to the continuous electricity consumption of a small city, demonstrates both the scale of today's AI workloads and OpenAI's determination to speed up real-time inference.

Cerebras, a Silicon Valley startup known for its wafer-scale engine technology, will supply hardware designed for high-performance, low-latency model inference: the process by which AI models quickly generate replies to user queries. OpenAI executives have described the alliance as part of a broader effort to build a diversified computing portfolio that matches the right hardware to each AI workload.

“Cerebras delivers an exclusively low-latency inference solution to our platform,” OpenAI compute strategist Sachin Katti said in a company blog post. He added that the partnership will yield faster results and more natural, human-like interactions, along with a stronger foundation for scaling real-time AI to a larger audience.

For Cerebras, the deal marks a turning point toward a much larger customer base. The company's past revenue came largely from a single major customer in the UAE; it now counts OpenAI, maker of ChatGPT and a set of enterprise APIs, as a client. This not only energizes the startup's commercial prospects but also reduces the risk of relying on a single customer.
