OpenAI to use Cerebras chips for inference; deal would consume 750 megawatts
OpenAI said on Wednesday that it would begin using chips from Cerebras, a start-up in Sunnyvale, Calif. The Cerebras chips it eventually deploys would consume 750 megawatts of electricity, an amount the company said could power tens of thousands of households. The chips will be used for inference, OpenAI said.
The agreement is the latest in a series of moves by OpenAI to expand its computing power. The company has signed deals to use chips from Nvidia and AMD and is designing its own chips with Broadcom. OpenAI previously said it would deploy enough Nvidia and AMD chips to consume 16 gigawatts of electricity, and that the chips it is designing with Broadcom are slated to consume 10 gigawatts.
Industry-wide, OpenAI, Amazon, Google, Meta and Microsoft plan to spend more than $325 billion combined on new A.I. data center facilities by the end of this year. OpenAI is building data centers in Abilene, Texas, with plans for additional facilities elsewhere in Texas and in New Mexico, Ohio and the Midwest.
Greg Brockman, OpenAI’s president, said in a statement, "This partnership will make ChatGPT not just the most capable but also the fastest A.I. platform in the world." Cerebras, co-founded in 2015 by Andrew Feldman, unveiled in 2019 what it called the largest chip ever built: about the size of a dinner plate and roughly 100 times the size of a typical chip. The company has focused on keeping data on one giant chip to speed operations.
Key Topics
Tech, OpenAI, Cerebras, ChatGPT, Nvidia, Broadcom