By Max A. Cherney
SAN FRANCISCO (Reuters) - Crusoe CEO Chase Lochmiller said on Thursday the artificial intelligence data center builder planned to purchase roughly $400 million worth of AI chips from Advanced Micro Devices to add to its portfolio of computing power.
The cloud computing startup aims to build a data center – or cluster – in the U.S. to house the AMD AI chips and will rent them to customers for building AI models and running applications. Crusoe plans to purchase roughly 13,000 MI355X chips and use a liquid cooling system.
The chips will be housed in a single facility set to come online in the fall, which can be divided among several customers, or dedicated to a single customer if one wants to use the entire cluster, Lochmiller told Reuters in an interview.
Crusoe’s data centers are purpose-built to house AI chips, which it says allows it to offer superior performance compared with older designs. The company has developed pre-fabricated components to speed data center construction, similar to how pre-fabricated homes reduce construction time.
“Where startups lack in scale and people and capital, compared to some of the hyperscalers, where we can compete is actually being nimble, fast and have a high density of engineering talent,” Lochmiller said.
AMD’s AI chips offer an alternative to the hardware sold by Nvidia, which has dominated the market. The AMD MI355X chips Crusoe plans to purchase include a large amount of high-bandwidth memory, which makes them well suited for running AI applications, a task known as inference.
Crusoe is among a crop of companies building cloud computing services specifically for AI companies, usually by amassing large numbers of Nvidia chips. Nvidia has armed this group of newer companies with AI chips in part because doing so allows it to diversify its revenue away from cloud computing giants such as Microsoft.
Now, AMD is attempting to execute a similar plan.
“I think it’s reaffirming of the neocloud strategy,” Lochmiller said. “These new platforms have a lot of value to add to the ecosystem by providing infrastructure to big users of AI.”
(Reporting by Max A. Cherney in San Francisco; Editing by Shri Navaratnam)