Google is reportedly pitching its AI chips to data centers
Nvidia unquestionably dominates the current AI chip market, making the most popular GPUs to train and run today’s AI models.
But the companies spending hundreds of billions of dollars on those powerful GPUs are all working on their own custom chips.
Amazon has been iterating on its Trainium and Inferentia chips, promising favorable performance for the price. OpenAI has been quietly working on designs for its own chips, and even Microsoft is hedging its bets with its own AI chip design.
Today, The Information reports that Google has been reaching out to data center providers to get its own custom chips in racks to rent to customers. The company is shopping around its custom tensor processing units (TPUs) to cloud computing providers like CoreWeave, Crusoe, and Fluidstack, according to the report.
With Nvidia sitting comfortably at the top of the mountain, companies are looking to reduce their dependence on one supplier for the specialized hardware that powers their AI businesses.