Crunch time… AI chips are still the hottest tech in town, powering everything from Big Tech chatbots to autonomous cars and delivery robots. AI processors are critical for training the large language models that fuel tools like ChatGPT, which OpenAI said has 200M+ weekly users. Nvidia’s been selling shovels to the gold rush: Alphabet, Microsoft, Amazon, and Meta have spent tens of billions to get their hands on its powerful AI chips (FYI: Big Techies account for ~40% of Nvidia’s sales). Global AI chip revenue is expected to top $70B this year and could reach $400B/year by 2030.
Connecting the circuits… Nvidia’s estimated to control between 70% and 95% of the AI-chip market, but its sales growth is cooling and competition’s revving up. Intel and AMD have announced their own genAI chips. Meta, OpenAI, and Microsoft said they’d use AMD’s chip, which CEO Lisa Su said could notch $4.5B in sales this year. Last month, Intel scored a multiyear deal to make a custom AI chip for Amazon. Microsoft, Meta, and Google (aka Nvidia’s biggest customers) have also whipped up their own AI training chips to start weaning off Nvidia’s pricey GPUs. Last month, California-based startup Cerebras Systems filed for a US IPO after debuting what it calls “the fastest AI chip in the world.”
The silicon future… Chip demand is expected to power up as companies lean on AI-infused phones and laptops to boost sagging hardware sales. Last month, Apple unveiled the first iPhone designed for generative AI (#AiPhone). Rival Samsung debuted its AI-powered Galaxy phone this year, and Dell and Microsoft have rolled out AI PCs. While AI hardware has yet to go mainstream, experts say the smart-device surge could spark another chip shortage.