Amazon.com is developing its own artificial intelligence (AI) chips at its chip lab in Austin, Texas, aiming to reduce reliance on Nvidia’s costly processors. The new server design, packed with Amazon’s AI chips, is part of the company’s strategy to provide more affordable solutions for complex computations and data processing in its Amazon Web Services (AWS) cloud business. This move aligns with similar efforts by competitors Microsoft and Alphabet.
Rami Sinno, director of engineering at Amazon’s Annapurna Labs, which Amazon acquired in 2015, highlighted growing demand from customers for cheaper alternatives to Nvidia chips. While Amazon’s AI chip efforts, including Trainium and Inferentia, are relatively new, the company has been developing its Graviton chip for non-AI computing for nearly a decade. David Brown, Vice President of Compute and Networking at AWS, said Amazon’s chips could offer significant cost savings, potentially cutting costs by 40% to 50% compared with Nvidia’s processors.
AWS, a key driver of Amazon’s revenue, reported a 17% increase in sales in the first quarter of the year, reaching $25 billion. AWS holds roughly a third of the cloud computing market, compared with Microsoft Azure’s 25%. During its recent Prime Day event, Amazon deployed a substantial number of its custom chips, including 250,000 Graviton chips and 80,000 AI chips, to handle the surge in activity on its platforms.