This is not investment advice. The author has no position in any of the stocks mentioned. Wccftech.com has a disclosure and ethics policy.

According to a report from the Financial Times, Amazon is developing custom artificial intelligence chips to reduce its dependence on NVIDIA. The firm has already developed a variety of in-house processors to run data center workloads, and the latest push builds on its 2015 acquisition of a chip design startup. Amazon is expected to shed more light on its custom AI processors next month as part of announcements covering the firm’s Trainium chip lineup.

These chips have been developed by Amazon’s Annapurna Labs, and they are being used by Anthropic, the rival of Microsoft-backed OpenAI. Anthropic is Amazon’s primary AI partner, and it provides the e-commerce and cloud computing giant with access to its Claude foundation AI models.

Amazon Pushes For Custom AI Chips To Lower Costs & Dependence On NVIDIA

Today’s report is just one of many indicating a push across Big Tech to reduce reliance on NVIDIA for the most powerful artificial intelligence processors. NVIDIA’s GPUs are the market leaders and top performers for running AI workloads, and high demand coupled with constrained supply has made them some of the most sought-after and expensive products in the world.

For Amazon, developing in-house AI chips is an effort to reduce dependence on NVIDIA’s products while simultaneously cutting costs, reports the Financial Times. The firm is no stranger to developing custom chips. Its acquisition of chip design startup Annapurna Labs has enabled Amazon to churn out a steady stream of processors that reduce the cost of relying on AMD’s and Intel’s products for traditional data center workloads.

These chips, called Graviton processors, are complemented by Amazon’s custom AI processors, called Trainium. Trainium is designed for training large language models, and Amazon unveiled Trainium2 a year ago in November 2023.
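For readers curious what targeting Trainium looks like in practice, here is a minimal sketch of a training step using the PyTorch/XLA path that AWS’s Neuron SDK exposes on Trainium instances. The model, data, and hyperparameters below are placeholders for illustration, not anything described in the FT’s report, and the code assumes a Trainium environment with torch-xla and the Neuron tooling installed.

```python
# Minimal sketch (assumption: an AWS Trainium instance with the Neuron SDK and
# torch-xla installed). The toy model and random data are illustrative only.
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()                       # NeuronCores appear as an XLA device
model = torch.nn.Linear(512, 10).to(device)    # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.CrossEntropyLoss()

for step in range(10):
    inputs = torch.randn(32, 512).to(device)           # dummy batch
    targets = torch.randint(0, 10, (32,)).to(device)   # dummy labels
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()
    xm.mark_step()                             # flush the lazily built XLA graph to the device
```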

An Amazon slide describing the Graviton3 chip. Image: Amazon

As per the FT, Annapurna is also leading the effort to develop the chips that will reduce Amazon’s dependence on NVIDIA’s GPUs. While the FT’s report shares few details about these chips, it notes that Amazon may offer insights into them next month at its event covering the Trainium2 chips. While Trainium2 was launched in 2023, supply constraints have limited its adoption. The FT reports that Amazon’s AI partner, Anthropic, is using Trainium2.

Amazon’s chips are designed with technology from the Taiwanese firm Alchip and are manufactured by the Taiwan Semiconductor Manufacturing Company (TSMC). Amazon shared last year that more than 50,000 AWS customers were using its Graviton chips.

Along with Amazon, other mega-cap firms, including Google’s parent Alphabet and Facebook’s owner Meta, also develop their own AI chips. Industry players like Apple use Google’s chips, and Meta unveiled its second-generation Meta Training and Inference Accelerator (MTIA) earlier this year. Both efforts reduce dependence on NVIDIA’s GPUs, and Microsoft-backed OpenAI is also reportedly considering developing in-house chips.

Google unveiled its latest tensor processing unit (TPU) AI chip, Trillium, earlier this month. These chips offer four times faster AI training performance and three times faster inference than their predecessors, according to the company.
