Can any company, big or small, really topple Nvidia’s AI chip dominance? Maybe not. But there are hundreds of billions of dollars in revenue up for grabs for anyone that can peel off even a chunk of the market, Amazon CEO Andy Jassy said this week.
As expected, the company announced at its AWS re:Invent conference the next generation of its Nvidia-competitor AI chip, Trainium3, which is four times faster than the current Trainium2 yet uses less power. Jassy revealed a few tidbits about the current Trainium in a post on X that show why the company is so bullish on the chip.
He said the Trainium2 business “has substantial traction, is a multi-billion-dollar revenue run-rate business, has 1M+ chips in production, and 100K+ companies using it as the majority of Bedrock usage today.”
Bedrock is Amazon’s AI app development tool that allows companies to pick and choose among many AI models.
Jassy said Amazon’s AI chip is winning among the company’s enormous roster of cloud customers because it “has price-performance advantages over other GPU options that are compelling.” In other words, he believes it works better and costs less than the “other GPUs” out there in the market.
That is, of course, Amazon’s classic MO: offering its own homegrown tech at lower prices.
Additionally, AWS CEO Matt Garman offered even more insight, in an interview with CRN, about one customer responsible for a big chunk of those billions in revenue. No shock here: it’s Anthropic.
“We’ve seen some enormous traction from Trainium2, particularly from our partners at Anthropic who we’ve announced Project Rainier, where there’s over 500,000 Trainium2 chips helping them build the next generations of models for Claude,” Garman said.
Indeed, only a few U.S. companies, such as Google, Microsoft, Amazon, and Meta, have all the engineering pieces, from silicon chip design expertise to homegrown high-speed interconnect and networking technology, to even attempt true competition with Nvidia. (Remember, Nvidia cornered the market on one major high-performance networking technology in 2019, when CEO Jensen Huang outbid Intel and Microsoft to buy InfiniBand hardware maker Mellanox.)
On top of that, AI models and software built to run on Nvidia’s chips also rely on Nvidia’s proprietary Compute Unified Device Architecture (CUDA) software, which lets applications use the GPUs for parallel computing, among other tasks. Just like in the Intel vs. SPARC chip wars of yesteryear, it’s no small thing to rewrite an AI app for a non-CUDA chip.
Still, Amazon may have a plan for that. As we previously reported, a future generation of its AI chip, Trainium4, will be built to interoperate with Nvidia’s GPUs in the same system. Whether that helps peel more business away from Nvidia, or simply reinforces Nvidia’s dominance on AWS’s cloud, remains to be seen.
It may not matter to Amazon. If the company is already on track to book multiple billions of dollars from the Trainium2 chip, and the next generation will be that much better, that may be winner enough.
Source: TechCrunch