AMD reveals next-generation AI chips with OpenAI CEO Sam Altman

Lisa Su, CEO of Advanced Micro Devices, testifies during the Senate Commerce, Science and Transportation Committee hearing titled "Winning the AI Race: Strengthening U.S. Capabilities in Computing and Innovation," in the Hart building on Thursday, May 8, 2025.

Tom Williams | CQ-Roll Call, Inc. | Getty Images

Advanced Micro Devices on Thursday unveiled new details about its next-generation AI chips, the Instinct MI400 line, which will ship next year.

The MI400 chips will be able to be assembled into a full server rack called Helios, AMD said, which will allow thousands of the chips to be tied together in a way that lets them be used as one "rack-scale" system.

"For the first time, we architected every part of the rack as a unified system," AMD CEO Lisa Su said at a launch event in San Jose, California, on Thursday.

OpenAI CEO Sam Altman appeared on stage with Su and said his company would use the AMD chips.

“When you first started telling me about the specs, I was like, there’s no way, that just sounds totally crazy,” Altman stated. “It’s gonna be an amazing thing.”

AMD's rack-scale setup will make the chips look to a user like one system, which is important for most artificial intelligence customers, such as cloud providers and companies that develop large language models. Those customers want "hyperscale" clusters of AI computers that can span entire data centers and use massive amounts of power.

"Think of Helios as really a rack that functions like a single, massive compute engine," said Su, comparing it to Nvidia's Vera Rubin racks, which are expected to be released next year.

OpenAI CEO Sam Altman poses during the Artificial Intelligence (AI) Action Summit, at the Grand Palais, in Paris, on February 11, 2025.

Joel Saget | AFP | Getty Images

AMD's rack-scale technology also allows its latest chips to compete with Nvidia's Blackwell chips, which already come in configurations with 72 graphics processing units stitched together. Nvidia is AMD's primary and only rival in big data center GPUs for developing and deploying AI applications.

OpenAI, a significant Nvidia customer, has been giving AMD feedback on its MI400 roadmap, the chip company said. With the MI400 chips and this year's MI355X chips, AMD is planning to compete against rival Nvidia on price, with a company executive telling reporters on Wednesday that the chips will cost less to operate thanks to lower power consumption, and that AMD is undercutting Nvidia with "aggressive" prices.

So far, Nvidia has dominated the market for data center GPUs, partly because it was the first company to develop the kind of software needed for AI developers to take advantage of chips originally designed to display graphics for 3D games. Over the past decade, before the AI boom, AMD focused on competing against Intel in server CPUs.

Su said that AMD's MI355X can outperform Nvidia's Blackwell chips, despite Nvidia's use of its "proprietary" CUDA software.

“It says that we have really strong hardware, which we always knew, but it also shows that the open software frameworks have made tremendous progress,” Su stated.

AMD shares are flat so far in 2025, signaling that Wall Street doesn't yet see the company as a major threat to Nvidia's dominance.

Andrew Dieckmann, AMD's general manager for data center GPUs, said Wednesday that AMD's AI chips would cost less to operate and less to acquire.

“Across the board, there is a meaningful cost of acquisition delta that we then layer on our performance competitive advantage on top of, so significant double-digit percentage savings,” Dieckmann stated.

Over the next few years, big cloud companies and countries alike are prepared to spend hundreds of billions of dollars to build new data center clusters around GPUs in order to accelerate the development of cutting-edge AI models. That includes $300 billion this year alone in planned capital expenditures from megacap technology companies.

AMD expects the total market for AI chips to exceed $500 billion by 2028, although it hasn't said how much of that market it can claim. Nvidia has over 90% of the market currently, according to analyst estimates.

Both companies have committed to releasing new AI chips on an annual basis, rather than a biannual one, underscoring how fierce competition has become and how important bleeding-edge AI chip technology is for companies like Microsoft, Oracle and Amazon.

AMD has bought or invested in 25 AI companies in the past year, Su said, including the purchase of ZT Systems earlier this year, a server maker that developed the technology AMD needed to build its rack-sized systems.

“These AI systems are getting super complicated, and full-stack solutions are really critical,” Su stated.

What AMD is selling now

Currently, the most advanced AMD AI chip being installed by cloud providers is its Instinct MI355X, which the company said started shipping in production last month. AMD said it will be available for rent from cloud providers starting in the third quarter.

Companies building massive data center clusters for AI want alternatives to Nvidia, not only to keep costs down and provide flexibility, but also to fill a growing need for "inference," or the computing power needed to actually deploy a chatbot or generative AI application, which can use much more processing power than traditional server applications.

“What has really changed is the demand for inference has grown significantly,” Su stated.

AMD officials said Thursday that they believe their new chips are superior to Nvidia's for inference. That's because AMD's chips are equipped with more high-speed memory, which allows bigger AI models to run on a single GPU.

The MI355X has seven times the computing power of its predecessor, AMD said. Those chips will be able to compete with Nvidia's B100 and B200 chips, which have been shipping since late last year.

AMD said that its Instinct chips have been adopted by seven of the 10 largest AI customers, including OpenAI, Tesla, xAI and Cohere.

Oracle plans to offer clusters with over 131,000 MI355X chips to its customers, AMD said.

Officials from Meta said Thursday that the company was using clusters of AMD's CPUs and GPUs to run inference for its Llama model, and that it plans to buy AMD's next-generation servers.

A Microsoft representative said the company uses AMD chips to serve its Copilot AI features.
