AMD, the chip designer, has begun selling a product aimed at challenging Nvidia's stronghold in the market for artificial intelligence processors, which it estimates will be worth $400bn by 2027.
Speaking at an event in San Jose on Wednesday, Lisa Su, AMD’s chief executive, described the company’s MI300X as the “most advanced AI accelerator” in the industry.
AMD had previously forecast that the market for AI chips would reach $150bn by 2027. “It is really evident that the demand for AI chips has grown much, much more rapidly,” Su said.
Nvidia’s H100, the chip against which Su compared the MI300X in her presentation, has dominated the industry, with demand outstripping supply as companies such as Amazon, Meta and Microsoft race to build generative AI tools. Nvidia’s revenue tripled year on year in its most recent quarter.
AMD expects the MI300 to be its fastest product yet to reach $1bn in sales, a milestone it anticipates around the middle of 2024. A variant of the chip, the MI300A, is geared towards supercomputing.
When AMD first announced the chip in June, analysts questioned its chances of catching up with Nvidia in the current generation of AI processors. Nvidia has already announced a successor, the H200, due to launch in 2024, which the company describes as a game changer.
Su invited Microsoft’s chief technology officer Kevin Scott and Meta’s senior director of engineering for AI, Ajit Matthews, on stage to discuss how their companies were integrating the MI300 into their AI workloads.
OpenAI, the AI start-up whose ChatGPT product ignited much of the interest in the technology, will support AMD’s new processors in the latest version of Triton, its open-source software for programming AI chips.
“OpenAI works with and supports an open ecosystem,” OpenAI engineer Philippe Tillet said in a statement. “In the latest version of Triton, we plan to support AMD GPUs, including MI300.”
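For readers unfamiliar with Triton: kernels are written in Python and compiled by Triton for the target GPU, which is why adding AMD support is largely a compiler back-end change rather than a rewrite of user code. The sketch below is the canonical vector-addition example from Triton’s public tutorials, shown purely for illustration; it is not taken from OpenAI’s announcement and makes no claim about MI300-specific behaviour.

```python
# Illustrative Triton kernel (vector addition), adapted from Triton's tutorials.
# The same Python source can, in principle, be compiled for Nvidia or AMD GPUs
# once the corresponding compiler back end exists.
import torch
import triton
import triton.language as tl


@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one contiguous block of elements.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)


def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n_elements = out.numel()
    # Launch a 1D grid with enough program instances to cover all elements.
    grid = lambda meta: (triton.cdiv(n_elements, meta["BLOCK_SIZE"]),)
    add_kernel[grid](x, y, out, n_elements, BLOCK_SIZE=1024)
    return out
```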
AMD also launched ROCm 6, the latest version of its software platform, to compete with Nvidia’s proprietary CUDA platform. “Software is what drives adoption,” Su said.
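ROCm’s HIP programming layer mirrors the CUDA API closely enough that frameworks such as PyTorch expose AMD GPUs through the familiar `torch.cuda` interface. The snippet below is a minimal sketch of how a developer might check which back end a given PyTorch build is using; it assumes a PyTorch installation built with ROCm or CUDA support and is not taken from AMD’s launch materials.

```python
# Minimal sketch: detecting whether a PyTorch build targets CUDA or ROCm.
# Assumes a PyTorch wheel built with GPU support.
import torch

if torch.cuda.is_available():
    # On ROCm builds, PyTorch reuses the torch.cuda namespace via HIP,
    # so existing CUDA-style code typically runs unchanged.
    backend = "ROCm/HIP" if torch.version.hip is not None else "CUDA"
    print(f"GPU backend: {backend}")
    print(f"Device name: {torch.cuda.get_device_name(0)}")
else:
    print("No GPU backend available; running on CPU.")
```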
Su told reporters that the $400bn market expected for AI processors in 2027 would leave plenty of room for AMD: “We believe we can get a nice slice of that.” The estimate includes China, even as the US government cracks down on exports of advanced AI chips to the country.
Su said that AMD “spends quite a bit of time” with the US Commerce Department and the Biden administration. “We know that [export restrictions] for the most advanced chips are important to us, from a point of national security.”
She said that “finding an equilibrium” was the most important thing.
Dylan Patel and Daniel Nishball of SemiAnalysis wrote that “on raw specs, MI300X is superior to H100”. OpenAI’s announcement was a “big deal”, the analysts said, because both OpenAI and Microsoft will now use AMD chips for AI inference.
Nvidia did not immediately respond to a request for comment.