AMD Sharpens Silicon Swords To Take On Chip And AI Rivals
Comment Once the relative minnow of the chip industry, AMD senses blood in the water following a series of missteps by arch-rival Intel, and head honcho Lisa Su is wasting no time in talking up its game plan to investors.
The fabless semiconductor biz has long played second fiddle to that other Santa Clara chipmaker, as well as Nvidia in the GPU accelerator stakes. But in its most recent results, AMD reported datacenter revenues up 115 percent and forecast it would make $4.5 billion from GPUs this year.
Speaking at the Goldman Sachs Communacopia and Technology Conference this week, president and CEO Su confirmed that "AI is a huge priority for us," but added that end-to-end support for the technology is key.
"I'm a big believer that there's no-one-size-fits-all in terms of computing. And so our goal is to be the high-performance computing leader that goes across GPUs and CPUs and FPGAs and also custom silicon as you put all of that together," she said.
It can't hurt that AMD's biggest rival is going through a tough patch at the moment, though Su did not mention Intel and surprisingly – or perhaps by design – the subject wasn't brought up by the Goldman Sachs host.
Billions are being spent on AI, but Su admitted the industry is still at an early stage in this particular compute cycle, and predicted there will be continued demand for more powerful infrastructure.
"Whether you're talking about training of large language models or you're talking about inferencing or you're talking about fine tuning or you're talking about all of these things; the workloads will demand more compute," she claimed, adding that it isn't just about GPUs.
"We believe GPUs will be the largest piece of that [forecast] $400 billion total addressable market, but there will also be some custom silicon associated with that. And when we look at our opportunity there, it really is an end-to-end play across all of the different compute elements."
AMD moved to a one-year cadence for GPUs partly to keep up with Nvidia, but the CEO claimed it was also so that AMD could bulk out its portfolio with products covering the gamut of needs.
"Of course, you have the largest hyperscalers who are building out these huge training clusters, but you also have a lot of need for inference, some are more memory-intensive workloads that would really focus there, some are more datacenter power constrained," she explained.
"So what we've been able to do with our MI325 that's planned to launch here in the fourth quarter, and then the MI350 series and the MI400 series, is really just broaden the different products such that we are able to capture a majority of the addressable market with our product road map."
Su also outlined AMD's bid to become a key supplier of AI chips for the hyperscalers as well as the largest enterprise customers, part of which includes the pending purchase of ZT Systems, a company that makes high-performance servers for cloud operators.
"We're continuing to build out the entire infrastructure of what we need. So we just recently announced several software acquisitions, including the acquisition of Silo AI, which is a leading AI software company. And we just recently announced the acquisition of ZT Systems, which also builds out sort of the rack scale infrastructure necessary."
This decision was prompted by talking to customers and looking at what would be necessary three to five years down the road, Su claimed.
"The rackscale infrastructure – because these AI systems are getting so complicated – really needs to be thought of in design, sort of at the same time in parallel with the silicon infrastructure. So we're very excited about the acquisition of ZT," she explained, adding that "the knowledge of what are we trying to do on the system level will help us design a stronger and more capable road map."
The other advantage is in speeding up validation of the various components required for AI infrastructure.
"The amount of time it takes to stand up these clusters is pretty significant. We found, in the case of MI300, we finished our validation, but customers needed to do their own validation cycle. And much of that was done in series, whereas with ZT as part of AMD, we'll be able to do much of that in parallel. And that time to market will allow us to go from design complete to large-scale systems, running production workloads in a shorter amount of time, which will be very beneficial to our customers."
Another piece of the puzzle is software, where the chipmaker has another pending purchase: Finnish developer Silo AI.
AMD has been honing its own GPU software stack, ROCm, to compete with market kingpin Nvidia, whose associated tools including the CUDA platform have almost become a de facto standard.
"Over the last nine or ten months, we've spent a tremendous amount of time on leading workloads. And what we found is, with each iteration of ROCm, we're getting better and better... in terms of the tools, in terms of all the libraries, [and] in terms of knowing where the bottlenecks are in terms of performance," Su claimed.
"We've been able to demonstrate with some of the most challenging workloads that we've consistently improved performance. And in some cases, we've reached parity, in many cases, we've actually exceeded our competition, especially with some of the inference workloads because of our architecture, we have more memory bandwidth and memory capacity."
But AI isn't just a datacenter concern; the AI PC is also one of the hyped trends the industry has been desperately pushing this year in an effort to pep up flagging desktop and laptop sales.
"I believe that we are at the start of a multiyear AI PC cycle," Su told the conference.
"We never said AI PCs was a big 2024 phenomena. AI PCs is [making] a start in 2024. But more importantly, it's the most significant innovation that's come to the PC market in definitely the last ten-plus years," she opined, adding that it represents an opportunity for her company – traditionally the underdog in the PC market – especially as there is so much confusion around even the definition of an AI PC.
"We find that many enterprise customers are pulling us into their AI conversations. Because, frankly, enterprise customers want help, right? They want to know, 'Hey, how should I think about this investment? Should I be thinking about cloud or should I be thinking about on-prem or how do I think about AI PCs?' And so we found ourselves now in a place of more like a trusted adviser with some of these enterprise accounts."
Speaking more broadly, Su said that AI is an opportunity that cannot really be ignored by AMD.
"I think this AI sort of technology arc is really a once-in-50-years type thing, so we have to invest. That being the case, we will be very disciplined in that investment. And so we expect to grow opex slower than we grow revenue. But we do see a huge opportunity in front of us."
Su ended with a sales pitch of sorts, outlining what enterprise customers need to bear in mind right now.
"This is a computing super cycle, so we should all recognize that. And there is no one player or architecture that's going to take over. I think this is a case where having the right compute for the right workload and the right application is super important."
And that's what AMD has been working towards over the last five-plus years, she claimed – to have the best CPU, GPU, FPGA, and semi-custom capabilities to meet customer needs.
This is in marked contrast to the past half decade at Intel, which was typified by delayed 10nm chips, a 7nm process node that fell well behind schedule, the return of prodigal son Pat Gelsinger, this time as CEO, with a blueprint for future success that he is now redrafting, and a plunging market valuation. ®