Is This The Year 100GE NICs Go Mainstream? If You're Into AI, It Might Be
The growing popularity of generative AI and availability of smart features in virtualization platforms like VMware's vSphere will help to drive faster networking into enterprise servers in 2023.
Dell'Oro Group analyst Baron Fung predicts that by the end of 2023, 100Gbps-or-faster Ethernet NICs will account for nearly half of all revenues in the segment, despite making up less than 20 percent of cards sold.
Today, hyperscalers and cloud providers are the largest buyers of this class of NIC, because they routinely deal with massive data flows while serving customer workloads. That is changing to an extent, Fung said, though he added that in the enterprise, 25Gbps NICs would likely remain the sweet spot through 2023, except for certain targeted applications.
An obvious one is all-flash storage clusters. Fung also notes that the boom in machine learning, spurred in part by excitement around ChatGPT, Midjourney, and other generative AI models, will likely drive demand for faster networking.
He said that because AI/ML workloads are often spread across multiple GPU nodes and potentially even multiple racks, they usually require considerably higher bandwidth. For instance, Nvidia's DGX H100 sports eight 400Gbps NICs, one for each of its 700W H100 GPUs.
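For a rough sense of scale, here's a back-of-the-envelope sketch in Python. The NIC count and line rate come from the DGX H100 figures above; the assumption that all eight ports can drive line rate at once is ours, and real-world throughput will depend on the fabric and the workload.

```python
# Back-of-the-envelope sketch: aggregate network bandwidth of a single
# DGX H100-class node, assuming all eight 400Gbps NICs run at line rate
# simultaneously (an idealized assumption, not a measured figure).
NICS_PER_NODE = 8
NIC_SPEED_GBPS = 400                    # per-NIC line rate, gigabits per second

aggregate_gbps = NICS_PER_NODE * NIC_SPEED_GBPS   # 3,200 Gbps per node
aggregate_gbytes = aggregate_gbps / 8             # ~400 GB/s of potential traffic

print(f"Per-node aggregate: {aggregate_gbps} Gbps (~{aggregate_gbytes:.0f} GB/s)")
print(f"25GbE ports needed to match it: {aggregate_gbps // 25}")   # 128 ports
```

Multiply that across a rack or two of nodes exchanging data during training, and it's easy to see why the 25Gbps cards that suit most enterprise servers don't cut it for this kind of work.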
It's a NIC, but smarter
Fung said he expects enterprises will start deploying smartNICs in greater numbers this year as software platforms that are able to take advantage of them become more prevalent.
The smartNICs – sometimes called data or infrastructure processing units (DPU/IPU) – typically combine high-speed networking and a number of fixed-function ASICs or a configurable FPGA with general-purpose compute cores. The goal is to pick up work from the host CPU that would otherwise consume clock cycles.
The problem until recently was that unless you had the resources of a cloud provider or were willing to write your own software to take advantage of their smart features, they were just expensive NICs with a lot of untapped potential. However, several software vendors, including VMware and Red Hat, have launched initiatives in recent years to change that.
One of the first was VMware's Project Monterey, which functions a bit like an abstraction layer, allowing users to tap directly into hardware acceleration for things like storage, networking, or security from within ESXi and vSphere 8. From the end user's perspective everything works as normal, while under the hood, VMware's platform flips the switches and knobs necessary to take advantage of the card's onboard compute.
"I think we may get more traction from the DPU and smartNICs this year. Things like Project Monterrey deliver a lot of benefits," Fung said.
You better bundle it
But whether those benefits will outweigh the costs remains to be seen. While declining demand has driven average selling prices down in many markets – particularly in memory – Fung doesn't expect that to be the case in the Ethernet adapter market.
In fact, Dell'Oro predicts NIC revenues will achieve double-digit growth in 2023 even though shipments are expected to decline 9 percent year over year. So what gives? According to Fung, there are a number of factors at play, but one appears to be that due to declining demand, vendors are prioritizing higher-end equipment, with faster 100Gbps, 200Gbps, or even 400Gbps interfaces and features that keep average selling prices up.
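A quick worked example shows how that mix-shift math plays out. The 9 percent shipment decline is Dell'Oro's figure; the average selling price uplift below is a hypothetical number chosen purely to illustrate how a richer product mix can outrun falling volumes.

```python
# Illustrative only: the 9 percent unit decline is from Dell'Oro, but the
# 25 percent ASP uplift is a hypothetical figure used to show the mechanism.
units_last_year = 100          # index the prior year's shipments at 100
asp_last_year = 100.0          # and the prior year's average selling price at 100

units_this_year = units_last_year * 0.91   # shipments down 9 percent
asp_this_year = asp_last_year * 1.25       # hypothetical richer mix of 100Gbps+ cards

revenue_change = (units_this_year * asp_this_year) / (units_last_year * asp_last_year) - 1
print(f"Revenue change: {revenue_change:+.1%}")   # double-digit growth despite fewer units
```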
Although that may sound like a raw deal, Fung notes that while NICs are getting more expensive, the cost per bit tends to be lower on these cards. This means that if the server can take advantage of the higher speeds, it may actually end up being cheaper than using multiple slower, less expensive NICs. Or in the case of smartNICs, the hardware acceleration may allow customers to get away with lower-end CPUs than would be possible using a standard NIC.
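To put the cost-per-bit point in concrete terms, here's a simple comparison using made-up prices (the dollar figures are assumptions for illustration, not quotes from any vendor), contrasting one 100GbE card with the four 25GbE cards needed to match its capacity.

```python
# Hypothetical per-card prices, chosen only to illustrate the cost-per-bit
# argument; real prices vary widely by vendor, feature set, and volume.
price_25g = 150.0      # assumed price of a 25GbE NIC, in dollars
price_100g = 450.0     # assumed price of a 100GbE NIC, in dollars

print(f"Cost per Gbps, 25GbE:  ${price_25g / 25:.2f}")     # $6.00
print(f"Cost per Gbps, 100GbE: ${price_100g / 100:.2f}")   # $4.50

# Matching 100Gbps of capacity either way:
print(f"4 x 25GbE: ${4 * price_25g:.0f}  vs  1 x 100GbE: ${price_100g:.0f}")
```

On those assumed numbers, the single faster card wins on both sticker price and cost per bit.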
Fung added that because most IT managers are buying NICs as part of a larger system, they may end up getting a discount, if not on the NIC then on the DRAM or storage. The economy hasn't exactly been kind to the DRAM and NAND flash market in recent quarters, which has driven average selling prices down. ®