Despite Wall Street Jitters, AI Hopefuls Keep Spending Billions On AI Infrastructure
Comment Despite persistent worries that vast spending on AI infrastructure may not pay for itself, cloud providers, hyperscalers, and datacenter operators have continued to shovel billions of dollars into ever-larger GPU clusters.
Those who worry the world is spending too much, too fast on AI usually argue there is thin evidence that machine-learning investments in this LLM era are producing substantial revenue or profit, and point to cautious corporate adoption. They also highlight DeepSeek's claim that it could have trained its V3 model in the cloud for a few million dollars, thanks to its efficient design — news that sent the value of AI-centric stocks slumping. In reality, the Chinese lab spent a pretty penny on its own on-prem cluster of GPUs to build the model, though it still appears a more lightweight operation than its Western rivals while being roughly as capable.
That all said, plenty of investors remain optimistic.
OpenAI's $500 billion Stargate tie-up with SoftBank, Oracle, MGX, and others was a massive vote of confidence in demand for AI infrastructure.
In the weeks following the mega project's announcement, we've seen a flurry of fresh investment in AI chip startups, datacenters that house GPUs, and model-making companies.
On Monday, Chinese e-commerce and cloud titan Alibaba announced plans to invest 380 billion yuan (about $53 billion) in cloud and AI infrastructure over the next three years to fuel the development of artificial general intelligence (AGI).
Alongside its investment plans, Alibaba rolled out a bigger, more capable "thinking" model to challenge DeepSeek's R1, OpenAI's o3-mini, and Anthropic's new Claude 3.7 Sonnet, which also launched Monday.
Speaking of Anthropic, the Wall Street Journal reports the startup is finalizing a $3.5 billion funding round that would value it at $61.5 billion.
Monday's announcements follow a flurry of investment in smaller GPU neo-cloud providers, which specialize in building AI training grounds containing tens of thousands of accelerators available to rent at cut-rate prices — so long as you commit to a long-term contract.
Last week, GPU cloud provider Lambda closed a $480 million Series D funding round. The money will be used to pack its data halls with Nvidia's latest GPUs, and brings the cloud player's total raised to $1.4 billion.
Just a day later, AI cloud services startup Together AI walked off with $305 million from the likes of General Catalyst and Prosperity7.
In a release, Together AI claimed it had secured 200 megawatts of datacenter capacity, which it intends to fill with yet more of Nvidia's flashy new Blackwell GPUs. Earlier this month the startup announced availability of Nvidia B200-based systems, and it is working to deploy a cluster of 36,000 GB200 GPUs in partnership with Hypertech.
Apple is also pushing ahead with its plans to pack its datacenters with AI servers powered by its custom silicon as part of a four-year $500 billion commitment to bolster US manufacturing and R&D. Cupertino’s AI machines are designed to support the iGiant's AI-infused software experiences by offloading workloads deemed too intense to run locally on iDevices.
The news isn’t all good. As we recently reported, Microsoft has reportedly walked away from several high-capacity datacenter leases, sowing fear among investors that one of the AI boom's biggest cheerleaders may have overestimated demand. That's despite Microsoft CEO Satya Nadella's comments a few weeks back that he's good for his $80 billion contribution to the Stargate project.
DeepSeek also continues to worry many, but prominent tech figures, including Google DeepMind CEO Demis Hassabis, have since called many of the Chinese lab's claims into question.
Microsoft's Nadella and Meta chief Mark Zuckerberg have continued to insist that additional compute infrastructure is essential, both to power the inferencing workloads that put models to work and to fuel the push toward artificial general intelligence.
This week will bring another major data point when Nvidia announces its quarterly earnings, giving investors a chance to assuage their fears or justify their unease. ®