SK Hynix Has Probably Already Sold Most Of The HBM DRAM It Will Make Next Year

Korean chipmaker SK hynix has told investors the future looks bright thanks to strong demand for its memory products and early delivery of its first HBM4 samples.

The company yesterday staged its annual shareholders meeting, and according to Korean media, CEO Kwak Noh-jung used the occasion to reveal the company is close to wrapping up negotiations on sales of high bandwidth memory (HBM) for 2026. The looming end of those talks was seen as a sign that all the HBM SK hynix can make next year will be sold before a single chip passes through the factory gates.

The company managed the same feat this year, a reflection of the demand for HBM from the likes of Nvidia that rely on the fast memory to make their GPUs sing.

Also at the meeting, execs said recent weeks have seen a surge in orders from customers hoping to beat the expected imposition of tariffs on semiconductors imported into the USA.

US president Donald Trump has said he will impose a 25 percent tariff on imported chips, and that the tax “will go very substantially higher over the course of a year.” He’s also mentioned tariffs at 50 and 100 percent.

The president has promised a major announcement about tariffs on April 2nd.

Kwak reportedly also told the meeting SK hynix expects “explosive” growth in HBM sales. Asked whether Chinese firm DeepSeek’s apparent use of modest hardware to produce a powerful model represents a threat, he apparently opined that DeepSeek will spark wider adoption of AI, which means more demand for SK hynix products from more buyers.

That’s become just about the standard response from leaders of companies that provide the tech needed to run AI workloads when asked whether an AI bubble is forming and what might happen if it bursts.

Some of the product the Korean chipmaker ships next year will likely be 12-layer HBM4, as the company last week said it has shipped the first samples of the product.

“The samples were delivered ahead of schedule,” the company bragged, before saying it “aims to complete preparations for mass production of 12-layer HBM4 products within the second half of the year.”

The memory is designed to process more than two terabytes of data per second, which is more than 60 percent faster than the previous-generation HBM3E could manage. ®
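As a rough sanity check on those figures, the claimed speedup implies a baseline bandwidth for HBM3E. A minimal sketch, assuming the numbers are per-stack and taking "more than 60 percent faster" as exactly 60 percent:

```python
# Back-of-the-envelope check of the HBM4 bandwidth claim.
# Assumptions (not from the article): per-stack figures, speedup of exactly 1.6x.
hbm4_tbps = 2.0   # "more than two terabytes of data per second"
speedup = 1.60    # "more than 60 percent faster"

implied_hbm3e_tbps = hbm4_tbps / speedup
print(f"Implied HBM3E bandwidth: {implied_hbm3e_tbps:.2f} TB/s")
```

The result, roughly 1.25 TB/s, is in the same ballpark as the ~1.2 TB/s per stack commonly quoted for HBM3E, so the two claims are at least self-consistent.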
