US Department Of Energy Solicits AMD's Help With Nuke Sims

AMD will join Intel in supporting Sandia National Laboratories' efforts to develop novel memory tech for use in Department of Energy (DoE) nuclear weapons simulations.

The contract, awarded under the Advanced Memory Technology (AMT) program, is funded by the DoE's National Nuclear Security Administration (NNSA) as part of its post-Exascale Computing Initiative.

The NNSA is the branch of the DoE responsible for maintaining and extending the lifespan and effectiveness of the US strategic arsenal. Since the US banned atmospheric testing in 1963 and signed the Comprehensive Nuclear-Test-Ban Treaty in 1996, American military research and development has relied heavily on supercomputers to simulate the destructive potential of nuclear weapons.

Many of these simulations are limited by memory performance because of the sheer number of parameters at play in a nuclear detonation, according to DoE scientists. Thuc Hoang, director of the NNSA's Advanced Simulation and Computing (ASC) program, has said emerging memory technologies have the potential to boost application performance by a factor of 40.
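To see why memory, rather than raw compute, is the lever here, consider a minimal sketch of a bandwidth-bound loop in the style of the well-known STREAM "triad" benchmark. The array sizes and names below are arbitrary and real NNSA simulation codes are vastly more complex, but the principle is the same: each iteration does very little arithmetic relative to the bytes it moves, so faster memory translates almost directly into faster runs.

```c
/* Illustrative only: a STREAM-triad-style loop, the textbook example of a
 * memory-bandwidth-bound kernel. Array sizes are arbitrary placeholders. */
#include <stdio.h>
#include <stdlib.h>

#define N (1 << 25)   /* ~33 million doubles per array, roughly 256MB each */

int main(void) {
    double *a = malloc(N * sizeof *a);
    double *b = malloc(N * sizeof *b);
    double *c = malloc(N * sizeof *c);
    if (!a || !b || !c) return 1;

    for (size_t i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }

    /* One multiply-add per element, but 24 bytes of memory traffic:
     * runtime is set almost entirely by memory bandwidth, not FLOPS,
     * which is why HBM-class memory lifts codes like this so sharply. */
    for (size_t i = 0; i < N; i++)
        a[i] = b[i] + 3.0 * c[i];

    printf("a[0] = %f\n", a[0]);
    free(a); free(b); free(c);
    return 0;
}
```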

"We are pursuing memory bandwidth and latency improvements," explained James Laros, project lead at Sandia, in a statement. "If successful, this effort will positively affect both aspects of memory systems for our advanced and commodity technology platforms."

AMD isn't the first chipmaker the DoE has tapped to help it speed up its simulations. In December, the DoE awarded a similar contract to Intel, which ironically had canned its Optane memory division months earlier.

At the time, we learned that much of the research being conducted was into methods of boosting the performance of existing DDR memory. "Our goal with the AMT program is to change how DRAM is organized and help the DRAM vendors to design and deliver superior products," Intel's Josh Fryman previously told The Register.

However, it appears this isn't the only technology being explored. In Sandia's latest announcement, the lab highlighted advanced packaging techniques used by both Intel and AMD to attach memory close to the CPU die. This has the benefit of reducing access latency and improving bandwidth – two of the goals highlighted by Laros.

Intel's recently announced Xeon Max CPU family certainly fits this description. The chips pack up to 64GB of HBM2e, good for a maximum of 1TB/sec of memory bandwidth feeding up to 56 cores. The HBM can function either as a standalone memory pool or in conjunction with DDR5.
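When the HBM sits alongside DDR5, software has to decide which buffers live in it. As a hedged sketch – assuming the HBM is exposed to Linux as its own NUMA node, as it is in Xeon Max's flat mode, and that the node number is known – an application can pin bandwidth-critical allocations to it with libnuma. HBM_NODE and the buffer size below are placeholders for illustration, not values from Intel's documentation.

```c
/* Sketch: place a working buffer on an (assumed) HBM-backed NUMA node.
 * Build with: gcc hbm_bind.c -lnuma */
#include <numa.h>
#include <stdio.h>

#define HBM_NODE 2                /* hypothetical node number; check numactl --hardware */
#define BUF_BYTES (256UL << 20)   /* 256MB working buffer */

int main(void) {
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA is not available on this system\n");
        return 1;
    }

    /* Allocate on the assumed HBM node rather than the DDR5 pool. */
    double *buf = numa_alloc_onnode(BUF_BYTES, HBM_NODE);
    if (!buf) {
        fprintf(stderr, "allocation on node %d failed\n", HBM_NODE);
        return 1;
    }

    /* ... bandwidth-hungry work on buf would go here ... */

    numa_free(buf, BUF_BYTES);
    return 0;
}
```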

AMD's X-series Epyc processors take a similar approach, stacking a slab of fast SRAM atop each of the chip's core complex dies. This expands the L3 cache to 96MB per die – for a total of 768MB of L3 cache on the company's top-tier Milan-X CPUs.

AMD is also working on a family of datacenter APUs beginning with the MI300. That chip, like Intel's Xeon Max, will use HBM – 128GB of it – shared between a 24-core CPU and a CDNA 3 GPU. However, it's unclear whether these APUs will be used under the DoE program.

What we do know is that chipmakers will work with industry partners to shape the future of DRAM development as a direct result of this program. In December, Intel said it would contribute its findings back to JEDEC – the industry consortium that oversees DRAM standards.

So, while the DoE program may advance US nuclear strategy in the near term, it has the potential to benefit civilians too. ®
