Cut The DRAM Waste, Cut The Energy Bill

Sponsored Feature Computers are taking over our daily tasks. For big tech, this means an increase in IT workloads and an expansion of advanced use cases in areas like artificial intelligence and machine learning (AI/ML), the Internet of Things (IoT), augmented reality and virtual reality (AR/VR).

And at the end-user level, it's resulted in more dependence on mobile devices and PCs to get things done and an explosion in the amount of data created and consumed.

By 2025, it's estimated that the world will create and consume over 180 zettabytes of data – astronomical growth compared with the volumes being created and consumed when that projection was made in 2021. This ever-expanding volume of digital information requires a large number of datacenters to help centralize organizations' IT operations and host the infrastructure equipment needed to store, process and disseminate big data and applications.

While datacenters will play a critical role, they also consume substantial amounts of energy, which can have an adverse effect on the planet's ecosystem. To put that into perspective, datacenters currently account for up to 3 percent of global energy consumption, a number predicted to rise to 4 percent by 2030. And with a recent McKinsey report highlighting the rapidly accelerating demand for yet more datacenters, there is every possibility the figure will climb even higher in the coming years.

That's a situation which leaves datacenter owner/operators scrambling to get their hands on new technologies that will help reduce their power usage – not only to meet environmental targets but also to help cut the spiraling cost of application and service hosting provision. From a different angle, governments worldwide are rolling out new and updated regulations deliberately designed to reduce the growing environmental impact of datacenters – the Government of Ireland's regulations to facilitate sustainable datacenter development that adheres to efficient energy use being a case in point.

The race for the greener datacenter

So, how can datacenters reduce energy consumption while still providing enough compute power and data storage capacity to support workload innovation, all while meeting their commitments to deliver and maintain high quality services for business and consumer customers? Current efforts include establishing on-site renewable energy sources for datacenters and deploying advanced cooling systems, such as evaporative and liquid cooling, to reduce power consumption. Amazon and other big techs have already invested considerable time and resources here, committing to long-term contracts to purchase renewable energy to power their datacenters.

But while these efforts are praiseworthy, they haven't significantly lowered energy usage. Which means the industry as a whole – both datacenter providers and chip manufacturers – has to think harder about the technologies it deploys and reconsider the factors responsible for driving up datacenter energy consumption in the first place. One of these is memory bottlenecks in CPUs and GPUs.

Data jams in these components can result in underutilized servers and memory that still consume substantial amounts of energy, imposing unnecessary costs on the business. A CPU/GPU memory bottleneck delays access to required data, which has the knock-on effect of increasing both the time it takes for that data to be transferred from memory and the time it takes the CPU or GPU to process it. The result is a brake on the performance of the server, and therefore the application. In datacenter architectures which typically run multiple applications simultaneously, this type of bottleneck causes significant delays and reduces the overall efficiency of the entire hosting facility. That in turn increases the amount of processing power required to complete any given task and drives up energy consumption.

Addressing this issue isn't easy, but it has prompted some system designers and chip manufacturers to research and develop new technology that can double the main memory capacity and bandwidth of high-performance servers and smart devices. One of those is ZeroPoint Technologies.

The role of memory compression techniques

ZeroPoint Technologies originated as a spin-off from Chalmers University of Technology in Gothenburg, Sweden. Since its founding in 2016, the company has raised over $7.5 million, beginning with early investment from Chalmers Ventures and most recently an additional $3.5 million in a seed round from climate tech investor Climentum Capital and Nordic VC Industrifonden to fund the company's development in 2023 and beyond.

The company's mission is to figure out how to expand the main memory capacity in semiconductors to improve the utilization of existing DRAM.

After fifteen years of dedicated research, the company's co-founders – Professor Per Stenström and Dr. Angelos Arelakis – developed a memory compression technique that not only removes unnecessary information from a computer's main memory, but also lowers the memory's energy use. This breakthrough led to the formation of ZeroPoint Technologies in 2016 and birthed the solution known today as ZeroPoint technology.

ZeroPoint technology can be integrated at several different places in the on-chip memory hierarchy, bringing significant benefits to on-chip SRAM as well as off-chip DRAM. It's delivered as a hardware IP block that customers integrate on a CPU or System on Chip (SoC), supplemented by customized software to enable transparent integration.

The solution is designed to improve memory bandwidth and capacity through a tripartite technique. The first stage is a unique data compression algorithm that compresses data with extremely low latency – a typical 64-byte block can be compressed in just a few nanoseconds. The second stage involves real-time data compaction algorithms that translate compression into bandwidth savings at nanosecond speeds. The third stage manages the memory layout to increase DRAM bandwidth and capacity.
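ZeroPoint hasn't published the details of its algorithm, but a minimal sketch can show why line-granular compression is feasible at nanosecond latency: every decision is local to a single 64-byte block, with no history or dictionary to consult. The zero-word encoding below is a generic illustration chosen for simplicity, not ZeroPoint's actual scheme.

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical illustration only: compress a 64-byte cache line by
 * storing a 16-bit presence bitmap plus only the non-zero 32-bit
 * words. Real designs use richer value models and fall back to raw
 * storage when a line doesn't compress, but the shape of the problem
 * -- fixed small block in, smaller variable block out, all decisions
 * local to the line -- is what makes hardware speeds feasible.      */

#define LINE_WORDS 16   /* 64 bytes = 16 x 32-bit words */

/* Returns the compressed size in bytes written to 'out'. */
size_t compress_line(const uint32_t line[LINE_WORDS], uint8_t *out)
{
    uint16_t bitmap = 0;
    size_t pos = sizeof(bitmap);        /* reserve room for the bitmap */

    for (int i = 0; i < LINE_WORDS; i++) {
        if (line[i] != 0) {
            bitmap |= (uint16_t)(1u << i);
            memcpy(out + pos, &line[i], sizeof(uint32_t));
            pos += sizeof(uint32_t);
        }
    }
    memcpy(out, &bitmap, sizeof(bitmap));
    return pos;                         /* 2..66 bytes */
}

void decompress_line(const uint8_t *in, uint32_t line[LINE_WORDS])
{
    uint16_t bitmap;
    memcpy(&bitmap, in, sizeof(bitmap));
    size_t pos = sizeof(bitmap);

    for (int i = 0; i < LINE_WORDS; i++) {
        if (bitmap & (1u << i)) {
            memcpy(&line[i], in + pos, sizeof(uint32_t));
            pos += sizeof(uint32_t);
        } else {
            line[i] = 0;
        }
    }
}

int main(void)
{
    uint32_t line[LINE_WORDS] = {0};    /* mostly-zero line: common case */
    line[3] = 0xdeadbeef;
    line[9] = 42;

    uint8_t packed[sizeof(uint16_t) + LINE_WORDS * sizeof(uint32_t)];
    size_t n = compress_line(line, packed);
    printf("64-byte line compressed to %zu bytes\n", n);   /* 10 bytes */

    uint32_t restored[LINE_WORDS];
    decompress_line(packed, restored);
    return memcmp(line, restored, sizeof(line)) != 0;
}

Because each line is encoded independently, compression and decompression can be pipelined in hardware at cache-access speeds, which is the property the nanosecond latency figure depends on.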

These three techniques aim to address one of the root causes of the performance overhead in semiconductors that drives up energy use. ZeroPoint Technologies CEO Klas Moreau suggests that a typical server memory contains about 70 percent unnecessary information that occupies space in the main memory, so eradicating it has the dual impact of expanding memory bandwidth and improving performance per watt. The ultimate aim is to remove all that waste by combining ultra-fast data compression with real-time data compaction and transparent memory management, cutting energy consumption per server.
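Taken at face value, the 70 percent figure implies a hard ceiling on the possible gain, which a quick back-of-the-envelope calculation makes concrete. The numbers below are illustrative only – derived from Moreau's estimate, not from vendor benchmarks – and real-world results are lower because not every block compresses well and compression metadata takes space, which is consistent with the more modest goal of doubling memory capacity mentioned earlier.

#include <stdio.h>

/* Illustrative arithmetic only: if roughly 70 percent of resident
 * data is redundant, a perfect compressor would shrink it to 30
 * percent of its original size. The ideal multiplier on effective
 * capacity and bandwidth is therefore 1 / (1 - 0.70), about 3.3x;
 * practical designs land closer to the 2x the article cites.      */
int main(void)
{
    const double redundant_share = 0.70;
    const double ideal_gain = 1.0 / (1.0 - redundant_share);
    printf("Ideal capacity/bandwidth multiplier: %.1fx\n", ideal_gain);
    return 0;
}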

The differentiator under the hood

To achieve this, Ziptilion – one of the products based on ZeroPoint technology – comprises several components, including a write buffer for outgoing data, a compressor, a decompressor, a prefetch buffer, and an address translator. The write buffer stores uncompressed data waiting to be written back to memory. When the IP block reads data, it decompresses the retrieved data and places it in the prefetch buffer. Thanks to compression, a DRAM block that normally holds one cache line can now hold multiple cache lines.

By keeping these lines in the prefetch buffer, future requests for those lines can be processed without needing a further read from DRAM. This leads to an increase in overall bandwidth, which is primarily due to the reduced need for DRAM reads rather than the smaller data size. As a result, there is a reduction in the amount of memory required to store the data and the amount of time and energy needed to transfer it.
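The read path can be sketched in software to make the mechanism concrete. The model below is schematic only – the buffer size, names and replacement policy are invented for illustration, and the real Ziptilion block is hardware IP rather than C code – but it captures the key point: one DRAM read can fill several prefetch-buffer slots, and subsequent hits avoid DRAM entirely.

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define LINE_BYTES     64
#define PREFETCH_SLOTS 8    /* invented size, for illustration only */

/* One decompressed cache line held in the prefetch buffer. */
typedef struct {
    uint64_t tag;                 /* line address               */
    uint8_t  data[LINE_BYTES];    /* decompressed line contents */
    bool     valid;
} prefetch_slot_t;

static prefetch_slot_t prefetch_buf[PREFETCH_SLOTS];

/* Stand-in for the DRAM controller plus decompressor: in hardware a
 * single burst returns a compressed block that decodes into several
 * consecutive lines. Here we simply fabricate deterministic data.  */
static int dram_read_and_decompress(uint64_t addr, uint64_t tags[],
                                    uint8_t lines[][LINE_BYTES])
{
    for (int i = 0; i < PREFETCH_SLOTS; i++) {
        tags[i] = addr + (uint64_t)i * LINE_BYTES;
        memset(lines[i], (int)((addr >> 6) + i) & 0xff, LINE_BYTES);
    }
    return PREFETCH_SLOTS;
}

void read_line(uint64_t addr, uint8_t out[LINE_BYTES])
{
    /* 1. Prefetch-buffer hit: a neighbouring line that arrived in an
     *    earlier compressed burst is served with no new DRAM read.  */
    for (int i = 0; i < PREFETCH_SLOTS; i++) {
        if (prefetch_buf[i].valid && prefetch_buf[i].tag == addr) {
            memcpy(out, prefetch_buf[i].data, LINE_BYTES);
            return;
        }
    }

    /* 2. Miss: one DRAM read yields multiple decompressed lines;
     *    stash them all so later requests hit in step 1. (A real
     *    design would use a smarter replacement policy.)           */
    uint64_t tags[PREFETCH_SLOTS];
    uint8_t  lines[PREFETCH_SLOTS][LINE_BYTES];
    int n = dram_read_and_decompress(addr, tags, lines);

    for (int i = 0; i < n; i++) {
        prefetch_buf[i].tag   = tags[i];
        memcpy(prefetch_buf[i].data, lines[i], LINE_BYTES);
        prefetch_buf[i].valid = true;
    }
    memcpy(out, lines[0], LINE_BYTES);  /* requested line is first */
}

int main(void)
{
    uint8_t line[LINE_BYTES];
    read_line(0x1000, line);  /* miss: one DRAM read fills the buffer */
    read_line(0x1040, line);  /* hit: adjacent line, no DRAM access   */
    printf("second read served from the prefetch buffer\n");
    return 0;
}

In this toy model, as in the description above, the bandwidth gain comes from eliminating repeat DRAM reads rather than from moving smaller payloads.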

One other key feature of ZeroPoint technology is that it is transparent to the operating system and application. This, says the company, is one of the differentiating factors between ZeroPoint and other memory expansion solutions in the market.

ZeroPoint's capacity to raise DRAM bandwidth and improve power use in semiconductors was recently covered in a Microprocessor Report from TechInsights, one of the leading information sources for the semiconductor market. The report found that taking Ziptilion's compression to local DRAM could improve performance, reduce power use and speed up access to memory by over 25 percent.

The implication is that integrating ZeroPoint technology into semiconductors can help datacenters and the mobile device industry make better use of both their operating expense (opex) and capital expenditure (capex) investments. On the opex side, ZeroPoint technology extracts more performance per watt from the electricity consumed; on the capex side, it unlocks more memory capacity and bandwidth from the same DRAM.

And that's an attractive proposition for any datacenter owner/operator.

Sponsored by ZeroPoint Technologies.
