ChatGPT Slashes Energy Use, Sparks Debate

A new study by Epoch AI shows that ChatGPT uses far less energy than many experts once thought. The nonprofit research institute found that the chatbot’s latest model, GPT-4o, uses about 0.3 watt-hours for each query. Earlier reports suggested that a single query might use as much as 3 watt-hours. This news may seem small at first, but it raises big questions for the finance world and anyone who follows technology and sustainability.

Lower Energy, New Calculations

Epoch AI’s study has changed the view on how much energy ChatGPT uses. With the introduction of GPT-4o, each query now uses only about 0.3 watt-hours, roughly one-tenth of what many believed before. The drop in energy use comes from improvements in both hardware and software. For example, newer Nvidia H100 chips replace older models and are faster and more efficient. In addition, researchers now use more realistic assumptions when calculating energy use: instead of assuming servers always run at full power, they assume roughly 70% utilisation.

This careful recalculation makes a strong case for the benefits of updating technology. In the world of finance, even small improvements in efficiency can lead to big savings over time. As companies look for ways to cut costs and reduce their carbon footprint, understanding real energy use is key.

Comparing to Everyday Energy Use

When you think about energy use, it can help to compare ChatGPT’s consumption with everyday tasks. At 0.3 watt-hours per query, ChatGPT uses about the same energy as a single Google search. To put this in perspective:

  • Boiling water for tea uses around 100 watt-hours.
  • Watching TV for one hour uses about 120 watt-hours.
  • Charging a new iPhone takes roughly 4.7 watt-hours per day, averaged over a year of daily charging.
  • An average U.S. household uses about 29 kilowatt-hours (29,000 watt-hours) every day.
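Each of these comparisons reduces to dividing one energy figure by another, so they are easy to reproduce. A minimal sketch, using only the figures from the list above:

```python
# Energy cost of one ChatGPT query, per the Epoch AI estimate (watt-hours).
WH_PER_QUERY = 0.3

# Everyday reference points from the comparisons above (watt-hours).
everyday = {
    "boiling water for tea": 100,
    "one hour of TV": 120,
    "daily household use (29 kWh)": 29_000,
}

# How many queries equal each everyday task.
for task, wh in everyday.items():
    queries = wh / WH_PER_QUERY
    print(f"{task}: ~{queries:,.0f} queries")
```

Swapping in any other appliance's watt-hour figure works the same way.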

For many users, a single ChatGPT query is very light on energy. However, when millions of people use the tool each day, the total energy use becomes significant. For example, ChatGPT is estimated to use around 226.8 gigawatt-hours (GWh) of energy per year. This is enough to fully charge about 3.13 million electric vehicles. Such numbers remind us that even small amounts add up when scaled to a global level.
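The aggregate figures in this paragraph can be cross-checked with back-of-envelope arithmetic. A quick sketch, noting that the roughly 72 kWh per vehicle it implies is an inference from the article's own numbers (consistent with a typical long-range EV battery), not a figure stated in the study:

```python
ANNUAL_GWH = 226.8           # estimated annual ChatGPT energy use
EV_CHARGES = 3.13e6          # full EV charges the article equates this to
HOUSEHOLD_KWH_PER_DAY = 29   # average U.S. household, from the list above

# Implied energy per EV charge: about 72 kWh.
kwh_per_ev = ANNUAL_GWH * 1e6 / EV_CHARGES

# The same annual total, expressed as average U.S. households supplied
# per day: roughly 21,000.
kwh_per_day = ANNUAL_GWH * 1e6 / 365
households = kwh_per_day / HOUSEHOLD_KWH_PER_DAY
```

That the two published numbers imply a realistic battery size is a useful sanity check on the scale of the estimate.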

The Road to Greater Efficiency

Several factors help to explain why ChatGPT now uses less energy:

  • Advanced Hardware: The use of modern Nvidia H100 chips means the system can do more work with less energy. This is similar to how a new car model might use less fuel than an older one.
  • Optimised Software: The developers behind ChatGPT have improved the code and the way the system processes queries. This means it can give answers faster and with less waste.
  • Better Calculation Methods: The new research takes a more realistic look at how servers are used. Instead of assuming that servers run at full power all the time, the study uses a 70% utilisation rate. This adjustment gives a more accurate picture of energy use.
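To see how the 70% utilisation rate feeds into a per-query figure, consider an illustrative calculation. The server power draw and per-query serving time below are placeholder assumptions chosen to land near the study's estimate, not numbers from Epoch AI:

```python
# Illustrative only: power draw and serving time are placeholder
# assumptions, not figures from the Epoch AI study.
SERVER_POWER_W = 1500      # assumed peak draw of a GPU server (watts)
UTILISATION = 0.70         # the study's utilisation assumption
SECONDS_PER_QUERY = 1.0    # assumed server time spent per query

# Energy per query in watt-hours: power x utilisation x time / 3600.
wh_per_query = SERVER_POWER_W * UTILISATION * SECONDS_PER_QUERY / 3600
# Comes out near 0.29 Wh, in the same range as the 0.3 Wh estimate.
```

Halving the assumed serving time would halve the result, which is why realistic utilisation and timing assumptions matter so much to these estimates.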

These changes show that technology can improve over time. But they also open up a debate. Some experts see these improvements as a sign that AI systems are becoming more sustainable. Others worry that, as AI becomes more popular, the overall energy use may still grow.

Environmental Impact and Business Concerns

For finance professionals, the environmental impact of new technology is a key concern. Many companies now look at sustainability when making investments and planning for the future. The new findings on ChatGPT offer both hope and caution.

On the positive side, the lower energy use per query means that each interaction is more efficient than once thought. This efficiency is good news for companies looking to reduce their energy bills and their impact on the environment. If similar improvements can be made in other areas, businesses may save money while also meeting stricter environmental standards.

However, the debate does not end there. The overall demand for AI services continues to grow. Even if each query uses very little energy, the total energy use could still be high. For instance, some scientists estimate that AI’s energy use could reach up to 134 terawatt-hours (TWh) by 2027. Data centres and networks already use about 1-1.5% of the world’s electricity. As more companies adopt AI for tasks like customer service, risk analysis, and market research, this percentage might rise.

The finance sector, which often leads in technology investments, may need to weigh these factors carefully. The promise of lower energy use must be balanced with the risk that overall energy consumption will continue to climb as more AI systems are deployed. This balance will be critical for meeting both business targets and environmental goals.

A Controversial Shift in View

The change in estimates is not without controversy. Many in the tech industry had accepted the higher energy figures as a fact of life. Now, the new study by Epoch AI challenges that belief. Some industry watchers believe that the new numbers are a sign of progress. They argue that as technology improves, we should expect energy use to fall.

Others remain sceptical. They point out that even small amounts of energy, when added up across millions of queries, can lead to large numbers. In other words, the total energy cost of using AI might still be very high. This view calls for continued attention to how data centres are powered and how AI systems are run. For the finance industry, which often makes long-term investments, such uncertainty can be a cause for concern.

There is also the matter of transparency. The methods used to calculate energy consumption need to be clear and reliable. If estimates are too optimistic, companies might underestimate the true costs of AI systems. On the other hand, more accurate methods can help investors and regulators make better decisions.

The Wider Impact on Markets and Policy

For those in finance, the efficiency of AI technology can affect market trends and policy decisions. Energy use is not just a technical detail—it has real financial and environmental consequences. Lower energy consumption can reduce operating costs for companies that use AI. It can also influence how investors view technology stocks and sustainability efforts.

At the same time, regulators and policy makers are watching these developments. As AI becomes more common, governments may impose stricter rules on energy use and emissions. Companies that lead in energy efficiency might have an edge when new regulations come into play. This can change the competitive landscape and affect investment decisions.

In the finance world, where risk and opportunity are closely linked, the debate over AI energy use is more than academic. It touches on issues of sustainability, cost, and long-term growth. The shift in ChatGPT’s energy use is a small part of a larger picture. As AI continues to grow, its impact on global energy demand will remain an important topic for discussion.

What’s Next

The debate over AI energy consumption is far from over. As technology improves and new models are developed, we can expect further changes in energy use. Finance professionals and industry leaders will need to stay alert to these shifts. They may have to rethink their investments in technology and energy infrastructure.

Will future models be even more efficient, or will rising demand outpace the gains in efficiency? This is a key question for both the tech and finance sectors. Meanwhile, the push for renewable energy and more efficient hardware is likely to continue. Companies that can combine lower energy use with strong performance might be in a strong position in the coming years.

Another area to watch is regulation. As governments become more aware of the environmental impact of technology, they may introduce new rules to encourage energy savings. This could change the way data centres are run and affect the costs of running AI systems. For investors, this is both a risk and an opportunity.

The shift in ChatGPT’s energy use also sparks a broader discussion about transparency in tech. More reliable data on energy use can lead to better decisions by companies, investors, and regulators alike. As this story develops, we will likely see more research into how to balance the benefits of AI with the need to protect our environment.

In summary, while the latest study shows that ChatGPT is more energy efficient than once thought, it also reminds us that small numbers can add up to a big impact when scaled globally. For finance professionals, this news is a call to keep a close eye on the evolving technology landscape. The choices made today in energy and technology will shape the market and our world for years to come.
