AI: The Surprising Key to Reducing Global Energy Consumption

September 15, 2024 by Charles
Business & Leadership, Renewable Energy

What if I told you that the same AI technology many fear for its energy consumption could actually be our greatest tool in reducing it?

Everywhere you look, there are headlines warning about the immense power demands of AI, linking it to climate concerns and portraying it as a new threat to our planet’s future. But what if these fears are missing a crucial point? In reality, AI has the potential not only to revolutionize industries but also to slash global energy use in ways we’ve never seen before.

In this article, I challenge the prevailing narrative and explore how AI might just be the unexpected ally we need in the fight for a more sustainable world.

The energy consumption of AI

I’m captivated by the rapid rise of AI tools and technologies, and I use them almost daily. Whether it’s checking the grammar of this article, finding a recipe for black beans, or troubleshooting my phone, AI has become an indispensable part of my life.

With the emergence of tools like ChatGPT and other language models, there have been countless articles raising alarms about the energy consumption of AI. Some of these pieces paint a dire picture, linking AI’s energy use to the growing threat of climate change.

To understand AI’s impact on energy consumption, let’s look at how much energy AI models actually use. I’ll provide some real-world examples to put these figures into context.

Take GPT-3, a language model developed by OpenAI, which has 175 billion parameters and 96 layers. Training GPT-3 required a whopping 3,640 petaflop/s-days of compute, using a supercomputer with over 285,000 CPU cores and 10,000 GPUs. The total energy consumption for training this single AI model was around 1,248 megawatt-hours (MWh), enough to power a town of 1,000 homes for a month.
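For a quick sanity check on that comparison, here is the arithmetic. Note that the roughly 1,000 kWh per month household figure is my own assumption for illustration, not a published number:

```python
# Rough sanity check: GPT-3 training energy expressed in household-months.
# The 1,000 kWh/month household figure is an assumption for illustration.
TRAINING_ENERGY_MWH = 1_248        # reported estimate for training GPT-3
HOUSEHOLD_KWH_PER_MONTH = 1_000    # assumed average household consumption

training_energy_kwh = TRAINING_ENERGY_MWH * 1_000
home_months = training_energy_kwh / HOUSEHOLD_KWH_PER_MONTH
print(f"Enough to power ~{home_months:,.0f} homes for one month")
# -> roughly 1,200 homes, in line with the "town of 1,000 homes" comparison
```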

In contrast, running or “querying” the model (inference) is less energy-intensive than training but still meaningful.

For instance, a single query to ChatGPT is commonly estimated to use about 3 watt-hours (0.003 kWh), compared to roughly 0.3 watt-hours (0.0003 kWh) for a Google Search, making it around ten times more energy-intensive. Three watt-hours is about the energy a 1,000-watt appliance, like a microwave or hair dryer, uses in ten seconds.
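A quick back-of-the-envelope check of those estimates (the 1,000-watt appliance rating is an illustrative assumption on my part):

```python
# Back-of-the-envelope comparison of per-query energy estimates.
CHATGPT_QUERY_WH = 3.0     # commonly cited estimate per ChatGPT query
GOOGLE_SEARCH_WH = 0.3     # commonly cited estimate per Google search
APPLIANCE_WATTS = 1_000    # assumed rating for a microwave or hair dryer

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
appliance_seconds = CHATGPT_QUERY_WH / APPLIANCE_WATTS * 3_600
print(f"A ChatGPT query uses ~{ratio:.0f}x the energy of a Google search")
print(f"About {appliance_seconds:.0f} seconds of a {APPLIANCE_WATTS} W appliance")
# -> roughly 10x, and about 11 seconds of microwave time
```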

Across the industry, companies are investing billions of dollars in building AI data centers. Nvidia’s market capitalization recently surpassed $3 trillion due to the demand for its AI chips. One example of this is Tesla, which is constructing a massive supercomputing cluster at its Gigafactory in Texas. This facility, equipped with 50,000 Nvidia H100 chips and 20,000 of Tesla’s own chips, will primarily be used to train its full self-driving AI model. The data center is expected to consume around 130 megawatts (MW) of power this year, with plans to scale up to over 500 MW in the next 18 months.

Powering Nvidia’s annual production of roughly 3.76 million AI GPUs would require more than 14 billion kWh of additional power generation. At first glance, this number might seem alarming, especially without understanding the broader context, something I’ll delve into next.
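One way to reproduce a figure of that order is a simple fleet model. The average per-GPU draw below is my own assumption, somewhere between idle and the H100’s roughly 700-watt peak, chosen purely to show how such an estimate can be built:

```python
# Simple fleet model: annual energy for one year's worth of Nvidia AI GPUs.
# The average-draw figure is an assumption for illustration only.
GPUS_PER_YEAR = 3_760_000    # approximate annual Nvidia AI GPU production
AVG_DRAW_KW = 0.425          # assumed average draw per GPU, including utilization
HOURS_PER_YEAR = 8_760

annual_energy_kwh = GPUS_PER_YEAR * AVG_DRAW_KW * HOURS_PER_YEAR
print(f"~{annual_energy_kwh / 1e9:.1f} billion kWh per year")
# -> about 14 billion kWh, matching the figure above
```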

Powering AI

Now that we’ve explored the demand side of AI, let’s look at the supply. AI is emerging as a new driver of global electricity demand, so it makes sense to consider how this demand will be met. Given that renewable energy is now the cheapest form of new power generation worldwide, it’s reasonable to expect that much of AI’s energy needs will be supplied by renewable sources.

So, what does it take to generate the 14 billion kWh of electricity needed annually to run all those Nvidia AI processors?

Let’s break it down using wind power as an example. With a 40% capacity factor—roughly the average for new wind installations—we would need to add about 4 gigawatts (GW) of wind power to meet this demand.
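Here is the arithmetic behind that figure, a quick sketch using the 40% capacity factor mentioned above:

```python
# Wind capacity needed to generate ~14 billion kWh/year at a 40% capacity factor.
ANNUAL_DEMAND_KWH = 14e9
CAPACITY_FACTOR = 0.40
HOURS_PER_YEAR = 8_760

required_gw = ANNUAL_DEMAND_KWH / (CAPACITY_FACTOR * HOURS_PER_YEAR) / 1e6
print(f"~{required_gw:.1f} GW of wind capacity")
# -> about 4.0 GW, matching the estimate above
```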

Now, to put that into perspective: 4 GW of wind energy is only about 0.4% of the current global installed wind capacity. And keep in mind, we’re already adding 80-90 GW of new wind power to the grid each year globally.

Of course, this simplified analysis doesn’t account for factors like intermittency and grid limitations, which are genuine challenges. But these are challenges faced by all grids worldwide as they shift towards more renewable energy. AI isn’t about to cause a sudden and drastic spike in global energy demand—at least not anytime soon.

Optimizing AI Compute

Let’s take a closer look at how AI computing is being deployed today and what it might look like in the future. Right now, much of AI compute takes place in large, centralized data centers. These facilities are designed for maximum efficiency, with engineering that enhances processing power while minimizing cooling needs. They are often situated in regions with low power costs, which frequently coincide with areas that generate significant renewable energy.

But there’s another approach: distributed AI compute. For example, platforms like Nosana enable AI processing tasks to run on a decentralized network of GPUs. This method takes advantage of underutilized GPUs in consumer PCs or smaller, distributed data centers, allowing them to be switched on or off as needed. In my case, I run a node on my gaming PC at home, making use of my Nvidia GPU, which would otherwise sit idle 95% of the time.

Using distributed compute can also unlock “stranded” renewable power resources by operating in areas where the grid has bottlenecks. We’ve seen this happening in places like Texas, USA.

Moreover, AI processors are becoming more efficient over time. For example, the Nvidia H100 processor offers three times better performance per watt than its predecessor, the A100, meaning it can accomplish three times more work for the same amount of energy.

On top of that, companies are continually optimizing their software, often with the help of AI itself, to run models more efficiently. This is why you see various AI models from OpenAI, each with different operational costs that reflect their varying energy consumption.

Lastly, older processors are gradually being replaced with more efficient ones, further driving down the energy required for AI computing.

The impact of AI

All technology that humankind develops is inherently deflationary—it reduces the cost of performing a task. From the invention of the plough to the advent of cotton mills, cars, and computers, every breakthrough has lowered the cost and total energy required to get work done.

AI is no different and, in fact, could have an even more rapid and profound impact than we can currently imagine.

Let’s consider a basic economic argument:

At its core, Cost = Energy Consumption.

Whenever something costs money, you can trace it back to the amount of energy used—whether that energy comes from extracting raw materials, the labor of people involved, or the fuel required to transport goods. (This logic only breaks down when other factors come into play, such as government intervention or psychological pricing biases like brand perception.)

AI is being driven forward by commercial enterprises that get paid to deliver services that make things faster, easier, and more efficient. In essence, they are paid to reduce costs—and ultimately, this means they must reduce the overall energy consumption of the systems they operate in. If they don’t, they will fail to deliver value.

Take Tesla, for example. At its Gigafactory in Texas, Tesla is training the next generations of its full self-driving software and the AI for its Optimus humanoid robots. The outcomes of these AI-driven advancements will likely be transformative: fewer cars on the road due to the rise of Robotaxis, and robots capable of running factories 24/7, which will dramatically lower operational costs.

AI, in essence, holds the potential to reduce the energy footprint of countless processes and industries.

Conclusion

To me, it’s clear that AI will enhance the world we live in. It will optimize every aspect of our lives, reduce energy consumption, help combat climate change, and accelerate the deployment of renewable energy sources.

I’m optimistic about the future and excited to see how quickly these changes will unfold.

Join the conversation—what’s your take on AI’s role in reducing energy consumption?

  1. Great article. I think about places like the Texas grid (ERCOT): in summer (and sometimes when a winter storm hits) there is inevitable strain on the grid, causing outages. It would be interesting to consider a large data center as almost an ancillary service; it wouldn’t produce energy, but it could shed load to support the grid. I wonder whether, during peak pricing, that outweighs the benefit of running. Also an interesting point about factories going to full or near-full robotics; again, I think of this in the Texas context, where energy is plentiful at night, which creates a great synergy. Each electricity grid is certainly different, but examining the fuel resources to match to production assets, and then considering the type of time-of-day load and load growth, is where I think the systems need more attention, particularly in systems with a significant peak shape, which is really inefficient overall. In any event, AI is here to stay, so the faster industries and people adapt, the better off they will be.

  2. People don’t understand that AI earns no salary, doesn’t consume and doesn’t pay taxes.

    Each piece of work we give it, instead of giving it to a human, contributes to destroying education, hospitals… (all public services) and the middle class.

    AI is a worldwide money transfer from consumers to GAFAM’s tax havens.

    So far, every company that implemented it didn’t increase salaries but had massive layoffs instead.
