AI uses a lot of energy. How Big Tech is trying to fix it
By 2027, global AI-related electricity consumption could increase by 64%, reaching a level on par with countries like Sweden and the Netherlands. Tech companies are largely driving this energy spike, as they rapidly scale data centers to power AI innovation. Amazon (AMZN), Google (GOOG, GOOGL), Meta (META), and Microsoft (MSFT) are projected to spend $189B in AI capital expenditures in 2024 alone.
This innovation boom comes with additional costs—placing strain on aging power grids and increasing companies’ emissions as they try to hit net-zero targets. Microsoft’s emissions have increased by 30% since 2020 because of its AI investments.
Companies are now working to keep up in the AI race while managing their energy footprint by investing in renewable energy sources and other zero-emission options, including nuclear. Amazon purchased a $650M data center next to a nuclear power plant run by Talen Energy (TLN) earlier this year. Chip companies that make the hardware powering AI, like industry leader Nvidia (NVDA), are also working to increase the energy efficiency of their products, as Big Tech continues to develop large language models and integrate AI into real-life applications.
Video transcript
Artificial intelligence has been woven into the daily lives of consumers for decades. Companies like Google, Apple, and Tesla have built AI into our internet searches, wristwatches, and our cars.
But as companies continue to advance the capabilities of artificial intelligence, AI's insatiable appetite for energy comes with it.
That's because each AI task runs complex calculations through energy-hungry data centers running on electricity. By 2027, global AI-related electricity consumption could increase by 64%, up to 134 terawatt-hours annually.
That's comparable to the electricity usage of countries like the Netherlands or Sweden.
So how are big tech companies addressing the energy demands their future AI innovations will require?
More than half of Americans say they interact with artificial intelligence at least once a day, according to Pew Research.
Researcher and data scientist Sasha Luccioni's work is often cited in AI energy studies.
She's AI and climate lead at Hugging Face, a platform that creates easy-to-use tools for building AI models.
There's AI in Google Maps, there's AI in smartphones and smart speakers. Every time you talk to Siri, that's speech-to-text. Recommendation engines, you know, decide what kind of song comes up next on Spotify.
Luccioni was part of a research team that evaluated the energy required for various AI use cases.
Let's take a ChatGPT query, for example.
You type in a question, and that request is sent to a large data center, where computers with powerful processors tap into huge amounts of data ChatGPT's AI has learned. An answer is then generated and sent back to the requester.
This process, which can be done in as little as one second, takes about 10 times more energy than an average Google search.
Generating a text response using AI 1,000 times requires about 0.05 kilowatt-hours of energy, equal to watching 3.5 minutes of Netflix.
AI image generation requires more energy: creating 1,000 images sucks up about three kilowatt-hours, equivalent to more than 3.5 hours of streaming Netflix.
Training these large language models, like ChatGPT, Microsoft's Copilot, or Google's Gemini, requires even more energy than using them.
Training the GPT-3 model used about 1,300 megawatt-hours of electricity. That's equal to watching Netflix for 185 years.
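All of these Netflix comparisons follow from a single conversion factor. Here is a minimal sketch that reproduces them, assuming streaming draws roughly 0.8 kilowatt-hours per hour, a figure implied by the article's own comparisons; real-world streaming energy estimates vary widely:

```python
# Energy figures quoted in the piece.
TEXT_KWH_PER_1000 = 0.05       # 1,000 AI text responses
IMAGE_KWH_PER_1000 = 3.0       # 1,000 AI-generated images
GPT3_TRAINING_KWH = 1_300_000  # ~1,300 MWh to train GPT-3

# Assumed conversion factor: streaming video at ~0.8 kWh per hour.
NETFLIX_KWH_PER_HOUR = 0.8

def netflix_hours(kwh: float) -> float:
    """Convert an energy amount into equivalent hours of streaming."""
    return kwh / NETFLIX_KWH_PER_HOUR

print(f"1,000 text responses ~ {netflix_hours(TEXT_KWH_PER_1000) * 60:.1f} min of streaming")
print(f"1,000 images         ~ {netflix_hours(IMAGE_KWH_PER_1000):.1f} hours of streaming")
print(f"GPT-3 training       ~ {netflix_hours(GPT3_TRAINING_KWH) / (24 * 365):.0f} years of streaming")
```

Under that assumed rate, the three conversions land close to the figures quoted above: a few minutes for text, a few hours for images, and on the order of 185 years for GPT-3's training run.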
Training tends to only happen once, and once that model is done training, you have to make it available so people can query it. That's the inference phase.
But the thing is that since the models get used so many times, it really adds up quickly.
They calculated that, depending on the size of the model, at 50 to 200 million queries you have as much energy usage for inference as for training.
ChatGPT gets 10 million users a day.
So within 20 days, you have reached that ginormous, quote unquote, amount of energy used for training via deploying the models.
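The 20-day figure follows directly from the numbers above. A quick sketch, using only the transcript's own order-of-magnitude estimates:

```python
# Figures quoted in the transcript (rough, order-of-magnitude estimates).
BREAK_EVEN_QUERIES_LOW = 50e6    # queries at which cumulative inference energy...
BREAK_EVEN_QUERIES_HIGH = 200e6  # ...matches training energy, depending on model size
DAILY_QUERIES = 10e6             # ~10 million ChatGPT users (treated here as queries) per day

def days_to_break_even(total_queries: float, queries_per_day: float = DAILY_QUERIES) -> float:
    """Days of deployment until cumulative inference energy equals training energy."""
    return total_queries / queries_per_day

print(f"Inference matches training energy after "
      f"{days_to_break_even(BREAK_EVEN_QUERIES_LOW):.0f} to "
      f"{days_to_break_even(BREAK_EVEN_QUERIES_HIGH):.0f} days of deployment")
```

With the transcript's range, that works out to somewhere between 5 and 20 days, consistent with the 20-day claim for a large model.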
The majority of these AI models are being funded by big tech companies with the capacity to scale AI efforts rapidly, a.k.a. hyperscale. Amazon, Microsoft, and Google are also using the most energy.
The hyperscalers, who have the deep pockets, are investing in different forms of energy in order to power their data centers, because they need as much energy as possible.
Global 2023 data center energy consumption was up an estimated 105% since 2019, largely due to AI.
Microsoft, Alphabet, Meta, and Amazon are projected to spend $189 billion in AI capex in 2024 alone.
Goldman Sachs projects that by 2030, global data center power demand will grow by 160% and could make up 8% of total electricity demand in the US, up from 3% in 2022.
There's already a lot of stress put on our energy grids. There's already a lot of demand, and essentially AI is kind of making that worse; it's aggravating it.
This as 70% of US transmission lines are approaching the end of their typical 50-to-80-year life cycle, increasing the likelihood of outages and cyberattacks, according to the Energy Department.
As you have more demand, not just because of AI but because of electric vehicles and the use of more energy, you're going to have so much more of a demand on the grid, and renewable energy sources aren't keeping up with the increase.
What we're seeing is that grid operators are actually choosing to keep coal power plants online and active longer than they initially planned, because they need to keep up with this demand.
And the thing is that renewable energy is definitely growing, but it can't keep up with the growth of AI, which is almost exponential. Both Microsoft and Google announced in recent months that they're not meeting their climate targets because of AI.
While renewables can't keep up with all of AI's energy demand, they are still part of Big Tech's strategy to balance innovation with sustainability.
In May of 2024, Microsoft signed the largest single corporate power purchasing agreement ever with Brookfield Asset Management and Brookfield Renewable, to deliver over 10.5 gigawatts of new renewable power capacity globally via wind, solar, and impactful carbon-free energy generation technologies.
Amazon touts its title as the world's largest corporate purchaser of renewable energy for the fourth year in a row, after a series of investments in wind and solar attributed to a portfolio big enough to power 7.2 million US homes each year.
The issue with renewables is that they're only available at certain times of the day.
You also have to go into energy storage, because you may not be using that energy at that time of day.
So the industry is trying to figure out: OK, where can we get renewable or net-zero energy in an efficient and cheap enough way?
Nuclear is one of those clean energy sources emerging as a viable option.
Microsoft signed a deal with Constellation to power 35% of one of its data centers with nuclear power, while Amazon bought a $650 million nuclear-powered data center from Talen Energy.
You have, for example, Amazon, which bought a data center in Pennsylvania right next to a nuclear power plant, so it can just plug into that nuclear power to run the data center. That's massive; that hasn't been done before.
And increasingly, you are going to see data centers wanting to co-locate with nuclear power plants, along with clean energy sources.
Big Tech is also investing in more efficient hardware.
The computer hardware that's being used for training AI models is actually kind of a recycling of hardware that was made for computer games. They're called GPUs, graphics processing units.
But now there are chips being made that are specifically catered toward training AI models.
Google created the TPU, or tensor processing unit, a chip designed specifically for AI and no other applications.
Chip giant Nvidia says its latest GPUs can reduce AI model energy use and costs by up to 25 times compared to previous versions.
Luccioni said transparency is key, so companies can quantify and factor in the AI energy footprint while managing innovation. We need more regulation, especially around transparency.
We're testing hundreds of models across different tasks, and the goal is to develop simple star ratings that can help developers and users choose models that are more efficient depending on the task they want to do, by really benchmarking models and evaluating how much energy they consume.