Can the climate survive the insatiable energy demands of the AI arms race?


The rise of artificial intelligence has driven share prices of big tech companies to new highs, but at the expense of the sector’s climate ambitions.

Google admitted on Tuesday that the technology is threatening its environmental goals after revealing that data centers, a key piece of AI infrastructure, had helped drive its greenhouse gas emissions up 48% since 2019. It said “significant uncertainty” around meeting its 2030 net-zero emissions goal — reducing the total amount of CO2 emissions it is responsible for to zero — included “uncertainty around the future environmental impact of AI, which is complex and difficult to predict.”

So can technology reduce the environmental cost of AI, or will the industry forge ahead anyway because the prize for supremacy is so great?


Why AI poses a threat to tech companies’ green goals

Data centers are a critical component for training and operating AI models like Google’s Gemini or OpenAI’s GPT-4. They contain the sophisticated computing equipment, or servers, that process the massive amounts of data that underpin AI systems. The servers require vast amounts of electricity to run, which generates CO2 depending on the energy source, on top of the “embedded” CO2 created by manufacturing and transporting the equipment.

According to the International Energy Agency, total electricity consumption by data centres could double from 2022 levels to 1,000 TWh (terawatt hours) by 2026 – equivalent to the energy demand of Japan – while research firm SemiAnalysis estimates that AI will result in data centres using 4.5% of global power generation by 2030. Water use is also significant, with one study estimating that AI could account for up to 6.6 billion cubic metres of water use by 2027 – almost two-thirds of England’s annual consumption.
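The figures above imply two baselines the article does not state outright: the 2022 level of data-centre demand and England’s annual water consumption. A minimal back-of-envelope sketch in Python, inferring those unstated baselines from the article’s own ratios (treat the derived numbers as assumptions, not reported facts):

```python
# Back-of-envelope check of the projections quoted above.
# The 2022 baseline and England's annual water use are not stated
# in the article; they are inferred here from its own ratios.

DC_2026_TWH = 1_000                    # IEA: data-centre demand by 2026
implied_2022_twh = DC_2026_TWH / 2     # "double from 2022 levels"

AI_WATER_2027_BN_M3 = 6.6              # projected AI water use by 2027
# "almost two-thirds of England's annual consumption"
implied_england_bn_m3 = AI_WATER_2027_BN_M3 / (2 / 3)

print(f"Implied 2022 data-centre demand: ~{implied_2022_twh:.0f} TWh")
print(f"Implied England annual water use: ~{implied_england_bn_m3:.1f} bn m^3")
```

The sketch simply inverts the article’s ratios: a doubling to 1,000 TWh implies roughly 500 TWh in 2022, and 6.6 billion cubic metres being “almost two-thirds” of England’s consumption implies an annual figure of around 10 billion cubic metres.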


What do experts say about environmental impact?

A recent UK government-backed report on AI safety claims that the carbon intensity of the energy source used by tech companies is “a key variable” in calculating the environmental cost of the technology. However, it adds that a “significant portion” of AI model training still relies on energy powered by fossil fuels.

Tech companies are, meanwhile, snapping up renewable energy contracts in an attempt to meet their environmental goals. Amazon, for example, is the world’s largest corporate buyer of renewable energy. However, some experts argue that this pushes other energy users towards fossil fuels because there is not enough clean energy to go around.

“Not only is energy consumption growing, but Google is also struggling to meet this increased demand for sustainable energy sources,” says Alex de Vries, founder of Digiconomist, a website that monitors the environmental impact of new technologies.


Is there enough renewable energy for everyone?

Governments around the world plan to triple the world’s renewable energy resources by the end of the decade in order to cut fossil fuel consumption in line with climate goals. But the ambitious pledge, agreed at last year’s COP28 climate talks, is already in doubt, with experts fearing a sharp rise in power demand from AI data centres could push it further out of reach.

The IEA, the global energy watchdog, has warned that even though global renewable energy capacity grew at its fastest pace in 20 years in 2023, the world is on track only to double its renewable energy by 2030 under current government plans, rather than triple it.

The answer to AI’s energy appetite may be for tech companies to invest more in building new renewable energy projects to meet their growing energy demand.


How soon will we be able to build new renewable energy projects?

Onshore renewable energy projects such as wind and solar farms are relatively quick to build – they can take less than six months to complete. However, slow planning processes in many developed countries, coupled with a global backlog of projects waiting to connect to the grid, can drag out the process for years. Offshore wind farms and hydropower projects face the same hurdles, on top of construction times of between two and five years.

This has raised questions about whether renewable energy can keep pace with the expansion of AI. Big tech companies have already tapped a third of US nuclear power plants to supply low-carbon electricity to their data centers, according to the Wall Street Journal. But without investment in new energy sources, these deals would divert low-carbon electricity from other users, leading to more fossil fuel consumption to meet overall demand.


Will AI’s demand for electricity continue to grow forever?

Normal rules of supply and demand would suggest that as AI uses more electricity, the cost of energy rises and the industry is forced to economize. But the unique nature of the industry means that the world’s largest companies may instead decide to bear the spikes in the cost of electricity, burning through billions of dollars as a result.

The largest and most expensive data centers in the AI sector are those used to train cutting-edge AI systems such as GPT-4o and Claude 3.5, which are more powerful and capable than anything else available. The leader in the field has changed over the years, but OpenAI is usually near the top, jockeying for position with Anthropic, the maker of Claude, and Google’s Gemini.

Today, “frontier” competition is thought of as winner-takes-all, with very few obstacles preventing customers from switching to the latest leader. That means that if one company spends $100m on a training run for a new AI system, its competitors have to decide whether to spend even more themselves or drop out of the race altogether.

Worse, the race for so-called “AGI” — AI systems capable of doing everything a person can do — means it might be worth spending hundreds of billions of dollars on a single training run, if doing so would let your company monopolize a technology that could, as OpenAI puts it, “elevate humanity”.


Won’t AI companies learn to use less electricity?

Every month, new advances in AI technology are being made that allow companies to do more with less. In March 2022, for example, a DeepMind project called Chinchilla showed researchers how to train cutting-edge AI models using radically less computing power, by modifying the relationship between the amount of training data and the size of the resulting model.

But that didn’t result in the same AI systems using less electricity; instead, it resulted in the same amount of electricity being used to create even better AI systems. In economics, that phenomenon is known as the “Jevons paradox,” named after the economist who observed that James Watt’s improvement of the steam engine, which allowed much less coal to be used, instead led to a huge increase in the amount of fossil fuel burned in England. As the price of steam power plummeted following Watt’s invention, new uses were discovered that wouldn’t have been worthwhile when the power was expensive.
