
AI and climate: the good, the bad and the ugly


By Christopher Caldwell



What is AI for?

Will it cure cancer, find God, or usher in the apocalypse? 

Perhaps it can just get us that Ratatouille recipe without having to scroll through the chef’s entire life story.

In the culture today, AI seems to be the answer to everything. But all that hype produces more heat than light for the discerning leader. Assessing what AI means for something as complex as climate change requires careful separation of the costs and benefits – and keeping an eye on the philosophical impacts too.

So here is my take on what AI means for climate: the good, the bad and the ugly. 

(And no, I didn’t get ChatGPT to write this for me.)

The good

By accident or design, technology has become an indispensable part of our struggle against climate change. AI can at least help us supercharge it. 

To start with, AI helps businesses drowning in data. As the decarbonisation revolution spreads across the world, it sucks in ever more unstructured information; machine learning can turn this noise into actionable insight. For example, PrevisIA combs satellite images to predict which areas of the Amazon are at highest risk of deforestation, with remarkable accuracy.

Optimising systems is another use case. Energy efficiency sadly remains the ugly duckling of mitigation efforts, but it is an ideal task for AI. Decentralised grid management is a great example, as we move from a few dozen power plants to thousands of smaller assets (and millions of EV batteries for storage). Octopus Energy’s Kraken software is already blazing a trail here.

Finally, we can optimise next-gen climatetech before it even exists. Innovation is an iterative process, and AI can collapse design cycles remarkably effectively. One Stanford-Toyota team used machine learning to cut EV battery testing time (a major design bottleneck) by 98%. 

The bad

However magical AI appears on the surface, the work it does under the hood is carbon-intensive. Solid data is hard to find, but some estimate that training a model like ChatGPT emits 500 tonnes of CO2e, and its daily footprint was estimated as equivalent to that of 175,000 people in January 2023 (and visits have grown 3x since then).

This shouldn’t come as a great surprise. The internet already accounts for 2-4% of global emissions (ahead of aviation) and is growing fast. Data is doubling every two years, even as we run out of headroom for efficiency improvements at the transistor level, and an estimated 75 billion devices will have joined the internet-of-things by 2025. All that new data will also make training AI ever more energy-intensive – total training compute increased by a factor of 300,000 (!) between 2012 and 2018.

The ugly

More pernicious still are the philosophical risks.

First, beware of the false sense of omniscience. Today’s LLMs remain the stochastic parrots of computing, task-bound and trained on derivative data; and forecasting and optimisation aren’t exactly new functions (thank you, linear regression!). AI is impressive, and it may make us feel that we can suddenly see everything, but we mustn’t get lost in the hype.

Inconveniently for LLMs, climate change means that tomorrow isn’t going to look like yesterday – and the transformation will be chaotic. We should be wary of relying upon backward-looking AI (trained as it is on historic data) to forecast a deeply uncertain future. The insurance industry in places like Florida stands as a cautionary tale of the effects of climate change on historic modelling.

Secondly, we risk reinforcing the cult of the technofix. ‘AI will save us’ is another way to abdicate responsibility and lean on technology as an excuse not to change the behaviours and power structures that cause climate change. We don’t need a geoengineering solution for our executive function.

Asking the right question

Ultimately, AI is neither the saviour nor destroyer of humanity because it isn’t really an external force; it is a mirror reflecting society’s existing systems and choices. After all, AI deployment will simply follow the money, and it only answers the questions we ask.

As long as we continue to ignore climate change, AI will help us do that. ‘Oil in the Cloud,’ an investigative report by Greenpeace, found Big Tech selling advanced AI to oil and gas companies. They were using it to help them extract the dirtiest fossil fuels in some of the world’s most vulnerable places, from the Permian to the Arctic. Oil firms spent some $2.5bn on AI in 2020 (estimated to increase to $15bn by 2030) on the basis that it could boost their production by as much as 5%. But the problem here isn’t the AI; it’s dangerous corporations and the political system that continues to support them. 

On the other hand, my own company is trialling AI to help choose the best locations for solar panel installations, accounting for weather, topography, and so on. We’re excited by its potential to boost operational efficiency for renewables – as one tool amongst many. And if we get serious about decarbonisation, AI can help us with that too.

Good or bad, AI is a mirror of society. It can only reflect rather than create our climate reality; and will prove as ugly – or as beautiful – as we are.

illuminem Voices is a democratic space presenting the thoughts and opinions of leading Sustainability & Energy writers; their opinions do not necessarily represent those of illuminem.


About the author

Christopher Caldwell is the CEO of United Renewables, where he employs his past experiences as a corporate lawyer, investment banker, and team leader to lead all aspects of the business. Chris holds a degree in business from Trinity College Dublin, an MBA from London Business School, and is currently reading part-time at the Yale Center for Business & the Environment. 
