Photo by Google DeepMind on Unsplash
News about ChatGPT has been everywhere in recent months, from its potential use by students to write their essays to its role in spreading misinformation. Artificial intelligence (AI) models like ChatGPT also have the potential to help us process vast amounts of climate data and find new approaches to solving the climate crisis. Just this week, professors at the University of Zurich announced the launch of chatIPCC, a chatbot created to provide accurate and reliable information on climate change, drawing on the robust and comprehensive scientific literature assessed by the IPCC.
However it is used, AI requires vast amounts of processing power, particularly to train a new model, which means high levels of energy use and the emissions that come with it.
Estimating AI’s carbon footprint
AI is a relatively new kid on the block, and we still do not know much about its carbon impact as a sector. Individual companies have tried to estimate the impact of their own models, giving us a sense of just how power-hungry they can be.
Recent research by Hugging Face into the footprint of its BLOOM model (a rival to ChatGPT) estimated that training the model produced around 25 tonnes of emissions, with roughly a further 25 tonnes coming from manufacturing the computer hardware needed to build and train it and from the energy required to actually run BLOOM once trained. The combined total is roughly equivalent to 60 flights between London and New York, according to the MIT Technology Review.
Once fully trained, the impact drops quite significantly – to around 19 kg CO2e per day – as the level of processing is far less intense. However, models will need to be retrained and refreshed regularly to stay up to date. A model built in 2019 would not know about the Covid-19 pandemic, for example, and without access to the latest information would make erroneous assumptions.
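To get a feel for how the one-off training cost compares with that ongoing figure, a quick back-of-the-envelope calculation helps. The sketch below uses the approximate numbers quoted above for BLOOM purely as an illustration; it is not Hugging Face's own accounting methodology.

```python
# Back-of-the-envelope comparison of one-off training emissions versus the
# ongoing emissions once a model is deployed. Figures are the approximate
# values quoted above for BLOOM, used purely for illustration.

TRAINING_TONNES_CO2E = 25.0       # one-off cost of training the model
INFERENCE_KG_CO2E_PER_DAY = 19.0  # rough ongoing cost once deployed

days_to_match_training = (TRAINING_TONNES_CO2E * 1000) / INFERENCE_KG_CO2E_PER_DAY
print(f"Running the trained model for ~{days_to_match_training:.0f} days "
      f"(~{days_to_match_training / 365:.1f} years) emits roughly as much "
      f"as training it did.")
```

On these rough numbers, it takes well over three years of day-to-day use to match the emissions of the original training run, which is why retraining frequency matters so much.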
Emissions vary between AI models
AI models are not all created equal. BLOOM was trained on a French supercomputer powered mainly by nuclear energy. Consequently, its footprint may be as little as one-tenth of the footprint of rival models. Hugging Face estimates that training GPT-3, the model behind ChatGPT, produced around 500 tonnes of emissions, as it was trained on older, more energy-intensive hardware in a country with fewer low-carbon energy sources. Meanwhile, emissions from Meta's OPT are estimated to be in the region of 75 tonnes, based on the hardware it is known to have used. There is no comparable data on what these models consume once training is finished and they enter everyday use.
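A large part of that gap comes down to the carbon intensity of the grid powering the data centre. The sketch below shows how the same hypothetical training run translates into very different emissions on a nuclear-heavy grid versus a fossil-heavier one; the energy and intensity figures are rough assumptions for illustration, not measured values for any specific model.

```python
# Illustrative only: the same training run can have a very different footprint
# depending on the grid that powers it. The energy figure and grid intensities
# below are rough assumptions, not measurements for any particular model.

TRAINING_ENERGY_MWH = 500  # hypothetical energy used to train a large model

GRID_INTENSITY_KG_PER_KWH = {
    "nuclear-heavy grid (e.g. France)": 0.06,
    "fossil-heavier grid": 0.45,
}

for grid, intensity in GRID_INTENSITY_KG_PER_KWH.items():
    tonnes = TRAINING_ENERGY_MWH * 1000 * intensity / 1000  # kWh x kg/kWh -> t
    print(f"{grid}: ~{tonnes:.0f} t CO2e for the same {TRAINING_ENERGY_MWH} MWh run")
```

Identical compute, a roughly sevenfold difference in emissions – which is consistent with the order-of-magnitude gap reported between BLOOM and its rivals.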
As technology becomes ever more efficient and countries move towards cleaner energy, the impact of AI could come down. However, the amount of data we want to process is increasing at a phenomenal rate. GPT-3, a sister model to ChatGPT (also developed by OpenAI), has 175 billion parameters; its predecessor, GPT-2, had just 1.5 billion.
AI and climate modelling
Some of this processing could be hugely beneficial. AI can help us strengthen our climate modelling, allowing us to make better predictions about how to decarbonise different sectors and what the impact of going faster or slower would be. It can also help us to predict weather patterns and the chance of extreme weather events more accurately. Professor Felix Creutzig of the Mercator Research Institute on Global Commons and Climate Change in Berlin suggests it could also help us to model heat islands in our cities and identify where the addition of tree cover or changes to building design can help to bring temperatures down.
Tough choices ahead
We still do not have a full understanding of how much energy AI models consume and how many emissions they produce. However, it is already becoming clear that we need to start thinking about the trade-offs involved. It may be worth using enormous amounts of power to build a new climate model that gives us the data we need to reduce emissions; using similar amounts to continually refine the recommendations on shopping or streaming platforms may be harder to justify. In the latter case, we may feel it is far better to keep improving the models we already have.
There are also suggestions that we need to think about where and how these models are run: not just using the most efficient hardware and the lowest-carbon forms of energy, but perhaps also running them at times when demand for power is lowest.
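In practice, that kind of "carbon-aware" scheduling can be very simple: given a forecast of grid carbon intensity, start a large batch or training job in the cleanest window rather than immediately. The sketch below uses a made-up hourly forecast purely to illustrate the idea.

```python
# A minimal sketch of carbon-aware scheduling: given a (hypothetical) forecast
# of grid carbon intensity for the next 24 hours, start a long-running job in
# the cleanest window rather than right away.

# Hypothetical hourly forecast in g CO2e per kWh (index = hours from now).
forecast_g_per_kwh = [420, 410, 380, 300, 240, 180, 150, 160, 210, 290,
                      350, 400, 430, 440, 420, 390, 340, 280, 220, 190,
                      200, 260, 330, 400]

JOB_DURATION_HOURS = 4

def cleanest_start_hour(forecast, duration):
    """Return the start hour whose window has the lowest average intensity."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast) - duration + 1):
        avg = sum(forecast[start:start + duration]) / duration
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

start, avg = cleanest_start_hour(forecast_g_per_kwh, JOB_DURATION_HOURS)
print(f"Schedule the {JOB_DURATION_HOURS}h job {start} hours from now "
      f"(average intensity ~{avg:.0f} g CO2e/kWh).")
```

The same compute, shifted by a few hours, can ride on a much cleaner mix of generation – no new hardware required.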
AI does not have to have an ever-increasing impact; its footprint could come down and its positive applications blossom. We do, however, need to keep building our understanding of that footprint, so that major technology companies can be pushed to find ways to reduce it.