Simon Hunt

A crackdown on AI energy consumption would be a mistake


AI has an energy problem: it consumes an awful lot of it. Firms like ChatGPT creator OpenAI demand eye-watering levels of energy to develop their models. Training GPT-3 used as much electricity as 120 American homes consume in a year, according to one study, while the training of GPT-4 used many multiples more.

The International Energy Agency estimates that AI will cause the number of data centres – the warehouses of servers on which models are trained and run – to double globally at some point in the next ten years. Inevitably, this will also lead to a doubling in the amount of electricity needed to run them. There have been warnings that coal-fired power stations may have to stay open for longer to keep pace with demand, and there are concerns over the vast quantities of water used to keep the systems cool.


None of this sounds encouraging. The scale of the emissions produced by this nascent industry is gradually creeping up the political agenda. ‘Without immediate efforts to integrate climate and environmental justice into AI policy and incorporate input from frontline communities, AI will only exacerbate environmental injustice,’ Friends of the Earth warned in a report published in March, as it called for more transparent reporting on emissions.

When OpenAI launched the latest version of ChatGPT earlier this week, the BBC branded its environmental impact ‘the elephant in the room’. The company was criticised for the fact that ‘there was no mention of sustainability’ at the launch event.

In light of this growing pressure, AI businesses could soon face a crackdown on their energy consumption. Or at least, a barrage of fresh regulation on how they use energy, report on it and seek to mitigate the damage.

That would be bad news – for both the businesses themselves and the wider economy. Cracking down on AI companies in this way would be hugely misguided.

Calculations of the raw energy consumed by the computers behind big AI businesses often overlook the combined efficiency savings they make through the huge range of tasks they can replace. In his book The Technological Singularity, Imperial professor Murray Shanahan cites the example of a motorbike manufacturer using AI to design its next model. Historically this would have involved the firm commissioning a big team to research, build and test a whole range of prototypes before presenting a shortlist to management – a process that would take months. Soon it could take no more than a few minutes.

The same is true for a host of industries where complex, time- and energy-consuming tasks can be completed in a flash. Add those all up and, by comparison, AI’s drain on resources is small.

Right now, though, that is not true for everything made by AI. Large language models (LLMs) like ChatGPT can be wildly inefficient at basic tasks such as producing a string of text. One study found that generating a single image from an AI model can use as much energy as it takes to fully charge a smartphone. But these models are new, are constantly being refined and are many iterations away from reaching a more efficient end product.

Meanwhile, although much of the focus is directed at big, fuel-guzzling tech firms like Google and Anthropic, there are plenty of smaller AI businesses set up with the express purpose of cutting emissions. Here’s one example. An initiative by students at UCL called Carbon Re uses AI to pinpoint the optimal manufacturing process for achieving the lowest possible carbon dioxide output and fuel use in energy-intensive industries like cement and steel manufacturing. The new technology could cut the CO2 emissions of businesses in these industries by up to 20 per cent.

The global cement industry alone is thought to be responsible for as much as 8 per cent of the world’s emissions. A 20 per cent saving in emissions is almost equivalent to the combined CO2 emissions of all the data centres in the world.

Other startups are also making strides to cut waste in areas like air travel, supply chains and building management. Together, these firms can, in theory, more than offset the entire emissions of the AI industry.

This technology remains in its infancy. AI was around long before ChatGPT emerged on the scene, but it’s only in the last few years that artificial intelligence businesses have seen major, worldwide investment, including from governments, private equity and venture capital.

In 2014, 65 firms with ‘AI’ in their name were registered in the UK. In the first four months of this year alone, more than 700 were registered. Once the sector reaches maturity, the next phase will be one of consolidation and fierce competition – including competing on costs by streamlining models and cutting energy use. The technology will get better.

Now, therefore, is not the time to be barking fluffy terms like ‘sustainability’ and ‘Net Zero’ at AI businesses and demanding answers. There is no harm in requiring greater transparency, but the industry is far more likely to be the solution to many of these challenges than the problem. The greater danger is that its advances are stymied by short-term thinking over hitting emissions targets and writing ESG reports.
