Many people have seen enough of artificial intelligence by now to appreciate its potential benefits to humanity, as well as its risks. Fewer may be aware of what a guzzler of natural resources it is. As the Financial Times reported this week, Big Tech companies are using ever more water to cool their energy-intensive data centres. The server farms needed to run generative AI have an even more intense thirst for power and water. In Davos last month, Sam Altman, the OpenAI CEO, warned that future AI would consume so much electricity that it would require an energy breakthrough, such as nuclear fusion, to power it.
There is an irony here, given tech companies’ image as shiny, clean replacements for old smokestack industries — and the pledges of Microsoft and others to be good climate citizens. The International Energy Agency says data centres, cryptocurrencies and AI accounted for almost 2 per cent of global power demand in 2022 — and this could double by 2026 to nearly match the electricity consumption of Japan. Ireland, a favoured location for server farms, is now limiting new data centre connections to the power grid; others are exploring whether to do the same.
AI has a stronger case than crypto — already well-known as a prodigious energy user — that its potential for social good can justify the power consumed. But in return for access to the grids, generation and scarce water resources they need, tech companies should accelerate the use of AI to assist the green transition. The technology has been touted as allowing more sophisticated weather forecasting — helping to balance power demand — and emissions tracking. It might help to find clever ways of cutting energy use in farming, manufacturing, supply chains and office buildings.