AI does help us achieve net-zero goals. But the ever-growing, ever-multiplying data centres that support it are energy guzzlers and far from environment-friendly.

According to the International Energy Agency's (IEA) 2024 report, the combined electricity consumption of the data centre, AI and cryptocurrency sectors was 460 terawatt-hours (a terawatt-hour is one trillion watt-hours) in 2022 and could touch 1,000 TWh by 2026. That demand is roughly equivalent to the annual electricity consumption of Japan.

According to estimates, the “cloud”, the vast network of remote servers worldwide that sustains big tech and AI, accounts for more global greenhouse gas emissions than commercial aviation. Reportedly, training a large language model such as GPT-3 consumed 700,000 litres of water to cool the machines at Microsoft's data facilities. On average, data centres worldwide add 50 per cent to their energy costs just to keep the machines cool through air conditioning.

An April 2024 analysis by the World Economic Forum (WEF) warned that the computational power required to sustain AI's rise could drive a surge in energy demand of up to 10,000 times in the coming years.

The analysis underlined that the development of AI must be balanced with sustainability measures. Power usage, it recommended, should be cut by 12 to 15 per cent during the training and inference phases of AI systems. It also advised opting for shared infrastructure and centralized computational facilities over private ones to reduce energy consumption.

Many experts are of the view that, given how opaque these operations are, regulators should start requiring energy-use disclosures from AI developers.