
Inside Out | Let’s talk about how to power AI, the new energy-guzzling industry

  • It’s not just how we can produce and afford the electricity, but also how we ensure we don’t aggravate global warming and squeeze domestic water supplies

People reflected in a hotel window at the Davos Promenade in Switzerland on January 15. Photo: AP

For all the excitement surrounding the artificial intelligence (AI) revolution, I can’t help but focus on its precarious foundations.

I would like to buy into the excitement over all the good things AI may bring – from enhanced productivity and efficiency to previously unimaginable medical treatments, safer traffic management, increased food production and effective new ways of reducing carbon dioxide emissions.
But I am whipsawed by a troubling list of vulnerabilities, from Nvidia supplying 95 per cent of the graphics processing units (GPUs) needed to power AI learning machines, to the near monopolies of ASML in advanced chipmaking machines and Taiwan Semiconductor Manufacturing Company (TSMC) in advanced chip production.
Set aside the tripwires in the deepening US-China geopolitical conflict, the challenges of agreeing on regulations for such powerful new technologies, and the quirky “hallucinations” produced by tools like ChatGPT. Set aside, too, the arguments over whether the new technologies will end up creating or destroying more jobs, or whether AI will widen inequalities between the rich and poor parts of the world.
The worry I am wrestling with has been much less discussed: AI’s voracious appetite for electricity. Not just how we can produce and afford it, but also how we ensure we don’t aggravate global warming and squeeze domestic water supplies as we build the facilities that power the AI revolution.
As OpenAI founder Sam Altman said in January: “We still don’t appreciate the energy needs of this technology.” All we know is that machine learning is extremely energy-intensive. According to The Verge, the training of a large language model like GPT-3 is estimated to require 1,300 megawatt-hours – enough to power 130 US homes for a year or stream 1.6 million hours of Netflix. In 2022, Google said machine learning accounted for 15 per cent of its energy bill over the previous three years.