It has been only a few years since the release of ChatGPT, but the rapid integration of artificial intelligence into various sectors has triggered a surge in datacentre construction, with escalating environmental costs.
Globally, datacentre power demand is growing four times faster than all other sectors combined, according to the International Energy Agency, and is projected to surpass Japan’s electricity consumption by 2030.
In Australia, the energy market operator expects datacentre energy demand to triple within five years, exceeding the electricity used by the nation’s fleet of electric vehicles by 2030. Authorities also foresee significant pressure on drinking water supplies.
As the QuitGPT movement—a boycott of AI over its use in surveillance and weapons—gains traction, should those concerned about AI’s environmental impact also consider opting out?
How Bad Is AI for the Environment?
Estimates vary, but most studies agree that generative AI models—which produce text, images, and video—consume orders of magnitude more energy than traditional computing methods.
Some estimates suggest AI uses five times more energy, while others indicate it could be significantly higher. Much depends on the specific model or type of query.
Professor Jeannie Paterson, co-director of the Centre for AI and Digital Ethics at the University of Melbourne, highlights the limited transparency from tech companies regarding the energy, water, and emissions impacts of AI and datacentres.
“But it’s clear that training models and running datacentres is an energy-intensive task,” she says.
“Consumer software that generates text, images, and videos is uniquely energy inefficient,” says Ketan Joshi, an Oslo-based climate analyst affiliated with the Australia Institute, due to the “vast datasets and computational strain of pattern-matching that happens underneath the hood.”
Asking an AI chatbot a question consumes significantly more energy than finding the answer via a simple web search or calculator. He compares it to driving to the shops in an SUV instead of riding a bike.
“You might still get the shopping done, and that single trip alone may not look all that bad in terms of cost or emissions, but what happens when that’s all of your trips, and when all of society starts doing this?”
A study published in the journal Patterns estimates AI’s global carbon footprint at 32.6 to 79.7 million tonnes of CO2 emissions in 2025, and its water use at 312.5 to 764.6 billion litres—comparable to the global consumption of bottled water.
In Australia, the expansion of datacentres for processing and storing AI data is forecast to slow the energy transition, increase emissions, and raise power costs for consumers.
“That’s a lot of energy demand for unclear or small societal benefit,” Joshi says. “Compare that to the global benefit of video-calling technology, which has reduced flights and enabled communication during the pandemic.”
AI Is Everywhere. Is It Possible to Opt Out?
AI tools are increasingly embedded in workplace and educational software, as well as chatbots used by banks and local governments. Generative AI is also being introduced in supermarket self-checkouts, facial recognition at hardware stores, and for transcribing doctors’ notes.
“We’re becoming immersed in this technology,” Paterson says. “It’s really hard to avoid.
“But we still have a chance to express our views about what and how we want AI to be used.”
There are small ways to limit use, the AI equivalent of saving energy by switching off lights or appliances. People can unsubscribe from AI platforms, exclude AI results from search (for example, by adding "-AI" to the end of a search query), or avoid using AI for unnecessary or energy-intensive tasks such as text-to-video prompts or AI-generated images for celebrations or work presentations.
“Meta, Google, and Microsoft have all integrated [generative AI] deeply into their systems,” Joshi says. “I see this as part of a tactic to embed these systems into society and instil dependency, similar to the growth of single-use plastics in the 1970s.”
Opting out can be a “meaningful act of resistance,” Joshi adds. “It’s partly about not creating that energy demand but mostly about being part of broad collective action against a corrosive, harmful industry.”
Consumer boycotts can be powerful, he says, but he is disheartened by QuitGPT's approach of funnelling users from one platform to another rather than away from AI entirely: the campaign has encouraged users to cancel ChatGPT subscriptions while promoting Anthropic's Claude. He describes this as a "cynical exploitation" of widespread opposition to AI.
What About the Impacts of Datacentres on Local Communities?
Datacentres—rapidly increasing in number and size—are the physical embodiment of the AI boom. There are growing calls for the industry to be held accountable for its environmental impacts.
A coalition of energy and environment groups, including the Clean Energy Council, Electrical Trades Union, Australian Conservation Foundation (ACF), and Climate Energy Finance, has proposed a set of “public interest principles for datacentres” that include investing in new renewable energy and using water responsibly.
“If you want to build a datacentre, you should have to build the renewables and water recycling to power it,” says ACF chief executive Adam Bandt. “Big tech corporations should be forced to do their fair share so they don’t drain our resources.”
Beyond energy, water, and emissions, datacentres can affect the communities and wildlife living nearby. The massive warehouse-like buildings operate around the clock, with continuous lighting and the constant hum of air conditioning.
Some communities have taken action, campaigning against large datacentres proposed in their areas.
Dr Bronwyn Cumbo, a transdisciplinary social researcher at the University of Technology Sydney, explains that these nondescript buildings are often clustered together, forming industrial hubs rather than standalone datacentres.
“Of course, it’s in the operators’ interest to communicate, engage with the community, incorporate local knowledge, and consider local concerns, because they want to be good neighbours. But the incentive to be a good neighbour really depends on the company,” she says.
Cumbo notes that discussions about AI’s relationship with the physical environment and its social, political, and economic implications are intensifying.
Raising awareness is crucial, she says, so communities can think critically and know what questions to ask.
“There is an inevitability to AI being part of our lives, but how it’s part of our lives is something we can definitely control.”