
As artificial intelligence (AI) continues to reshape computational systems, economies and energy footprints, the question of who should regulate AI, and how, has become extremely complex. A political struggle is under way in Washington over governance of the sector, with large open questions about what limits should be placed on a rapidly evolving industry. Answering these questions is made harder by the high levels of secrecy and opacity within and around the sector.
One of the biggest concerns about AI is its enormous and continually growing energy footprint, and the greenhouse gas emissions that come with it. The forecast growth in the sector's energy needs is so significant that many world leaders now treat it as an immediate threat to energy security. Energy demand from data centres is estimated to double by 2030.
“In the past few years, AI has transitioned from an academic pursuit to an industry with trillions of dollars of market capitalisation and venture capital at risk,” reports the International Energy Agency. The scale of the energy required to power this growth means that “the energy sector is therefore at the heart of one of the most important technological revolutions today.”
However, nobody knows exactly to what extent that is true. Our best estimates of how much energy AI will use are vague at best, because we do not even know how much energy AI is consuming now. Researchers are striving to quantify and track AI's energy consumption, but "this effort is complicated by the fact that major players like OpenAI disclose little environmental information," according to a recent report from Wired.
As of May 2025, an astonishing 84 per cent of all large language model traffic was conducted on AI models operating with zero environmental disclosure. Sasha Luccioni, climate lead at the AI company Hugging Face, leads a research team attempting to analyse exactly how much energy AI models use, data that could inform better targeted and more appropriate policy actions.
The numbers around AI energy consumption currently being repeated in the media are generally derived from statements with no empirical backing. In reality, putting a number on the energy footprint of a single AI query is all but impossible. Queries vary widely in complexity, and with it in the computing power they demand. And some companies run sophisticated routing systems that hand simpler questions to simpler models (which use less energy), while others do not.
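The routing idea described above can be sketched in a few lines. Everything here is illustrative: the model tiers, the per-query energy figures and the word-count threshold are invented assumptions for the sake of the example, not measurements from any real provider.

```python
# Hypothetical sketch of complexity-based model routing: simpler queries go to
# a smaller, less energy-hungry model. All model names and energy figures are
# illustrative assumptions, not measured values.

# Assumed per-query energy cost in watt-hours for each illustrative model tier.
ENERGY_WH = {"small": 0.3, "large": 3.0}

def route(query: str, threshold: int = 20) -> str:
    """Send short queries to the small model, longer ones to the large model."""
    return "small" if len(query.split()) <= threshold else "large"

def estimated_energy_wh(queries: list[str]) -> float:
    """Total estimated energy if every query is routed by complexity."""
    return sum(ENERGY_WH[route(q)] for q in queries)

queries = [
    "What is the capital of France?",  # short -> small model
    "Explain " + "in great detail " * 10 + "how transformers work.",  # long -> large model
]
print(route(queries[0]))             # -> small
print(route(queries[1]))             # -> large
print(estimated_energy_wh(queries))  # -> 3.3
```

The point of the toy numbers is the spread: under these assumptions an operator with routing spends a tenth of the energy on simple queries that one without routing does, which is exactly why a single per-query figure is so hard to pin down.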
The cost of the increased energy demand driven by AI will fall on consumers in the form of higher energy bills. And this will do little to incentivise AI companies to adopt less energy-intensive models.
The following content has been published by Stockmark.IT. All information utilised in the creation of this communication has been gathered from publicly available sources that we consider reliable. Nevertheless, we cannot guarantee the accuracy or completeness of this communication.
This communication is intended solely for informational purposes and should not be construed as an offer, recommendation, solicitation, inducement, or invitation by or on behalf of the Company or any affiliates to engage in any investment activities. The opinions and views expressed by the authors are their own and do not necessarily reflect those of the Company, its affiliates, or any other third party.
The services and products mentioned in this communication may not be suitable for all recipients. By continuing to read this website and its content, you agree to the terms of this disclaimer.






