Generative AI

Advice in a rapidly changing world

The AI Boom in HE

There has been a rapid adoption of generative AI into many aspects of academic life. In UK universities, many students, researchers, teachers, and administrators are adopting these tools to help summarise lecture notes, analyse complex datasets or act as virtual tutors. In some cases they are using tools provided by the institution, in others they may be using free tools, or ones they subscribe to personally. The pace of this adoption has been staggering; services like ChatGPT reached 100 million users in just two months, a milestone that took Facebook nearly five years.

While the attraction of these technologies to an individual is clear, their widespread use comes with an environmental cost. The purpose of this section of the blog is to move beyond the immediate utility of generative AI and explore its environmental footprint.

Environmental concerns stem from several areas:

  • The impact of creating, running and ultimately decommissioning the huge data centres.
  • The costs involved in the lengthy training of models.
  • The cost of running queries against models.

These factors are hard to quantify because of the lack of available data. One site that attempts to do so is the AI Energy Calculator.

Putting these figures into context

Deconstructing AI’s Environmental Footprint: A Life Cycle View

The environmental cost of a generative AI query is not restricted to the electricity consumed at the point of use. There is a complex life cycle of impacts, with significant effects occurring before a model runs for the first time and extending long after the hardware is retired. This life cycle reveals two distinct peaks: the manufacturing phase is responsible for the majority of direct human health toxicity, while the operational phase is the overwhelming driver of climate change impacts.

The Four Stages

Production

Extraction of raw materials, energy used during manufacture and component assembly.

Transport

Getting the servers, GPUs and networking hardware to the correct locations.

Use

Massive energy consumption to power the server farms, plus the water used to cool them.

Disposal

Millions of tonnes of toxic waste generated from hardware with a short lifecycle.

Embodied Impact: The Cost of Manufacturing

The environmental damage begins with the production of the hardware that powers generative AI, particularly the high-performance Graphics Processing Units (GPUs) produced by companies such as Nvidia.

The intense computational demands of AI models have shortened the average lifespan of these GPUs from 5 to 7 years to just 2 to 4. This accelerated obsolescence is a major contributor to the global electronic waste crisis: some 54 million metric tonnes of electronics were discarded in 2019 alone (Forti et al., 2020).

Manufacturing the hardware requires the mining of minerals like cobalt, lithium, and various rare earth elements. This extraction is often associated with severe environmental degradation and social injustices, including the exposure of workers and communities in developing nations to toxic materials such as lead and mercury. A life cycle assessment of the Nvidia A100 GPU found that the hardware manufacturing stage accounts for 94.5% of the total human toxicity (cancer) impacts. The health effects are concentrated in East and Southeast Asia, where most semiconductor fabrication plants are located.

Operational Impact: The Thirst for Power and Water

Energy Consumption

Day to day use of generative AI to respond to queries requires large energy-hungry data centres.

Generative AI models require significantly more computational power than traditional search tools. A 2025 study shows that a single short query to GPT-4o consumes approximately 0.42 Wh of electricity, about 40% more than a standard Google search (0.30 Wh). At the global scale, the authors estimate that in 2025 the carbon dioxide emissions from generative AI use were equivalent to those of the city of New York.
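As a back-of-the-envelope illustration, the per-query figures above scale up quickly. The 0.42 Wh and 0.30 Wh values come from the study quoted above; the query volume is a hypothetical example, not a measured figure:

```python
# Per-query energy figures from the 2025 study cited above.
GPT4O_WH_PER_QUERY = 0.42   # short GPT-4o query
SEARCH_WH_PER_QUERY = 0.30  # standard Google search

def extra_energy_kwh(n_queries: int) -> float:
    """Extra energy (kWh) used by choosing generative AI over search."""
    return n_queries * (GPT4O_WH_PER_QUERY - SEARCH_WH_PER_QUERY) / 1000

# Hypothetical volume: one million queries per day
print(f"{extra_energy_kwh(1_000_000):.0f} kWh extra per day")  # prints "120 kWh extra per day"
```

At a million queries a day, the seemingly small 0.12 Wh difference per query amounts to roughly 120 kWh of additional electricity daily.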

Water Consumption

Data centres consume enormous quantities of fresh water, often millions of litres per day. This is primarily to cool the densely packed servers and requires very clean water to avoid contamination or blockages. This demand places a significant strain on local supplies, particularly in regions already prone to drought.

Carbon Emissions

The high energy consumption of data centres translates directly into a substantial carbon footprint. While the exact emissions vary depending on the energy mix of the local electrical grid, the operational phase is dominant in this category. The same Nvidia A100 GPU study revealed that while manufacturing accounts for most of the human toxicity impact, the operational phase is responsible for almost 97% of the climate change impact.

This stark difference arises because manufacturing’s toxicity is rooted in the mining of heavy metals and the chemical-intensive processes of fabricating semiconductors, whereas the operational climate impact is driven by the continuous electricity demand of data centres, often powered by fossil-fuel-heavy grids.

Towards more sustainable use of generative AI

We can all make conscious choices to reduce our collective digital footprint. The following steps provide a practical framework for using AI more sustainably.

Step 1: Be a Mindful User

Before turning to generative AI, ask yourself whether it is the most appropriate and efficient tool for the task. For instance, before generating a complex image for a presentation, consider if a simple, low-energy vector graphic or a stock photo would suffice.

If you are writing code, do you need live commentary and suggestions for every change from a generative AI tool, or could you do most of the thinking yourself and just switch to an AI tool when needed?

When summarising a research paper, is a full generative summary needed, or could a less intensive keyword extraction tool achieve the same goal?

The environmental cost of AI varies dramatically depending on the complexity of the operation. An analysis by Luccioni et al. (2024) demonstrates this clearly.

Relative Energy Cost of Common AI Tasks (per 1,000 queries)

  Task                   Energy (kWh)
  Text Classification    0.002
  Summarisation          0.049
  Image Generation       2.907

Your choice of task really matters. Generating an image is orders of magnitude more energy-intensive than classifying a piece of text. Where possible, opt for smaller, specialised language models (SLMs) over large, general-purpose ones, as the specialised models are optimised for specific tasks and are therefore more efficient.
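A quick sketch using the figures from the table above makes that gap concrete (the values are taken directly from Luccioni et al., 2024):

```python
# Energy per 1,000 queries (kWh), from Luccioni et al. (2024).
ENERGY_KWH_PER_1000_QUERIES = {
    "text_classification": 0.002,
    "summarisation": 0.049,
    "image_generation": 2.907,
}

ratio = (ENERGY_KWH_PER_1000_QUERIES["image_generation"]
         / ENERGY_KWH_PER_1000_QUERIES["text_classification"])
print(f"Image generation costs roughly {ratio:,.0f}x text classification")
```

The ratio works out to well over a thousand: generating one image costs about as much energy as classifying more than a thousand pieces of text.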

Step 2: Understand Your Provider’s Footprint

The location and operational efficiency of the data centre running the AI model can have a major impact. Cloud providers like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud operate data centres across Europe, including in the UK (London), Ireland, and mainland Europe (such as Frankfurt, Paris and Stockholm).

Providers are increasingly taking steps to improve their sustainability. In 2023, for instance, AWS began converting the diesel-powered backup generators in its European data centres to run on hydro-treated vegetable oil (HVO). This can reduce lifecycle emissions by up to 90%.

A key metric for data centre efficiency is Power Usage Effectiveness (PUE), which measures the ratio of total facility energy to IT equipment energy. While a 2024 survey reported an average PUE of 1.48 for European data centres, leading hyperscalers achieve much better results. Google, for example, reports a PUE of just 1.08 for its data centre in Saint-Ghislain, Belgium. When choosing a service, look for providers who are transparent about their PUE, water usage, and the percentage of renewable energy powering their operations.
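PUE itself is a simple ratio, which makes the figures quoted above easy to interpret. A minimal sketch (the facility numbers below are hypothetical, chosen to reproduce a PUE of 1.08):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by the
    energy delivered to IT equipment. A score of 1.0 would mean zero
    overhead (no energy spent on cooling, lighting, power conversion)."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility drawing 1,080 MWh in total, 1,000 MWh of it for IT
print(round(pue(1_080, 1_000), 2))  # prints 1.08
```

In other words, a PUE of 1.48 means that for every unit of energy doing useful computing, nearly half a unit more is spent on overheads; at 1.08 that overhead drops to 8%.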

Step 3: For Researchers—Optimise Your Workflow

For those in our community building or training AI models, several strategies can significantly reduce environmental impact:

  • Choose Efficient Architectures: Research shows that model architecture plays a key role in energy consumption. For instance, sequence-to-sequence models (like Flan-T5) can be more energy-efficient than decoder-only models (like BLOOM) of a comparable size.
  • Adopt Optimisation Techniques: Methods like quantization (reducing the numerical precision of the model’s calculations), pruning (removing redundant parts of the model), and using Mixture-of-Experts (MoE) architectures (activating only specialised parts of the model for a given task) can substantially lower the computational cost.
  • Advocate for Transparency: Follow the example set by the developers of the BLOOM model, who published a detailed “model card” disclosing its environmental impact data. Encouraging this practice across the research community fosters accountability and allows for more informed choices.
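To make one of these techniques concrete, here is a toy sketch of quantization: mapping float32 weights onto int8 cuts memory (and typically compute) by a factor of four, at the cost of a small precision loss. This illustrates the idea only and is not any particular library's implementation:

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights onto the int8 range [-127, 127]."""
    scale = np.abs(weights).max() / 127.0   # largest weight maps to 127
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 form."""
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.2, 0.03, 0.9], dtype=np.float32)
q, scale = quantize_int8(w)
print(q.nbytes, "bytes vs", w.nbytes)  # int8 storage is 4x smaller
approx = dequantize(q, scale)          # close to, but not exactly, w
```

Pruning and MoE work differently but share the same goal: fewer active parameters per query means less energy per query.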

Step 4: Champion Institutional Change

Individual actions are powerful, but systemic change requires institutional commitment. Your voice can catalyse broader change. Consider advocating for the following initiatives within your department or through university governance channels:

  • Promote a “circular computing” culture that focuses on extending the life of hardware. By prioritising the retrofitting and reuse of equipment like GPUs, the university can directly combat the growing problem of e-waste.
  • Integrate sustainable AI practices into the university’s official digital literacy programmes and IT guidance for all students and staff.
  • Support adherence to emerging standards and regulations, such as those outlined in the EU AI Act, which will increasingly mandate responsible and transparent AI development and deployment. It is expected that similar legislation will soon follow in the UK.

Conclusion: Towards Responsible Innovation

The rapid expansion of generative AI offers great potential, but it comes with environmental responsibilities. The impacts span the full life cycle of the technology, from the mining of minerals for hardware to the immense energy and water consumed by data centres.

References

Forti, V., Baldé, C.P., Kuehr, R. and Bel, G. (2020) The Global E-waste Monitor 2020: Quantities, flows and the circular economy potential. Bonn/Geneva/Rotterdam: United Nations University (UNU)/United Nations Institute for Training and Research (UNITAR) co-hosted SCYCLE Programme, International Telecommunication Union (ITU) & International Solid Waste Association (ISWA). ISBN (Digital): 978-92-808-9114-0. Available at: https://www.itu.int/en/itu-d/environment/documents/toolbox/gem_2020_def.pdf

Luccioni, S., Jernite, Y. and Strubell, E. (2024) ‘Power Hungry Processing: Watts Driving the Cost of AI Deployment?’, The 2024 ACM Conference on Fairness, Accountability, and Transparency. FAccT ’24: The 2024 ACM Conference on Fairness, Accountability, and Transparency, Rio de Janeiro Brazil: ACM, pp. 85–99. Available at: https://doi.org/10.1145/3630106.3658542.