2.4 Environmental Impact of AI

The rapid growth of AI technologies (and of cloud-based technologies in general) has a hidden environmental impact that we should be aware of when considering whether to adopt these tools. Due to a lack of transparency from AI companies, it is difficult to provide exact impact data. However, training, hosting, and running Generative AI models all consume a huge amount of energy and other resources, and these requirements may continue to grow as AI continues to develop.
Hardware: The physical hardware and infrastructure behind Generative AI require extensive mining and extraction of minerals, which can have a serious environmental impact on the communities involved (Hosseini et al., 2025).
Training: Training Generative AI models requires a huge amount of energy. For example, it has been estimated that creating GPT-3 produced carbon dioxide emissions equivalent to those of 123 gasoline-powered vehicles driven for a year (Saenko, 2023).
Usage: Using ChatGPT also has a substantial water footprint. It is estimated that a ChatGPT dialogue of 20-50 prompts uses approximately 500 ml of water (McLean, 2023); a rough per-unit breakdown of figures like these is sketched after this list.
Data centres: Large language models are constantly running, meaning that the data centres hosting them are always in full operation. These data centres currently account for 3% of global energy consumption (Cohen, 2024).
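To give a sense of scale, the back-of-envelope sketch below turns the figures cited above into per-unit estimates. The per-vehicle emissions value (roughly 4.6 tonnes of CO2 per year, a commonly cited U.S. EPA average) is an assumption added here for illustration and does not come from the sources above.

```python
# Back-of-envelope estimates based on the figures cited above.
# The per-vehicle emissions value is an assumption (a commonly cited
# U.S. EPA average), included only to illustrate the scale involved.

CO2_PER_VEHICLE_TONNES_PER_YEAR = 4.6   # assumed average passenger vehicle
VEHICLE_EQUIVALENTS = 123               # Saenko (2023): GPT-3 training estimate

training_co2_tonnes = CO2_PER_VEHICLE_TONNES_PER_YEAR * VEHICLE_EQUIVALENTS
print(f"Estimated GPT-3 training emissions: ~{training_co2_tonnes:.0f} tonnes CO2")

# McLean (2023): roughly 500 ml of water per dialogue of 20-50 prompts.
WATER_ML_PER_DIALOGUE = 500
PROMPTS_PER_DIALOGUE = (20, 50)

water_per_prompt = [WATER_ML_PER_DIALOGUE / n for n in PROMPTS_PER_DIALOGUE]
print(f"Estimated water use per prompt: ~{water_per_prompt[1]:.0f}-{water_per_prompt[0]:.0f} ml")
```

Even under these rough assumptions, a single training run lands in the hundreds of tonnes of CO2, and each individual prompt carries a small but non-zero water cost.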
It’s important to be aware of this environmental impact and to make sure we engage in responsible and considerate use of AI technologies.
See AI’s impact on energy and water usage for a review of recent research on the environmental impact of Generative AI technologies.
Generative AI: A subset of Deep Learning that can use learned rules or patterns to generate new content.
Prompt: A prompt is the text provided to the system, giving instructions on the desired output or the task being requested.
Examples of prompts:
For text-to-text:
- Write a detailed case study demonstrating environmental racism in a Canadian context for a class of first-year university students at a Canadian university.
- Summarize the key themes of Orwell’s 1984 in a bulleted list and simple English.
For text-to-image:
- Generate a photorealistic depiction of Six Grandfathers Mountain before it was carved into Mount Rushmore.
Large Language Models (LLMs): Computational models that are trained on huge datasets of text to recognize common patterns and relationships in natural language. They can be used to generate text that mimics human language.
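To make the prompt and LLM terms above concrete, here is a minimal sketch of sending a text-to-text prompt to a hosted LLM. It assumes the OpenAI Python client with an API key set in the environment; the model name is illustrative only, and other providers expose similar interfaces.

```python
# A minimal sketch of sending a text-to-text prompt to a hosted LLM.
# Assumes the `openai` Python package is installed and OPENAI_API_KEY is set;
# the model name below is illustrative only.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

prompt = (
    "Summarize the key themes of Orwell's 1984 "
    "in a bulleted list and simple English."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": prompt}],
)

# The model's generated text is returned in the first choice.
print(response.choices[0].message.content)
```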