
Practicing sustainable artificial intelligence


By Niladri Choudhuri



Artificial Intelligence (AI) significantly impacts both the environment and the economy. Environmentally, AI systems consume substantial energy and water resources. For instance, training large AI models can emit over 50 metric tons of CO₂, equivalent to approximately 80 round-trip flights between London and New York. Additionally, data centers housing AI servers are major water consumers; in Virginia's "data center alley," water usage soared from 1.13 billion gallons in 2019 to 1.85 billion gallons in 2023. This extensive water consumption, primarily for cooling purposes, raises concerns about resource depletion, especially in water-stressed regions. 

Economically, AI's benefits are unevenly distributed, potentially leading to wealth concentration among a few dominant firms. The development and deployment of advanced AI systems require significant resources, often accessible only to large corporations. This concentration can stifle innovation and limit opportunities for smaller entities. Advocates argue that open-source AI models can democratize access, fostering innovation and preventing wealth consolidation. For example, Meta's Chief AI Scientist highlighted that open-source models are surpassing proprietary ones, emphasizing the importance of open research in AI development. 

Practicing Sustainable AI is crucial to mitigate these negative impacts. This involves developing energy-efficient models, utilizing renewable energy sources, and promoting open-source AI to ensure equitable economic benefits. By adopting Sustainable AI practices, we can harness AI's advantages while minimizing its environmental footprint and promoting economic inclusivity.

Practicing sustainable artificial intelligence (AI) involves integrating environmental, economic, and social considerations throughout the AI lifecycle. Here are some thoughts on implementing sustainability from the initial "Why?" to the deployment and use phases.

1. Defining the purpose ("Why?")

• Assess necessity: Before initiating an AI project, evaluate whether AI is the appropriate solution. Consider simpler, less resource-intensive alternatives that might achieve the same objectives
• Set sustainable objectives: Align the project's goals with sustainability targets, such as minimizing energy consumption, reducing carbon footprint, or promoting social good

2. Design phase

• Energy-efficient architectures: Opt for model architectures known for efficiency. For instance, Mixture-of-Experts (MoE) models like DeepSeek-V2 activate only a subset of their parameters during inference, reducing computational load and energy consumption (see the sketch after this list). Another example is K-Nearest Neighbours (KNN), which is recognized as one of the most energy-efficient algorithms, consuming only 0.01 Joules on average
• Modular design: Design systems that allow for component reuse and easy updates, reducing the need for complete retraining and thereby conserving resources
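
To make the efficiency idea behind Mixture-of-Experts concrete, here is a minimal sketch of top-k gating in PyTorch. The layer sizes, expert count, and `top_k` value are illustrative assumptions, not DeepSeek-V2's actual configuration; the point is simply that only the selected experts run for each input.

```python
import torch
import torch.nn as nn

class TinyMoE(nn.Module):
    """Minimal Mixture-of-Experts layer: only top_k experts run per input."""
    def __init__(self, dim=64, num_experts=8, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_experts))
        self.gate = nn.Linear(dim, num_experts)   # router: scores each expert
        self.top_k = top_k

    def forward(self, x):                          # x: (batch, dim)
        scores = self.gate(x)                      # (batch, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = weights.softmax(dim=-1)          # normalise over chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):                # run only the selected experts
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * expert(x[mask])
        return out

moe = TinyMoE()
y = moe(torch.randn(4, 64))   # each row used just 2 of the 8 experts
```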

3. Development phase

• Efficient coding practices: Implement code optimizations to reduce computational overhead. Efficient algorithms can significantly decrease energy consumption during both training and inference (see the sketch after this list)
• Collaborative development: Engage in open-source projects to share resources and avoid redundant efforts, promoting community-driven improvements and resource efficiency
• Energy-efficient algorithms: Develop and implement algorithms that require less computational power; researchers continue to explore more efficient alternatives to reduce AI's environmental impact
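
As a small illustration of efficient coding, the sketch below compares a plain Python loop with a vectorised NumPy computation of the same result; the vectorised version typically finishes far faster and therefore does the same work with less energy. The array size and timing approach are illustrative.

```python
import time
import numpy as np

data = np.random.rand(10_000_000)

# Naive element-by-element loop
start = time.perf_counter()
total_loop = 0.0
for value in data:
    total_loop += value * value
loop_time = time.perf_counter() - start

# Vectorised equivalent: one optimised call instead of millions of Python steps
start = time.perf_counter()
total_vec = np.dot(data, data)
vec_time = time.perf_counter() - start

print(f"loop: {loop_time:.2f}s  vectorised: {vec_time:.4f}s")
print(f"same result: {np.isclose(total_loop, total_vec)}")
```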

4. Algorithm selection

• Sparse models: Utilize sparse models that require fewer active parameters during computation. DeepSeek-V2, for example, employs a sparse architecture, activating only 21 billion of its 236 billion parameters per token, leading to energy savings
• Quantization and pruning: Apply techniques like quantization and pruning to reduce model size and complexity without significantly compromising performance (a minimal example follows this list)
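
Here is a minimal sketch of both techniques using PyTorch utilities; the toy model and the 30% pruning amount are arbitrary choices for illustration, not recommended settings.

```python
import torch
import torch.nn as nn
from torch.nn.utils import prune

# Toy model standing in for a larger network
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))

# Pruning: zero out the 30% smallest-magnitude weights in each Linear layer
for module in model:
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")   # make the sparsity permanent

# Dynamic quantization: store Linear weights as int8, shrinking the model
# and speeding up CPU inference with little accuracy loss for many workloads
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 128)
print(quantized(x).shape)   # torch.Size([1, 10])
```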

5. Hardware utilization

• Energy-efficient hardware: Choose hardware optimized for AI tasks, such as GPUs with lower power consumption or specialized accelerators (a power-measurement sketch follows this list). Micro data centres are also a good choice for further greening AI processing
• Renewable energy sources: Where possible, power data centers and hardware with renewable energy to minimize the carbon footprint
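
A practical first step is simply measuring what the hardware draws. The sketch below polls GPU power via `nvidia-smi` (it assumes an NVIDIA GPU and driver are present); readings like these make it easier to compare hardware options or schedule heavy jobs for times when the grid is greener.

```python
import subprocess

def gpu_power_draw_watts():
    """Return the current power draw (in watts) of each visible NVIDIA GPU."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [float(line) for line in out.strip().splitlines()]

print(gpu_power_draw_watts())   # e.g. [68.4, 71.2] on a two-GPU machine
```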

6. Model size and parameters

• Right-sizing models: Develop models that are appropriately sized for the task. Overly large models consume more energy and resources. For instance, while DeepSeek-V3 has 671 billion parameters, it activates only 37 billion per token, balancing performance with efficiency
• Performance monitoring: Continuously monitor model performance to ensure that increasing size translates to meaningful improvements, justifying the additional resource consumption (see the parameter-count sketch after this list)
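
A simple way to keep model size honest is to report parameter counts alongside validation metrics when comparing candidates. The helper below is a generic sketch for PyTorch models; the two candidate models are placeholders.

```python
import torch.nn as nn

def count_parameters(model: nn.Module) -> int:
    """Total number of trainable parameters in a model."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# Two hypothetical candidates for the same task
small = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
large = nn.Sequential(nn.Linear(128, 1024), nn.ReLU(), nn.Linear(1024, 10))

print(f"small: {count_parameters(small):,} parameters")
print(f"large: {count_parameters(large):,} parameters")
# If the large model's accuracy gain is marginal, the small one is the
# more sustainable choice.
```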

7. Training process

• Efficient training regimens: Implement training techniques that reduce computational load, such as mixed-precision training or transfer learning (see the sketch after this list)
• Data augmentation: Use data augmentation to enhance model performance without the need for excessively large datasets, thereby reducing training time and energy use
• Elite sample training: Focus training on the most informative data samples, reducing the amount of data needed and conserving energy
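
Below is a minimal sketch of mixed-precision training using PyTorch's automatic mixed precision; the model, data, and optimizer are placeholders. Running most operations in 16-bit halves memory traffic and usually cuts training time and energy on supporting GPUs.

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Linear(128, 10).to(device)            # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

for step in range(100):                           # placeholder training loop
    x = torch.randn(32, 128, device=device)
    y = torch.randint(0, 10, (32,), device=device)

    optimizer.zero_grad()
    # Run the forward pass in reduced precision where it is safe to do so
    with torch.cuda.amp.autocast(enabled=(device == "cuda")):
        loss = loss_fn(model(x), y)

    scaler.scale(loss).backward()                 # scale to avoid fp16 underflow
    scaler.step(optimizer)
    scaler.update()
```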

8. Data management

• Ethical data sourcing: Ensure data is sourced responsibly, with considerations for privacy and consent, aligning with regulations like GDPR
• Data minimization: Collect only the data necessary for the task to reduce storage and processing requirements (see the sketch after this list)
• Efficient data processing: Implement data processing techniques that minimize energy consumption, such as efficient data retrieval and storage methods
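
Data minimization can start at ingestion. The sketch below, which assumes a hypothetical events.csv with the columns shown, reads only the fields actually needed and stores them in compact types, cutting both storage and processing cost.

```python
import pandas as pd

# Load only the fields the model actually uses, not the whole file
needed = ["timestamp", "device_id", "energy_kwh"]
df = pd.read_csv(
    "events.csv",                        # hypothetical source file
    usecols=needed,
    parse_dates=["timestamp"],
    dtype={"device_id": "category"},     # compact representation for repeated IDs
)

# Downcast numeric columns to the smallest safe type
df["energy_kwh"] = pd.to_numeric(df["energy_kwh"], downcast="float")

print(df.memory_usage(deep=True))
```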

9. Deployment and use

• Scalable deployment: Deploy models in a way that allows scaling resources up or down based on demand, preventing wasteful over-provisioning (a simple illustration follows this list)
• User education: Inform users about the environmental impact of AI services and encourage responsible usage patterns
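
In practice this is usually handled by an autoscaler (for example Kubernetes' horizontal pod autoscaling), but the core logic is simple. The sketch below is a purely illustrative scaling rule with made-up thresholds, showing how capacity can track demand instead of being permanently over-provisioned.

```python
def desired_replicas(current: int, avg_gpu_utilisation: float,
                     min_replicas: int = 1, max_replicas: int = 8,
                     target: float = 0.6) -> int:
    """Illustrative rule: keep average utilisation near the target."""
    if avg_gpu_utilisation > target * 1.2:       # overloaded -> scale up
        current += 1
    elif avg_gpu_utilisation < target * 0.5:     # mostly idle -> scale down
        current -= 1
    return max(min_replicas, min(max_replicas, current))

print(desired_replicas(current=3, avg_gpu_utilisation=0.85))  # -> 4
print(desired_replicas(current=3, avg_gpu_utilisation=0.20))  # -> 2
```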

10. Monitoring and maintenance

• Continuous monitoring: Regularly monitor the AI system's performance and energy consumption to identify areas for improvement (see the sketch after this list)
• Life cycle assessment: Conduct life cycle assessments to evaluate the environmental impact of the AI system throughout its usage and implement strategies to mitigate negative effects
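
For continuous monitoring, open-source tools such as the CodeCarbon package estimate the energy use and CO₂ emissions of a block of code. The sketch below assumes codecarbon is installed (pip install codecarbon); the workload inside the block is a placeholder.

```python
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="sustainable-ai-demo")
tracker.start()
try:
    # Placeholder for a training or inference workload
    total = sum(i * i for i in range(10_000_000))
finally:
    emissions_kg = tracker.stop()   # estimated kg of CO2-equivalent

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2e")
```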

By implementing these steps, AI practitioners can develop systems that are not only effective but also sustainable, minimizing environmental impact while maximizing societal benefits.

illuminem Voices is a democratic space presenting the thoughts and opinions of leading Sustainability & Energy writers; their opinions do not necessarily represent those of illuminem.


About the author

Dr. Niladri Choudhuri is Founder & CEO of Xellentro Consulting Services, with expertise in sustainable IT.
