

International Journal of Multidisciplinary Futuristic Development

ISSN: 3051-3618 (Print) | 3051-3626 (Online) | Impact Factor: 8.31 | Open Access

Green AI: Energy-Efficient Deep Learning Models for Sustainable Computing


Abstract

The rapid proliferation of large-scale artificial intelligence (AI) models has precipitated an unprecedented surge in computational demand, with profound environmental consequences. Training state-of-the-art language models now generates carbon emissions exceeding 8,900 tonnes—more than 250 times the annual footprint of an average American. This manuscript presents a comprehensive examination of Green AI, an emerging paradigm dedicated to reconciling deep learning's transformative capabilities with environmental sustainability. We critically analyse the energy consumption landscape across the AI lifecycle, documenting how training computational requirements have doubled approximately every ten months since 2012, while hardware power demands have escalated 5,000-fold from the original Transformer architecture to contemporary large language models. The paper systematically evaluates energy-efficient techniques including model compression (pruning, quantization, knowledge distillation), hardware-aware neural architecture search, and edge computing deployments that achieve up to 82% energy reduction with minimal accuracy degradation. We introduce two original comparative frameworks: a quantitative analysis of prominent models' carbon footprints spanning AlexNet (0.01 tonnes) to Llama 3.1 (8,930 tonnes), and a structured assessment of efficiency techniques with their accuracy-energy trade-offs. Beyond technical solutions, we examine policy developments including emerging carbon accounting standards and legislative frameworks such as the proposed Artificial Intelligence Environmental Impacts Act of 2024. The manuscript concludes by identifying critical research directions: sustainable scaling laws, federated learning for distributed computation, and quantum-inspired low-energy architectures. This work establishes that achieving genuine sustainability in AI requires not merely incremental efficiency gains but fundamental reorientation of how we design, train, and deploy intelligent systems—prioritizing algorithmic parsimony alongside predictive performance.
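The training-footprint figures quoted in the abstract follow the back-of-the-envelope accounting commonly used in the Green AI literature: operational emissions are accelerator energy, scaled by data-centre overhead (PUE) and the local grid's carbon intensity. A minimal sketch of that calculation (the function name and the numeric inputs below are illustrative assumptions, not figures from this paper):

```python
def training_co2e_tonnes(gpu_hours, avg_power_kw, pue, grid_kg_per_kwh):
    """Estimate operational CO2-equivalent emissions of a training run.

    gpu_hours        -- total accelerator-hours consumed by the run
    avg_power_kw     -- average power draw per accelerator, in kW
    pue              -- data-centre Power Usage Effectiveness (>= 1.0)
    grid_kg_per_kwh  -- grid carbon intensity, kg CO2e per kWh
    """
    energy_kwh = gpu_hours * avg_power_kw * pue
    return energy_kwh * grid_kg_per_kwh / 1000.0  # kg -> tonnes

# Illustrative (not measured) inputs: one million accelerator-hours at
# ~0.7 kW average draw, PUE 1.1, grid at 0.4 kg CO2e/kWh.
estimate = training_co2e_tonnes(1_000_000, 0.7, 1.1, 0.4)
print(f"{estimate:.1f} tonnes CO2e")
```

The same formula, run in reverse, shows why the efficiency levers surveyed here compound: compression cuts `gpu_hours`, hardware-aware design cuts `avg_power_kw`, and carbon-aware scheduling cuts `grid_kg_per_kwh`.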

How to Cite This Article

Mr. Waleed Noman Alhajri (2025). Green AI: Energy-Efficient Deep Learning Models for Sustainable Computing. International Journal of Multidisciplinary Futuristic Development (IJMFD), 6(1), 89-97. DOI: https://doi.org/10.54660/IJMFD.2026.7.1.19-27
