Artificial intelligence has emerged as one of the most transformative technologies of the modern era, reshaping industries and influencing daily life in unprecedented ways. The growth of AI and the expansion of data centers are driving a surge in energy demand, raising concerns about environmental sustainability and climate change. While attention has focused primarily on transitioning to clean energy supply, the discussion of future demand growth has been largely overlooked, particularly when it comes to understanding and predicting AI and cloud energy needs over the coming decade.
AI requires substantial computational power to train and deploy models effectively. The process of training AI models involves feeding them large amounts of data from various sources until they can recognize patterns and answer questions. This process can take months and is often repeated multiple times to fine-tune the model’s performance. Consequently, AI training consumes a staggering amount of electricity.
The scale of AI models has been growing exponentially, with each successive model containing significantly more parameters and training data than its predecessor. This growth in scale has further amplified the energy consumption and carbon footprint of AI.
In 2018, a large model contained around 100 million parameters. The trend accelerated with GPT-2, launched in 2019 with 1.5 billion parameters, and GPT-3 pushed this further, to a staggering 175 billion. The size of GPT-4 remains undisclosed, though it is expected to be even larger. Google’s Pathways Language Model, or PaLM, has an impressive 540 billion parameters.
AI Energy Consumption
A single AI training run can consume more electricity than the annual usage of 100 U.S. homes. In 2019, researchers found that developing a generative AI model named BERT, which consisted of 110 million parameters, “consumed energy equivalent to that of a round-trip transcontinental flight for a single individual.” The electricity consumption for training GPT-3 was equivalent to the average energy usage of 128 U.S. households over 14.8 days; in total, GPT-3’s training consumed 1,287 megawatt-hours. Furthermore, the carbon dioxide emissions generated from this training were equivalent to three round-trip flights from New York City to San Francisco. The trend toward larger models signifies a substantial increase in energy consumption, raising concerns about the environmental impact of AI advancements.
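To put the 1,287 megawatt-hour figure in perspective, a quick back-of-envelope conversion is possible. The household figure below is an assumption, not from the article: the U.S. Energy Information Administration puts average residential electricity use at roughly 10,600 kWh per year.

```python
# Back-of-envelope check of the GPT-3 training-energy figure cited above.
# Assumption (not from the article): an average U.S. household uses
# roughly 10,600 kWh of electricity per year (approximate EIA figure).

GPT3_TRAINING_MWH = 1287          # reported training energy, in MWh
HOUSEHOLD_KWH_PER_YEAR = 10_600   # assumed average U.S. household usage

household_years = GPT3_TRAINING_MWH * 1000 / HOUSEHOLD_KWH_PER_YEAR
print(f"{household_years:.0f} household-years of electricity")  # 121 household-years
```

In other words, one training run consumed about as much electricity as 121 typical U.S. homes use in a year, under the stated assumption.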
Data centers play a crucial role in supporting AI applications and cloud services. These data centers store and process vast amounts of data, serving as the backbone of the digital economy. They consume a significant amount of electricity due to their constant operation and cooling requirements, and their energy demand is projected to grow substantially in the coming years. In 2016, data centers accounted for approximately 1.15% of global electricity consumption, a share expected to reach 1.86% by 2030.
Research from the Massachusetts Institute of Technology revealed that cloud computing now has a larger carbon footprint than the entire airline industry. Furthermore, with the progression of AI technology, its energy consumption is expected to continue increasing. The researchers found that: “A single data center can consume the equivalent electricity of 50,000 homes. With an annual consumption of 200 terawatt hours (TWh), data centers collectively devour more energy than some nation-states. Today, the electricity utilized by data centers accounts for 0.3 percent of overall carbon emissions, and if we extend our accounting to include networked devices like laptops, smartphones, and tablets, this total shifts to 2 percent of global carbon emissions.”
Complexity of Measuring AI’s Energy Consumption
The precise energy consumption of an AI language model like GPT is not directly measurable in isolation. AI language models run on large-scale distributed systems spread across multiple servers and data centers. These systems handle numerous requests from users worldwide. The energy consumption is not attributed to a single model or server but shared across the entire infrastructure, making it challenging to isolate specific energy usage for any individual model. Further, as AI models are used for various tasks and by numerous users simultaneously, the workload and energy usage fluctuate, making it difficult to pinpoint precise energy consumption at any given moment. Additionally, AI language models rely on various supporting services, such as data storage, networking, and cooling systems. The energy consumed by these auxiliary services is also intertwined with the overall energy usage. Data centers and servers often run background processes and maintenance tasks, not directly related to user requests, which also contribute to energy consumption. These processes further complicate the measurement of energy usage for individual AI models.
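One common way to approximate per-workload energy despite the shared-infrastructure problem described above is proportional allocation: apportion a facility's metered IT energy by a workload's share of accelerator-hours, then scale by the data center's power usage effectiveness (PUE) to fold in cooling and other overheads. The sketch below is illustrative only; the function name and all the numbers are hypothetical.

```python
def attribute_energy_kwh(it_energy_kwh: float,
                         workload_device_hours: float,
                         total_device_hours: float,
                         pue: float = 1.5) -> float:
    """Rough share of facility energy attributable to one workload.

    it_energy_kwh:  metered IT-equipment energy for the period (kWh)
    pue:            power usage effectiveness (facility energy / IT energy),
                    which folds in cooling and other facility overheads
    """
    share = workload_device_hours / total_device_hours
    return it_energy_kwh * share * pue

# Illustrative numbers only: a model that used 2,000 of a cluster's
# 100,000 GPU-hours in a month when the IT load drew 500,000 kWh.
print(attribute_energy_kwh(500_000, 2_000, 100_000))  # 15000.0 (kWh)
```

Even this simple scheme shows why the figures cited in the literature vary: the result is sensitive to the assumed PUE and to how device-hours are counted, neither of which is visible to outside observers.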
The rapid expansion of the AI industry and the declining costs of computing power have made AI development and deployment more accessible than ever before. As AI technology advances, it finds its way into various applications, including search engines, everyday websites, autonomous vehicles, and more. Each of these applications contributes to the increasing energy demand of AI and cloud infrastructure.
Estimating future energy demand of AI and data centers is challenging due to several factors. First, the precise growth trajectory of AI technology is difficult to predict, and its integration into new applications and industries is constantly evolving. Second, the energy efficiency of AI algorithms and hardware is continuously improving, making it challenging to forecast future energy consumption accurately. However, a European Commission study suggests that data centers’ energy consumption in the European Union could be 28% higher in 2030 than it was in 2018. Some estimates indicate that the integration of AI into search engines would require “at least four or five times more computing per search.” This raises questions about the necessity of AI in daily life and the measures needed to ensure its sustainability.
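The projections cited above can be restated as annualized rates, which makes them easier to compare against other demand forecasts. The conversion below is just compound-growth arithmetic applied to the article's own figures.

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values over a span of years."""
    return (end / start) ** (1 / years) - 1

# EU data-center energy use projected 28% higher in 2030 than in 2018:
eu_rate = cagr(1.00, 1.28, 2030 - 2018)
print(f"EU data centers: {eu_rate:.1%} per year")  # EU data centers: 2.1% per year
```

A headline "28% increase" thus corresponds to a fairly modest ~2% annual growth rate, which is one reason point-in-time projections and annualized figures in this area can appear to disagree.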
Traditional energy demand scenarios have been based on economic growth and population projections, but the explosive growth of AI and the expansion of data centers have introduced a new level of complexity.
Aligning AI Energy Demand With Climate Goals
The availability of clean energy sources is crucial to advance the energy transition. While the focus has been on ensuring energy supplies come from clean sources, future energy demand has been overlooked, particularly for AI and cloud computing. Without a clear understanding of the energy demand and growth of AI, it is challenging to align energy strategies with climate change goals.
To secure a sustainable and resilient future, it is crucial to engage in open and informed discussions with AI developers and stakeholders. A better understanding of AI’s growth trajectory and the potential impact of AI and cloud computing on energy consumption will enable informed decisions today about investments in clean energy sources, ensuring adequate supplies to meet future energy needs.
Addressing the energy demands of AI and the cloud requires a proactive approach that aligns with long-term sustainability goals. Investing in energy-efficient AI algorithms, innovative hardware, and renewable energy sources will play a pivotal role in reducing the environmental impact of these transformative technologies.
Moreover, policymakers and industry leaders should collaborate to develop regulations and incentives that encourage the responsible and sustainable use of AI and cloud computing. By creating an environment that promotes energy efficiency and responsible resource management, the AI industry can be steered toward a greener and more sustainable path.
In the face of unprecedented technological advancements, addressing the energy needs of AI must be a priority, balancing innovation with environmental stewardship. By envisioning a future that embraces clean energy, invests in renewable resources, and incorporates AI’s energy needs into demand scenarios, the future of the planet can be safeguarded for generations to come.