Insights from Dr. Eric Masanet on Data Centers and Sustainability

Written by Michaela Galarza

The Growing Energy Needs of Data Centers

In the race to dominate artificial intelligence, technology companies are fueling an unprecedented expansion of data centers. These digital infrastructures power everything from search engines to cloud computing, and now, increasingly, artificial intelligence (AI) applications. But as AI's demand for computational power skyrockets, so too does its energy footprint.

Dr. Eric Masanet, a leading expert in industrial sustainability and the Mellichamp Chair in Sustainability Science for Emerging Technologies at the Bren School, has spent nearly two decades studying the evolving energy dynamics of data centers. In a recent conversation, he provided key insights into the challenges and solutions surrounding data center sustainability in the age of AI.

Dr. Masanet’s journey into the data center space was unexpected. With a background in engineering and a passion for sustainability, he initially worked in product design before pivoting to research. His expertise in industrial energy efficiency led him to co-lead a Congressional study on data center energy use in 2006, which sparked his interest in the field.

"Energy became a critical part of my research because decarbonizing manufacturing was my main goal," he explains. "You can’t improve industrial sustainability without considering energy sources and efficiency."

The Three Eras of Data Center Growth

According to Dr. Masanet, the evolution of data centers can be divided into three distinct eras: early growth, the cloud computing boom, and the AI acceleration that defines our current landscape. The early growth phase, spanning from the 1990s to 2010, was driven by the rapid expansion of internet infrastructure, including the proliferation of computers, internet services, and smartphones. However, this period was marked by a lack of concern for energy efficiency.

The second phase, from 2010 to 2018, saw the rise of cloud computing, which led to the consolidation of data centers into large-scale, centralized facilities operated by major technology companies. This shift significantly improved efficiency by replacing smaller, inefficient centers with hyperscale data centers: massive facilities that house thousands of servers and are engineered to process large amounts of data with optimal energy use.

Today, the AI acceleration era is characterized by an explosion in energy demand. AI-driven applications require specialized, power-intensive hardware, and the rapid deployment of AI technologies is pushing data center energy consumption to unprecedented levels.

Why AI Is Driving an Energy Surge

Unlike traditional computing, AI workloads require specialized graphics processing units (GPUs) that consume significantly more electricity than conventional servers. AI model training, particularly for large-scale models like ChatGPT, involves processing vast amounts of data and billions of parameters, making it an energy-intensive process. Even after training, running the models to generate responses, known as the inference stage, demands substantial energy.

"AI servers use up to 10 times the power of a standard server, and companies are deploying them at an unprecedented scale," Dr. Masanet notes. "The combination of high power needs and rapid expansion is what’s straining the grid."

The Sustainability Challenge

The increasing electricity demand from AI data centers poses a significant challenge to sustainability goals. Dr. Masanet points to Northern Virginia as a cautionary tale, where the region’s concentration of data centers has forced utilities to keep fossil fuel plants online to meet demand. While major tech companies pledge to power data centers with renewable energy, the reality is that the expansion is outpacing the deployment of clean energy sources.

"Some companies are looking at nuclear power or geothermal solutions, but these are not yet widely available," he says. "Renewable energy simply isn’t scaling fast enough to match AI’s growth."

Potential Solutions for AI’s Energy Footprint

Despite these challenges, there are several promising strategies to mitigate AI’s energy consumption:

  • Geographic Optimization: Placing AI data centers in regions with abundant renewable energy, such as Iceland or the Pacific Northwest, could reduce reliance on fossil fuels.
  • Workload Shifting: Cloud providers can shift AI tasks to locations where renewable energy is most available at any given time.
  • Algorithmic Efficiency: AI models can be trained using fewer data points and optimized software to reduce energy consumption.
  • Cooling Innovations: Advanced liquid cooling systems can improve efficiency by reducing the energy required to cool AI servers.

However, these solutions come with limitations. "Even with efficiency gains, AI’s energy footprint is still expected to grow," Dr. Masanet warns. "The key is ensuring that this growth aligns with sustainable energy deployment rather than exacerbating fossil fuel dependence."

Looking Ahead: Will AI Energy Demand Level Off?

While AI energy demand is surging now, Dr. Masanet believes it won’t continue indefinitely. "Some business models will collapse, and AI data centers won’t always run at full capacity," he predicts. "Efficiency improvements will happen, but for now, rapid expansion is the reality."

The challenge remains: Can the tech industry align AI’s rapid rise with sustainability goals? As data centers become the backbone of the digital economy, their energy consumption will be a critical issue in the fight against climate change.

Dr. Masanet’s work highlights the urgent need for innovation, transparency, and policy intervention to ensure that AI’s benefits outweigh its environmental costs. The future of sustainable computing depends on it.

Contacts
Eric Masanet
Professor and Mellichamp Chair in Sustainability Science for Emerging Technologies
Energy System Analysis, Climate Change Mitigation, Sustainable Manufacturing, Data Centers & ICT