
Data Centers Demystified: What Are They, Why Does AI Need Them, and Where Are They Being Built?

Written by Laura Fletcher | Jul 09, 2025

Data centers are the physical backbones of the digital world. These secure, climate-controlled facilities house the servers and storage infrastructure that process and manage the information behind everything from emails and apps to cloud platforms and video streaming.

To understand their growing impact on the power grid and prices, it helps to start with the basics.

How Do Data Centers Work?

At their core, data centers are ecosystems of servers, storage systems, and networking equipment designed to exchange, process, and store data. They enable access to information, supporting everything from websites and e-commerce to social media and cloud storage.

Ranging in size from a single room to sprawling campuses spanning millions of square feet, these facilities operate 24/7 and rely on supporting systems, such as cooling, power distribution, and network management, to ensure uptime, performance, and security. 

Types of Data Centers

Not all data centers are the same. They vary widely in size, purpose, and ownership – and these differences shape their power needs and infrastructure design. Here’s a quick overview of the most common types.

Colocation (Colo)

Colocation data centers, also known as colos, are third-party facilities where companies rent space to house their servers and equipment. The colo provider supplies power, cooling, connectivity, and security, while customers manage their own hardware. These facilities are ideal for businesses that want to maintain control over their IT equipment but prefer not to invest in building and operating a dedicated data center.

Cloud

Operated by providers like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud, cloud data centers host the physical infrastructure – servers, storage, and networking – needed to deliver computing services over the internet. Instead of owning and maintaining hardware, businesses can access these resources on demand to run applications, store data, and scale workloads.

There are three main types of cloud deployments:

  • Public cloud: Shared infrastructure accessed over the internet by multiple customers
  • Private cloud: Dedicated infrastructure that one organization uses exclusively
  • Hybrid/multi-cloud: A mix of public and private environments, often spanning multiple providers

Operators often deploy cloud infrastructure in colocation facilities or integrate it with on-premises systems, offering greater flexibility and geographic reach.

Hyperscale 

These are massive, highly automated campuses built by tech giants like Amazon, Microsoft, and Meta to support large-scale digital workloads, such as search, streaming, and, increasingly, artificial intelligence (AI). 

These facilities operate as distributed networks and are designed with advanced automation and built-in redundancies, such as backup power systems and failover hardware, which ensure services continue to run even if a component fails.

AI-Centric

These are purpose-built or retrofitted facilities that support artificial intelligence workloads. They house dense clusters of specialized processors – such as GPUs and other AI chips – designed to handle the massive computations required for training and running AI models. 

These facilities typically require advanced liquid cooling systems and consume significantly more energy than traditional cloud or enterprise data centers.

Enterprise

These are private facilities owned and operated by a single large company – often in the finance, healthcare, government, or tech sectors – with massive amounts of data. They’re dedicated solely to serving the owner’s needs and are typically located near a company’s campus, providing the owner with complete control over infrastructure, security, and compliance. 

These facilities prioritize reliability and are often tightly integrated with the organization’s operations.

Edge

Edge data centers are compact, distributed sites located close to end users or data sources to minimize latency.

Used for real-time applications like Internet of Things (IoT) devices and augmented or virtual reality (AR/VR), they process information locally rather than routing it to large, centralized facilities.

Modular

These are pre-built, containerized units that owners can quickly deploy in remote or temporary locations. Essentially a “data center in a box,” they’re equipped with power, cooling, networking, and compute and are ideal for fast deployment needs.

Powering the Cloud: Data Centers and Their Growing Energy Footprint

Data centers are among the most energy-intensive building types, consuming 10 to 50 times more energy per square foot than a typical office. Today, they account for roughly 2% of US electricity consumption, and that share is increasing.

Globally, data center electricity demand stands at around 60 GW, with forecasts projecting it could more than triple to 171 to 219 GW by 2030. 

The biggest driver? 

The exponential energy demands of artificial intelligence. Goldman Sachs expects a 165% increase in data center power consumption by the end of the decade, primarily due to the growing demand for AI training and inference workloads.
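
To put those projections in perspective, here is a minimal back-of-envelope sketch in Python that converts the figures cited above into growth multiples and an implied annual growth rate. The demand numbers come from the forecasts mentioned above; the 2025-2030 horizon is an assumption for illustration.

```python
# Back-of-envelope check of the demand projections cited above.
# The GW figures come from the text; the five-year horizon is an assumption.

current_demand_gw = 60                           # approximate global demand today
forecast_range_gw = {"low": 171, "high": 219}    # projected range by 2030
years = 2030 - 2025                              # assumed forecast horizon

for label, target_gw in forecast_range_gw.items():
    multiple = target_gw / current_demand_gw
    implied_cagr = multiple ** (1 / years) - 1   # implied compound annual growth
    print(f"{label}: ~{multiple:.1f}x today's demand, ~{implied_cagr:.0%} per year")

# With these inputs, demand grows roughly 2.8-3.7x, i.e. about 23-30% per year.
```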

How Big Are Data Centers?

| Type | Square Feet | Server Count | Power Demand |
| --- | --- | --- | --- |
| Hyperscale | 10,000+ | 5,000-10,000+ | 80+ MW |
| Enterprise | Few thousand to 100,000+ | Hundreds to thousands | 1-10 MW |
| Colocation | Varies | Varies | 0.1-1 MW |
| AI-Centric | 4 million+ | Thousands to millions of GPUs | 0.08-0.12 MW per rack |
| Edge | Varies | Dozens to hundreds | 0.5-2 MW |
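
Note that the AI-centric row is quoted per rack, while the other rows are facility totals. The quick sketch below converts it to a facility-level figure for comparison; the rack count is an illustrative assumption, not a figure from the table.

```python
# Convert the per-rack figure in the table above to a facility-level total.
# The rack count is an illustrative assumption.

racks = 1_000                                     # hypothetical AI-centric facility
mw_per_rack_low, mw_per_rack_high = 0.08, 0.12    # per-rack range from the table

low, high = racks * mw_per_rack_low, racks * mw_per_rack_high
print(f"~{low:.0f} to {high:.0f} MW for a {racks:,}-rack facility")
# Roughly 80-120 MW -- comparable to or above the hyperscale row's 80+ MW total.
```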

 

Why AI Needs Data Centers

AI isn’t just software – it’s one of the most infrastructure-intensive workloads in computing. Training large models, such as OpenAI’s GPT-4 or Google’s Gemini, requires thousands of specialized processors running in parallel for weeks or even months. This process consumes tens of megawatts of power, enough to power a small city.
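
As a rough illustration of how a training run reaches that scale, the sketch below multiplies a hypothetical accelerator count by an assumed per-GPU power draw, server overhead, and facility overhead factor (PUE). All of these numbers are assumptions for illustration, not figures for any specific model or cluster.

```python
# Rough estimate of the electrical load of a large AI training cluster.
# The GPU count, per-GPU draw, overhead, and PUE are illustrative assumptions.

num_gpus = 20_000        # hypothetical training cluster size
watts_per_gpu = 700      # assumed draw of one high-end accelerator
server_overhead = 1.5    # assumed CPUs, memory, and networking per GPU server
pue = 1.3                # assumed power usage effectiveness (cooling, losses)

it_load_mw = num_gpus * watts_per_gpu * server_overhead / 1e6
facility_load_mw = it_load_mw * pue
print(f"IT load: ~{it_load_mw:.0f} MW, facility load: ~{facility_load_mw:.0f} MW")
# With these assumptions the cluster draws on the order of 27 MW -- "tens of megawatts."
```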

These models rely on high-density server racks, advanced cooling systems, and ultra-fast interconnects to transfer information efficiently. According to Goldman Sachs, generative AI workloads can use 10 times more computational power than traditional applications. 

Once trained, these models are deployed across global data center networks to ensure low latency and fast response times for real-time applications like voice assistants, search, video analytics, or autonomous vehicles. Simply put, modern AI isn’t possible without the performance, scalability, and resilience of today’s data centers.

Data Center Giants

One of the largest US facilities is Meta’s 4.6-million-square-foot campus in Prineville, Oregon. The company’s new AI data center in Louisiana will be 4 million square feet.

The Switch Citadel campus in Nevada has a 650 MW capacity and spans 7.2 million square feet.

OpenAI, the powerhouse behind ChatGPT, is linked to plans for a 5 GW campus in Abu Dhabi – an installation large enough to consume as much power as five large nuclear reactors (roughly 1 GW each) can generate.

How Are Data Centers Powered?

Most draw power from the local utility grid, which offers stable and continuous electricity. However, grid upgrades can lag behind demand, especially in fast-growing markets, making grid power a potential bottleneck. 

Renewable energy can help meet the growing need for sustainable power. For example, Apple has powered 100% of its data center operations with renewable energy (including biogas fuel cells, hydropower, solar, and wind) since 2014.

To increase energy reliability, many facilities are adopting microgrids – localized energy systems that can operate independently or in conjunction with the grid. Microgrids often incorporate renewable energy and fuel cells. 

Looking ahead, small modular reactors (SMRs) are emerging as a promising option. These compact nuclear reactors could provide stable, low-carbon power for facilities with very high energy demands. While still in development, SMRs could become a key part of data center energy strategies.

How Many Data Centers Are in the US?

The US is home to the largest concentration of data centers in the world, with an estimated 5,400 facilities spread across the country.

Roughly 20% of these (around 1,135) are hyperscale data centers.

Where Are Data Centers Located?

While they are located in nearly every state, their development is concentrated in key metro regions where power availability, land, tax incentives, and connectivity align. 

Northern Virginia is the world’s largest market with 5.9 GW in operation, 1.8 GW under construction, and an additional 15.4 GW in the pipeline. 

Source: Yes Energy’s Infrastructure Insights

Commercial real estate services firm CBRE identifies the other primary US markets as Dallas-Fort Worth, Silicon Valley, Chicago, Phoenix, the New York Tri-State area, Atlanta, and Hillsboro. 

As the primary markets become saturated and reliable power and networking infrastructure become harder to secure, developers are expanding their footprint beyond traditional hubs. 

In the Midwest, Columbus, Ohio, is emerging as a major data center hub, adding 6.1 million square feet between 2015 and 2024. Des Moines, Iowa, added 5.6 million square feet during the same time frame. Only the Northern Virginia and Dallas markets have built more capacity over the past decade.

Measuring the Impact of Data Centers on the Grid

Yes Energy’s Infrastructure Insights platform provides a nationwide view of data center development, capturing key project details – who’s building, what’s being built, when it’s coming online, where it’s located, and why it matters. Whether a facility has just been announced, is in development, or is under construction, the system tracks it all.

As information becomes publicly available, Infrastructure Insights shows specifics like expected power load, square footage, and the number of buildings. An interactive map allows you to filter and visualize this information with ease, revealing emerging hotspots and development trends across the US.

Source: Yes Energy’s Infrastructure Insights

Selecting a specific facility on the map displays rich project-level details, including the project name, owner, expected in-service date, power capacity, and a comprehensive description.

Source: Yes Energy’s Infrastructure Insights

Yes Energy’s EnCompass is a powerful planning model that enables you to simulate future energy scenarios, forecast prices, analyze grid congestion, and evaluate generator behavior. Utilities and energy asset developers use it to understand how new large loads could impact electricity markets and grid performance.

EnCompass flags potential issues, like congestion or overloads, down to the level of transmission lines, substations, power plants, and buses. These insights inform key decisions around where to site new generation and how to price power in evolving markets.

For developers, EnCompass offers an essential tool for site selection. By modeling the expected load at different candidate locations, developers can compare options based on their specific priorities – whether that’s the lowest electricity cost, highest reliability, or most favorable grid access.

Beyond current infrastructure, EnCompass also supports future-looking scenarios. You can test the impact of proposed assets – like new transmission lines or generation facilities – on power delivery to a given site.

Next Steps

Check out our post on data center efficiency and how data centers are growing across North America.

If you’re looking to monitor large load centers’ energy use in real time, learn more about how to do this efficiently with Live Power’s latest update. 

If you’re looking to develop a facility, see three things you need to know first.

Ready to see the impact of large loads near your trading or operating location? 

Contact us to learn how Infrastructure Insights and EnCompass can optimize your simulations and enable better data-backed decisions.

About the author: Laura Fletcher is on the Yes Energy product team as an associate product manager. Prior to joining the team, Laura studied environmental engineering at Georgia Tech. She started working with energy data as a college intern, and she has worked on various consulting projects, annual market forecasts, client relations, and database management.