AI’s Growing Pains: Why Data Center Bottlenecks Could Stall the AI Revolution
- Niv Nissenson
- Jul 21, 2025
- 2 min read

AI may be moving fast, but the infrastructure needed to support it isn't. In a recent LinkedIn post titled "Implementing Innovation Takes Longer and More Dollars than You Think," Kevin Pang, a strategist focused on innovation and go-to-market execution, laid out a sobering view of the road ahead.
Build Times: 3 to 6 Years, Not 6 Months
Pang outlines a realistic timeline for hyperscale AI data centers. Site selection and permitting alone can take up to a year, followed by years of construction, power provisioning, and grid integration. While smaller 5–10 MW facilities might move faster, the 100+ MW facilities required for frontier model training and inference often take 3–6 years to come online.

Build Costs: $700M–$1.2B for a 100 MW Site
AI is pushing data centers toward "Tier 3" and "Tier 4" classifications: high-redundancy, high-resilience facilities that cost $7M–$16M per megawatt. That puts a typical 100 MW site between $700 million and $1.2 billion. Pang cites OpenAI's Stargate project (a $30B+ partnership with Oracle) as one of the more extreme examples of what AI-scale infrastructure looks like. And this is just the beginning: the International Energy Agency (IEA) projects 945 TWh of data center power consumption by 2030, up from 415 TWh in 2024, while Goldman Sachs sees demand growing 165% over the same period.
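A quick back-of-the-envelope check of the figures above, in a short Python sketch (the constant names are illustrative; all numbers come from the article):

```python
# Per-MW build cost for Tier 3/4 facilities, per the article
COST_PER_MW_LOW = 7e6    # $7M per megawatt (low end)
COST_PER_MW_HIGH = 16e6  # $16M per megawatt (high end)
SITE_MW = 100            # typical hyperscale AI site size

low_total = COST_PER_MW_LOW * SITE_MW
high_total = COST_PER_MW_HIGH * SITE_MW
print(f"100 MW site: ${low_total / 1e9:.1f}B - ${high_total / 1e9:.1f}B")
# -> 100 MW site: $0.7B - $1.6B

# IEA projection: 415 TWh (2024) -> 945 TWh (2030)
implied_growth = (945 - 415) / 415
print(f"Implied demand growth: {implied_growth:.0%}")
# -> Implied demand growth: 128%
```

Two things fall out of the arithmetic: the top of the per-MW band implies up to $1.6B for a 100 MW site, so the $1.2B headline figure assumes blended costs below the high end; and the IEA numbers imply roughly 128% growth, versus Goldman's 165%, which presumably rests on different assumptions or a different baseline.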
Power Problems: Grid Constraints Are the Real Bottleneck
Even if capital is available, electricity may not be. The U.S. power grid has historically grown at just 1% annually. Meeting the full AI + EV + industrial electrification demand could require doubling grid capacity and laying 80 million km of new transmission lines, according to IEA estimates.
To make things worse, a recent study by Whisker Labs found harmonic distortion and power surges within 50 miles of existing data centers, suggesting that residential and business infrastructure may already be suffering from proximity to hyperscale sites.
Backup Power: No Longer Optional
AI data centers require redundant, long-duration power systems, ranging from diesel generators and lithium batteries to more exotic options like flow batteries, molten salt, and thermal storage. Pang notes that even on-site generation is becoming a serious trend: 27% of data center operators now expect to be fully powered by on-site energy by 2030, up from just 1% a year ago.
This shift creates a $300B+ opportunity for energy innovators — but also signals a coming decoupling from traditional utilities.
Why It Matters
The excitement around AI is well-deserved — but Pang’s analysis reminds us that the revolution runs on real estate, steel, silicon, and electricity. Even the best algorithms can’t escape the physical limits of the grid.
Full credit to Kevin Pang for his original article: "Data Centers and AI: Implementing Innovation Takes Longer and More Dollars than You Think" (July 9, 2025), available on LinkedIn.