America’s AI Power Crisis: Inside Anthropic’s Urgent Blueprint to Build 50 Gigawatts of Infrastructure Before 2028

📌 Key Takeaways

  • 50GW Target: America must build at least 50 gigawatts of AI computing capacity by 2028 to maintain global competitiveness
  • China’s Advantage: China added 400GW of capacity last year vs America’s 40GW, with 3-6 month permitting vs America’s multi-year delays
  • 5GW Training Centers: Single AI model training will require unprecedented 5-gigawatt data centers by 2028-2030
  • Federal Solution: Lease DOD/DOE lands to bypass state regulations and accelerate gigawatt-scale facility development
  • Emergency Powers: Defense Production Act and Federal Power Act provide last-resort authority for national security infrastructure

The Stakes — Why AI Infrastructure Is Now a National Security Imperative

The United States faces an unprecedented infrastructure challenge that will determine whether America leads or follows in the artificial intelligence revolution. According to Anthropic’s “Build AI in America” report, the stakes couldn’t be higher: without dramatic action to build at least 50 gigawatts of computing capacity by 2028, the U.S. risks ceding AI leadership to China, with profound implications for national security, economic competitiveness, and technological sovereignty.

This isn’t a typical policy debate about incremental improvements. The scale of AI’s energy demands represents a fundamental shift in how we think about digital infrastructure. When a single AI training run requires as much power as a small city, and when China is building new capacity at ten times America’s pace, traditional regulatory approaches become national security vulnerabilities rather than prudent oversight mechanisms.

What makes this challenge particularly urgent is the winner-take-all dynamic of AI development. The nation that builds superior AI infrastructure first gains compounding advantages in military capabilities, economic productivity, and global influence that become increasingly difficult for competitors to overcome.

The 50-Gigawatt Challenge — Quantifying AI’s Unprecedented Energy Demands

The numbers behind AI’s energy requirements are staggering and represent a complete departure from traditional data center planning. Multiple independent forecasts converge on similar conclusions: the U.S. AI sector will require at least 50 gigawatts of electric capacity by 2028, with some projections reaching as high as 132 gigawatts when including broader data center growth.

Anthropic’s internal projections provide the most detailed breakdown of these requirements. The company expects to use 2-gigawatt data centers for training runs in 2027, scaling to 5-gigawatt facilities by 2028. When multiplied across the handful of organizations capable of frontier AI development, the math becomes clear: 20-25 gigawatts will be required solely for training the most advanced AI models, with additional capacity needed for inference, research, and commercial deployment.

These projections align with external research from leading institutions. The RAND Corporation projects 117 gigawatts of global AI power demand by 2028, while Lawrence Berkeley National Laboratory estimates data centers will operate using 74-132 gigawatts in 2028. SemiAnalysis, a respected technology research firm, forecasted 80 gigawatts of global AI power demand by 2028, with 56 gigawatts concentrated in the United States.

To put these numbers in perspective, 50 gigawatts represents roughly 5% of America’s total electricity generation capacity. Building this infrastructure requires not just generating capacity, but also transmission lines, substations, cooling systems, and backup power infrastructure—all of which must be planned, approved, and constructed within a compressed timeframe that challenges every aspect of America’s infrastructure development processes.
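The arithmetic behind these headline figures can be checked directly. The sketch below recomputes the training-only demand (a handful of labs, each at the report's 5 GW cluster size) and the 50 GW target's share of the grid; the ~1,200 GW figure for total US generating capacity and the 4-5 lab count are my assumptions, not numbers from the report.

```python
# Back-of-the-envelope check of the article's headline figures.
# Assumptions (not from the report): ~1,200 GW of total US generating
# capacity, and 4-5 organizations building frontier-scale training clusters.

US_TOTAL_CAPACITY_GW = 1200   # approximate US generating capacity (assumption)
TRAINING_CLUSTER_GW = 5       # per-lab training facility size by 2028 (from the report)
FRONTIER_LABS = (4, 5)        # assumed number of frontier-scale developers

# Training-only demand: labs x cluster size -> the report's 20-25 GW range
training_total = tuple(n * TRAINING_CLUSTER_GW for n in FRONTIER_LABS)

# The 50 GW target as a fraction of total US capacity
share_of_grid = 50 / US_TOTAL_CAPACITY_GW

print(f"Training-only demand: {training_total[0]}-{training_total[1]} GW")
print(f"50 GW as share of US capacity: {share_of_grid:.1%}")
```

Under the ~1,200 GW assumption the 50 GW target comes out just over 4%, consistent with the article's "roughly 5%"; a lower capacity estimate pushes the share higher.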

Training vs. Inference — Two Distinct Infrastructure Challenges Requiring Different Solutions

Understanding AI’s infrastructure requirements requires distinguishing between two fundamentally different use cases: training new AI models and deploying existing models for inference. These applications have different geographical, technical, and regulatory requirements that demand tailored infrastructure approaches.

Training represents the more concentrated challenge. Developing state-of-the-art AI models requires massive parallel computing resources located in a single facility or closely connected cluster. Anthropic projects that training runs will require 5-gigawatt data centers by 2028-2030, representing unprecedented concentrations of computing power. These facilities need access to stable, low-cost electricity, robust cooling infrastructure, and ultra-high-bandwidth connections between computing nodes.

Inference deployment, by contrast, requires distributed infrastructure closer to end users. When millions of users interact with AI models simultaneously, the computing load must be spread across multiple data centers to minimize latency and ensure reliable service. This distributed model requires different types of facilities: smaller, more numerous data centers located near population centers with different power, cooling, and connectivity requirements.

The regulatory implications of this distinction are significant. Gigawatt-scale training facilities represent such concentrated energy demands that they require dedicated power generation and transmission infrastructure, making them natural candidates for federal lands where streamlined approval processes can accelerate development. Distributed inference infrastructure, meanwhile, must integrate with existing electrical grids and urban planning processes across dozens of metropolitan areas.

The Regulatory Bottleneck — How Permits, Transmission Approvals, and Interconnection Delays Stall Progress

America’s regulatory framework for infrastructure development has become the primary obstacle to AI competitiveness. Anthropic identifies three critical bottlenecks that account for the gap between China’s 3-6 month construction timelines and America’s multi-year approval processes: environmental reviews, transmission line approvals, and utility interconnection procedures.

Environmental review under the National Environmental Policy Act (NEPA) represents the most visible regulatory barrier. While NEPA serves important environmental protection purposes, its implementation for AI infrastructure projects can require years of analysis for facilities that pose minimal environmental risks compared to traditional industrial development. The law’s requirement for comprehensive environmental impact statements creates opportunities for litigation and delay that are incompatible with the speed required for AI infrastructure development.

Transmission infrastructure faces even more severe bottlenecks. Only 55 miles of high-voltage transmission lines were built in 2023, compared to an annual average of 1,700 miles during 2010-2014. This dramatic decline reflects the complexity of siting transmission infrastructure across multiple state jurisdictions, each with different regulatory requirements, environmental standards, and political considerations. The result is a transmission grid inadequate for the geographic scale and power density requirements of gigawatt-scale AI facilities.

Utility interconnection represents the most technically complex bottleneck. Current procedures require 4-6 years for generation resources to connect to the electrical grid, a timeline that reflects utilities’ conservative approach to grid stability and their limited experience with the power characteristics of large-scale AI facilities. Grid modernization initiatives are beginning to address these challenges, but not at the speed required for AI infrastructure deployment.

The China Comparison — 400 Gigawatts vs. 40 Gigawatts and What It Means for Geopolitical Competition

The stark contrast between Chinese and American infrastructure development capabilities provides crucial context for understanding the urgency of Anthropic’s recommendations. China brought over 400 gigawatts of new generation capacity online last year, while the United States added roughly one-tenth that amount—a disparity that reflects fundamental differences in regulatory processes, government coordination, and infrastructure planning approaches.

China’s construction permitting timelines range from 3-6 months for major infrastructure projects, compared to America’s years-long processes that involve multiple agencies, extensive environmental reviews, and numerous opportunities for litigation and delay. This speed advantage compounds over time, enabling China to respond rapidly to emerging infrastructure requirements while America struggles to adapt its regulatory framework to new technological realities.

However, the comparison requires nuance. China’s rapid infrastructure development comes at the cost of environmental protections, community input, and democratic oversight that Americans value and that serve important purposes beyond infrastructure development. The challenge for American policymakers is preserving these democratic values while achieving infrastructure development speeds that enable technological competitiveness.

The geopolitical implications extend beyond raw infrastructure capacity. As China’s AI strategy gains momentum through superior infrastructure development, American companies may find themselves forced to locate AI development operations in Chinese facilities to access necessary computing resources. This infrastructure dependence could compromise American technological sovereignty and create national security vulnerabilities that persist for decades.

Pillar 1 — The Federal Lands Strategy for Gigawatt-Scale AI Training Facilities

Anthropic’s first major policy recommendation centers on leveraging federal land ownership to accelerate gigawatt-scale AI training facility development. By utilizing Department of Defense, Department of Energy, and Bureau of Land Management properties, the federal government can bypass state and local zoning restrictions while accessing existing environmental analyses that streamline approval processes.

The federal lands approach offers multiple advantages for AI infrastructure development. Federal properties often have existing transmission infrastructure, cleared environmental reviews, and security protocols appropriate for sensitive AI research activities. Military bases, in particular, offer large contiguous areas with existing power infrastructure and security arrangements that align with the requirements of frontier AI development.

Bureau of Land Management territories provide additional opportunities, particularly in western states with abundant solar and geothermal resources. Roughly 40 gigawatts of hydrothermal power is available on western BLM lands, representing a natural match for AI training facilities that require stable, renewable energy sources. These locations also offer the geographic isolation that some AI researchers prefer for security and operational reasons.

The legal framework for federal land leasing provides mechanisms to accelerate development timelines. Federal sovereign immunity can shield projects from state and local regulatory processes, while existing environmental analyses for renewable energy projects can be leveraged to support AI facility approvals. This approach could compress multiyear approval processes into months while maintaining appropriate environmental and security oversight.

Accelerating NEPA — Programmatic Reviews, Categorical Exclusions, and National Security Exemptions

Reforming the National Environmental Policy Act implementation for AI infrastructure requires targeted legal strategies that balance environmental protection with national security imperatives. Anthropic proposes three specific approaches: programmatic environmental impact statements, categorical exclusions for certain AI facilities, and national security exemptions for critical infrastructure projects.

Programmatic environmental reviews represent the most immediately actionable approach. Rather than conducting separate environmental analyses for each AI facility, federal agencies can develop comprehensive programmatic reviews that address the environmental impacts of AI infrastructure development across multiple sites and projects. This approach, successfully used for renewable energy development, enables individual projects to “tier off” the broader analysis, dramatically reducing approval timelines.

Categorical exclusions offer another pathway for streamlined approvals. AI data centers that meet specific criteria—such as location on previously developed federal lands, use of renewable energy sources, and compliance with predetermined environmental standards—could qualify for categorical exclusion from full environmental impact statement requirements. This approach maintains environmental oversight while eliminating redundant analyses for low-risk projects.

National security exemptions provide the most direct mechanism for urgent projects. While politically sensitive, the legal authority exists under various federal statutes to exempt critical infrastructure projects from standard NEPA procedures when national security interests are at stake. The key challenge lies in establishing clear criteria for when such exemptions are appropriate and ensuring that environmental considerations are addressed through alternative mechanisms.

Fixing Transmission — DOE Partnership Authorities and the Case for Federal Siting Power

Transmission infrastructure represents perhaps the most complex regulatory challenge for AI infrastructure development, requiring coordination across multiple state jurisdictions with different regulatory frameworks, environmental requirements, and political priorities. Anthropic proposes leveraging existing federal authorities to streamline transmission development while building the case for expanded federal siting power.

Section 1222 of the Energy Policy Act provides immediate opportunities for federal transmission development. This authority enables the Department of Energy to partner with private entities to develop transmission infrastructure on federal lands, bypassing state regulatory processes that can delay projects for years. For AI facilities located on federal properties, this authority could enable dedicated transmission lines that connect directly to major grid interconnection points without navigating multiple state approval processes.

Section 40106 of the Infrastructure Investment and Jobs Act offers additional federal transmission authority, particularly for projects that cross state boundaries or connect to federal facilities. This authority has been underutilized but could provide a critical legal framework for the interstate transmission infrastructure required to support geographically distributed AI development. The key challenge lies in developing standardized procedures that leverage these authorities consistently across different regions and project types.

Federal sovereign immunity provides the most powerful tool for bypassing state regulatory bottlenecks. When transmission projects are developed under federal authority on federal lands, state regulatory agencies lose jurisdiction over key approval processes. This approach could enable transmission development timelines measured in months rather than years, particularly for projects that serve federal facilities or support national security infrastructure.

The Interconnection Crisis — Cutting Utility Timelines from Six Years to Two

Utility interconnection procedures represent the final regulatory bottleneck for AI infrastructure development, with current timelines of 4-6 years that reflect utilities’ conservative approach to grid stability and their limited experience with AI facility power characteristics. Anthropic proposes several targeted reforms to compress these timelines while maintaining grid reliability and safety standards.

Queue auctions could replace the current first-come, first-served interconnection process with market-based mechanisms that prioritize projects with the highest economic value and strongest development commitment. This approach would reduce speculative interconnection requests while ensuring that legitimate AI development projects receive priority consideration. The challenge lies in designing auction mechanisms that balance speed, cost, and technical feasibility considerations.
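The auction mechanism described above can be made concrete with a toy model: requests post a priority bid, speculative entries without a readiness deposit are filtered out, and the limited study slots go to the highest credible bidders. Everything here — the field names, the scoring rule, the deposit gate — is an illustrative assumption, not an actual tariff design.

```python
# Toy sketch of a queue auction for interconnection study slots.
# Illustrative assumptions throughout; not an actual utility tariff.

from dataclasses import dataclass

@dataclass
class Request:
    name: str
    bid_usd_per_mw: float   # willingness to pay for queue priority
    deposit_paid: bool      # readiness signal that filters speculative entries

def run_queue_auction(requests: list[Request], slots: int) -> list[str]:
    """Award the available study slots to the highest credible bidders."""
    credible = [r for r in requests if r.deposit_paid]   # drop speculative requests
    ranked = sorted(credible, key=lambda r: r.bid_usd_per_mw, reverse=True)
    return [r.name for r in ranked[:slots]]

queue = [
    Request("ai-campus-a", bid_usd_per_mw=900.0, deposit_paid=True),
    Request("speculative-x", bid_usd_per_mw=1500.0, deposit_paid=False),
    Request("ai-campus-b", bid_usd_per_mw=600.0, deposit_paid=True),
    Request("solar-farm-c", bid_usd_per_mw=400.0, deposit_paid=True),
]
print(run_queue_auction(queue, slots=2))  # ['ai-campus-a', 'ai-campus-b']
```

Note how the highest raw bid loses: without the deposit it never enters the ranking, which is exactly the speculative-request problem the auction is meant to solve.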

Automated resilience testing using AI-powered software tools could dramatically accelerate the technical studies required for interconnection approvals. Current manual analysis processes require months of engineering work to assess grid impacts and design appropriate interconnection facilities. Machine learning systems trained on historical interconnection studies could complete these analyses in days or weeks while maintaining accuracy and safety standards.

Peak-use consumption limits offer another approach to streamlined interconnections. AI facilities that commit to operating below specified power consumption levels during peak grid demand periods could qualify for expedited interconnection processes. This approach addresses utilities’ primary concern about grid stability while enabling AI facilities to operate at full capacity during off-peak hours when grid capacity is available. Smart grid technologies make this type of dynamic load management increasingly feasible and reliable.
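The peak-hour commitment reduces to a simple rule: cap the facility's draw whenever system demand crosses an agreed threshold, and run at full capacity otherwise. The sketch below simulates one day of that rule; the thresholds, facility sizes, and demand profile are illustrative assumptions, not utility practice.

```python
# Minimal sketch of a peak-use consumption limit: the facility honors a
# load cap during peak-demand intervals. All numbers are assumptions.

PEAK_THRESHOLD_MW = 90_000   # assumed system-wide demand level that triggers the cap
FULL_LOAD_MW = 2_000         # facility's normal draw (a 2 GW campus)
CURTAILED_LOAD_MW = 500      # committed ceiling during peak periods

def allowed_load(system_demand_mw: float) -> float:
    """Return the facility's permitted draw for the current interval."""
    if system_demand_mw >= PEAK_THRESHOLD_MW:
        return CURTAILED_LOAD_MW   # peak hours: honor the committed cap
    return FULL_LOAD_MW            # off-peak: run at full capacity

# One simulated day of hourly system demand (MW), peaking in the evening
day = [70_000] * 16 + [95_000] * 4 + [70_000] * 4
profile = [allowed_load(d) for d in day]
print(sum(profile) / 1000, "GWh drawn over the day")
```

In this toy day the facility still draws 42 of a possible 48 GWh, which is the bargain the article describes: the utility gets firm peak relief, the facility keeps most of its energy budget.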

The Nuclear Option — Defense Production Act and Federal Power Act as Last-Resort Authorities

When voluntary cooperation and regulatory reform prove insufficient, federal authorities provide legal mechanisms to compel infrastructure development for national security purposes. The Defense Production Act and Federal Power Act offer powerful but politically sensitive tools that could override state and local resistance to critical AI infrastructure projects.

Defense Production Act Title I provides authority to require businesses to prioritize contracts and orders necessary for national defense, including critical infrastructure components. For AI infrastructure development, this authority could compel utilities to prioritize interconnection processes, require suppliers to fulfill orders for critical grid components despite long lead times, and mandate that construction companies prioritize AI facility development over less critical projects.

Section 202(c) of the Federal Power Act grants the federal government authority to require utilities to construct or modify facilities during national emergencies. While traditionally used for temporary emergency measures, this authority could potentially support permanent infrastructure development when AI capabilities are deemed essential for national security. The legal and political challenges of using this authority would be significant, making it truly a last-resort option.

The practical implementation of these authorities would require careful legal and political preparation. Federal agencies would need to develop clear criteria for when national security interests justify the use of extraordinary powers, establish procedures for minimizing disruption to state and local governance, and create mechanisms for addressing legitimate environmental and community concerns even when normal regulatory processes are bypassed.

Pillar 2 — Building Nationwide Infrastructure for AI Deployment and Inference

While gigawatt-scale training facilities represent the most concentrated infrastructure challenge, AI deployment for inference applications requires distributed infrastructure development across the entire United States. This nationwide approach demands different regulatory strategies that work within existing state and local frameworks while accelerating approval processes for the thousands of smaller AI facilities required for national AI deployment.

National Interest Electric Transmission Corridors provide a mechanism for federal coordination of transmission infrastructure that supports AI deployment across multiple states. By designating transmission corridors that serve national AI deployment objectives, federal agencies can coordinate planning efforts, streamline environmental reviews, and provide financing mechanisms that support multi-state infrastructure projects. This approach has been successfully used for renewable energy development and could be adapted for AI infrastructure requirements.

Geothermal acceleration represents a particularly promising approach for distributed AI infrastructure. Unlike solar and wind resources that require extensive transmission infrastructure to reach load centers, geothermal resources are often located near suitable sites for AI facilities. Federal policies that accelerate geothermal development—including streamlined permitting on federal lands and risk-sharing mechanisms for exploration activities—could enable AI facilities to access clean, reliable baseload power without straining existing transmission infrastructure. Organizations planning renewable energy data center deployments can benefit from understanding these geothermal development frameworks.

Nationwide Clean Water Act permits could address one of the most time-consuming aspects of data center development: cooling system approvals. By developing programmatic permits that address the water use and discharge requirements of AI facilities, federal agencies could eliminate months of site-specific permitting processes while maintaining appropriate environmental protections. This approach would be particularly valuable for facilities that use advanced cooling technologies or closed-loop systems that minimize environmental impacts.

The 2028 Countdown — Notional Timelines from Site Selection to Operational Data Centers

Anthropic’s detailed timeline analysis reveals the compressed schedule required to meet 2028 AI infrastructure targets. The timeline begins with immediate site selection in July 2025 and extends through early 2028 completion targets for integrated facilities that include data centers, power generation, and transmission infrastructure. Every month of delay in the approval process compounds across the entire timeline, making early action critical for meeting national AI infrastructure objectives.

The site selection phase, targeted for completion by July 2025, requires rapid evaluation of candidate federal properties based on power availability, transmission access, security requirements, and environmental considerations. This phase must balance technical requirements with political feasibility, as site selections will determine which states and communities benefit from federal AI infrastructure investments. Early engagement with state and local officials during this phase could prevent later political obstacles that delay project development.

Environmental review completion by June 2026 represents the most challenging milestone in the timeline. Achieving this target requires implementing the NEPA acceleration strategies outlined in Anthropic’s recommendations: programmatic reviews, categorical exclusions, and coordinated agency processes. The alternative—traditional project-by-project environmental reviews—would extend timelines well beyond 2028 and potentially compromise America’s AI infrastructure objectives.

Construction completion by early 2028 assumes that regulatory approvals, environmental reviews, and financial commitments are in place by mid-2026. The construction phase itself requires unprecedented coordination between data center development, power generation installation, transmission line construction, and cooling system implementation. Supply chain considerations are particularly critical, given the three-year lead times for critical grid components and the limited domestic manufacturing capacity for specialized AI infrastructure equipment. Supply chain security initiatives must therefore begin immediately to support the 2028 timeline, potentially requiring Defense Production Act authorities to prioritize domestic manufacturing capacity for critical components.
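The milestones above can be expressed as plain date arithmetic to show how little slack remains. The specific days are my assumptions (the article gives only month-level targets, and "early 2028" is pinned here to January 2028 for illustration).

```python
# The 2028 countdown as date arithmetic. Milestone months come from the
# section above; exact days and the "early 2028" = Jan 2028 reading are
# assumptions for illustration.
from datetime import date

milestones = [
    ("site selection complete", date(2025, 7, 1)),
    ("environmental review complete", date(2026, 6, 1)),
    ("construction complete", date(2028, 1, 1)),
]

def months_between(a: date, b: date) -> int:
    """Whole calendar months from a to b."""
    return (b.year - a.year) * 12 + (b.month - a.month)

for (name_a, d_a), (name_b, d_b) in zip(milestones, milestones[1:]):
    print(f"{name_a} -> {name_b}: {months_between(d_a, d_b)} months")
```

Under this reading, the entire environmental review window is about 11 months and the integrated build-out about 19 — which is why the section treats every month of approval delay as compounding across the whole schedule.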

Frequently Asked Questions

Why does Anthropic say America needs 50 gigawatts of AI infrastructure by 2028?

Anthropic projects that training a single state-of-the-art AI model will require 5GW data centers by 2028-2030. With 20-25GW needed for frontier AI training across multiple locations, the US must build at least 50GW total capacity to maintain AI leadership against China’s rapidly expanding 400GW+ infrastructure buildout.

What are the main regulatory barriers slowing US AI infrastructure development?

Three major bottlenecks: 1) environmental reviews under NEPA that take years instead of months, 2) transmission siting across multiple state jurisdictions, with only 55 miles of high-voltage lines built in 2023 versus a 1,700-mile annual average during 2010-2014, and 3) utility interconnection processes that require 4-6 years, against China's 3-6 month construction permitting timelines.

How does China’s AI infrastructure buildout compare to America’s capacity?

China brought over 400GW of new generation capacity online last year, while the US added roughly one-tenth that amount. China’s construction permitting takes 3-6 months compared to America’s years-long processes, giving China a significant infrastructure development advantage in the AI race.

What federal solutions does Anthropic propose to accelerate AI infrastructure?

Two main pillars: 1) Lease federal lands (DOD/DOE/BLM) for gigawatt-scale training facilities with streamlined environmental reviews and federal transmission authority, and 2) Nationwide permitting reform using NEPA categorical exclusions, automated utility processes, and National Interest Electric Transmission Corridors.

What emergency powers could be used if voluntary measures fail?

The Defense Production Act Title I for requiring critical infrastructure components and Section 202(c) of the Federal Power Act for mandating utility interconnections during national emergencies. These provide last-resort authority to override state and local barriers for national security purposes.
