Build AI in America: Anthropic’s Blueprint for Energy Infrastructure and National Security
Table of Contents
- Why Building AI Infrastructure in America Is a National Security Imperative
- The 50-Gigawatt Challenge: How Much Energy Does AI Actually Need by 2028?
- Training vs. Inference: Two Very Different Infrastructure Problems
- Why America’s Permitting System Is Failing the AI Buildout
- How China Is Outpacing the U.S. in Energy Infrastructure Development
- Federal Lands: The Untapped Solution for AI Data Center Siting
- Cutting Through NEPA: Accelerating Environmental Reviews Without Abandoning Them
- Fixing America’s Broken Transmission Grid for the AI Era
- The Interconnection Bottleneck: Getting Power to Data Centers Faster
- From Geothermal to Nuclear: The Energy Mix Powering America’s AI Future
- Securing the Supply Chain: Transformers, Cybersecurity, and Strategic Reserves
- A Timeline for Action: What Needs to Happen by When to Meet 2028 Targets
Key Takeaways
- 50 GW target: America’s AI sector needs at least 50 gigawatts of electric capacity by 2028 to maintain global leadership
- Executive action: The federal government already has legal authority to accelerate this buildout without new Congressional legislation
- Federal lands strategy: Using DOD/DOE land near BLM energy resources can bypass state zoning while maintaining environmental review
- China challenge: China added 400+ GW of power capacity last year versus roughly 40 GW in the U.S. – a 10:1 gap
- Interconnection crisis: Approval processes taking 4-6 years could be reformed through queue auctions and automated testing
- Supply chain risks: 3-year lead times for transformers and cybersecurity vulnerabilities in imported components threaten AI infrastructure
Why Building AI Infrastructure in America Is a National Security Imperative
America is facing its most consequential infrastructure challenge since the Interstate Highway System. But instead of connecting cities with asphalt, we need to power artificial intelligence with electricity – and the stakes couldn’t be higher for national security and economic competitiveness.
Anthropic’s comprehensive policy report frames AI infrastructure as fundamentally about maintaining democratic nations’ military and technological advantages in an increasingly competitive world. This isn’t just about corporate profits or technological prestige – it’s about ensuring that the most powerful AI systems are developed and controlled by countries that share our values.
The report positions energy as the ultimate bottleneck for AI leadership, arguing that current regulatory frameworks simply cannot deliver the massive scale of infrastructure needed quickly enough. While China’s coordinated approach to infrastructure development gives it significant advantages, America can leverage its unique strengths: federal lands, regulatory reform authority, and technological innovation.
The urgency is real and measurable. As Anthropic notes, preconstruction approvals must be cleared by 2026 for systems to operate by 2028. Every month of delay narrows the window for maintaining U.S. leadership in frontier AI development.
The 50-Gigawatt Challenge: How Much Energy Does AI Actually Need by 2028?
The numbers are staggering and growing rapidly. Anthropic projects that America’s AI sector will require at least 50 gigawatts of electric capacity by 2028. To put this in perspective, that’s roughly equivalent to the output of 50 large nuclear power plants or the entire current electricity demand of several states combined.
This estimate has grown dramatically as AI capabilities have scaled faster than expected. In 2023, global forecasts predicted only 14-19 GW of AI power demand by 2028. Now, external forecasters are projecting much higher numbers:
- Semianalysis: 80 GW globally by 2028, with 56 GW in the U.S.
- RAND Corporation: 117 GW globally by 2028
- Lawrence Berkeley National Lab: 74-132 GW for all U.S. data centers by 2028
The scale becomes even more dramatic when you consider individual training runs. Anthropic anticipates using 2 GW data centers in 2027 and 5 GW data centers in 2028 for single training runs. A 5 GW facility would be larger than America’s biggest nuclear plants, which max out at 4-4.5 GW capacity.
What makes this challenge particularly complex is that it’s not just about training models. Inference – actually running AI systems for users – is expected to require “roughly as much or more” compute and energy than training. This means the 50 GW figure might be conservative.
The distribution matters too. The U.S. accounts for almost half of global data center power use, making American infrastructure policy decisions globally consequential for AI development.
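The capacity comparisons above reduce to simple arithmetic. The sketch below checks them using the report's figures; the ~1 GW per-reactor estimate is an assumption for illustration, not a number from the report:

```python
# Back-of-the-envelope check of the capacity comparisons above.
# GW figures are from the report; per-reactor output is a rough assumption.

ai_demand_gw = 50        # projected U.S. AI capacity need by 2028
reactor_gw = 1.0         # typical large nuclear reactor output (~1 GW, assumption)
largest_plant_gw = 4.5   # America's largest nuclear plants top out at 4-4.5 GW
frontier_site_gw = 5.0   # single-site training facility anticipated by 2028-2030

reactor_equivalents = ai_demand_gw / reactor_gw
site_vs_plant = frontier_site_gw / largest_plant_gw

print(f"50 GW is roughly {reactor_equivalents:.0f} large reactors")
print(f"A 5 GW site is {site_vs_plant:.2f}x America's largest nuclear plant")
```

The second ratio is why the report treats single-site frontier training as a qualitatively new infrastructure problem: one facility would exceed the largest existing U.S. power plant.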
Training vs. Inference: Two Very Different Infrastructure Problems
One of the most insightful aspects of Anthropic’s analysis is recognizing that AI infrastructure actually encompasses two distinct challenges that require different solutions: concentrated gigawatt-scale facilities for training versus distributed networks for deployment.
Training infrastructure needs massive concentrations of power and compute in single locations. These facilities require:
- 5+ GW of power in a single location by 2028-2030
- Extremely high reliability and low latency connections between components
- Specialized cooling and power distribution systems
- Access to high-voltage transmission infrastructure
Inference infrastructure has different requirements:
- Distributed across many smaller facilities closer to users
- Lower power per site but much broader geographic coverage
- Different latency and reliability requirements
- Integration with existing cloud infrastructure
This distinction is crucial for policy because it means different regulatory approaches, siting strategies, and environmental considerations apply. Training facilities can be located in remote areas with abundant renewable energy, while inference infrastructure needs to be distributed near population centers.
Why America’s Permitting System Is Failing the AI Buildout
The core problem isn’t technology or financing – it’s the regulatory maze that can add years to infrastructure projects. America’s overlapping federal, state, and local permitting processes create what Anthropic describes as “yearslong delays that threaten U.S. competitiveness.”
The statistics paint a sobering picture:
- Interconnection approvals typically take 4-6 years for generation resources
- Transmission projects completed since 2005 have averaged more than 10 years from planning to completion
- Only 55 miles of high-voltage transmission lines were built in 2023, versus 1,700 miles annually from 2010-2014
- NEPA environmental impact statements can take multiple years
Meanwhile, China’s construction permitting timelines are 3-6 months versus years in the U.S. This isn’t necessarily about adopting China’s approach (which often sacrifices environmental protection and community input), but it illustrates the urgency of reform.
The report identifies several specific bottlenecks:
- Interconnection queues: First-come-first-served systems that reward speculative projects over serious developers
- Environmental review duplication: Multiple agencies conducting separate reviews of the same project
- State/local zoning conflicts: Local opposition that can kill projects regardless of federal support
- Grid planning coordination: Utilities making decisions in isolation rather than coordinating regionally
What makes this particularly frustrating for AI companies is that many of these delays are procedural rather than substantive. Projects that would ultimately be approved still face years of regulatory uncertainty.
How China Is Outpacing the U.S. in Energy Infrastructure Development
The comparison with China provides sobering context for the scale of America’s infrastructure challenge. China brought over 400 GW of new generation capacity online last year – roughly ten times what the U.S. added (excluding storage) in 2024.
China’s advantages in AI infrastructure development include:
- Centralized planning: The Eastern Data Western Computing Plan invested $6.1 billion in data centers over two years with coordinated siting and power development
- Streamlined permitting: Construction permits in 3-6 months versus years in the U.S.
- State coordination: Provincial governments compete to attract data centers with streamlined processes and infrastructure commitments
- Integrated energy planning: Power generation and transmission developed together rather than separately
However, Anthropic explicitly notes that the U.S. should not simply copy China’s approach. China’s system sacrifices environmental protection, community input, and worker safety that Americans value. The challenge is achieving speed without abandoning principles.
The report argues that America’s federal system actually provides unique advantages if properly leveraged:
- Federal lands: Direct federal control over siting decisions for 640 million acres
- Executive authority: Existing legal tools that don’t require new legislation
- Technology leadership: More efficient use of energy through superior AI algorithms and hardware
- Private sector efficiency: Market-driven innovation in energy and computing technologies
The key is coordination and urgency, not abandoning democratic processes or environmental protection.
Federal Lands: The Untapped Solution for AI Data Center Siting
Perhaps the most innovative recommendation in Anthropic’s report is leveraging federal lands to bypass state and local zoning challenges. This strategy could unlock massive energy resources while maintaining environmental review.
The federal government controls about 640 million acres – roughly 28% of all U.S. land. Much of this contains abundant renewable energy resources:
- BLM lands: Over 30 million acres covered by the 2024 Solar Programmatic EIS, ready for accelerated development
- Geothermal resources: Roughly 40 GW of hydrothermal power accessible on western BLM lands
- Military installations: Large secure areas with existing power infrastructure
- DOE facilities: Sites with specialized power and security infrastructure
The strategy involves co-locating data centers on federal land (particularly DOD/DOE sites) near BLM land available for power generation. This bypasses the most problematic bottleneck – state and local zoning authority that the federal government cannot control – while using NEPA environmental review that it can accelerate.
Specific advantages of this approach:
- Zoning certainty: Federal land use decisions override local zoning restrictions
- Security benefits: Military installations provide physical security for critical AI infrastructure
- Existing infrastructure: Many federal sites already have high-voltage transmission connections
- Environmental streamlining: Existing programmatic environmental reviews can be leveraged
The federal lands strategy isn’t without challenges. It requires coordination between multiple federal agencies, may face political opposition in affected states, and still needs substantial transmission infrastructure investment. But it provides a path forward that doesn’t depend on state and local cooperation.
Cutting Through NEPA: Accelerating Environmental Reviews Without Abandoning Them
The National Environmental Policy Act (NEPA) represents both a critical environmental protection and a major procedural bottleneck. Anthropic’s report proposes several reforms to accelerate NEPA review without abandoning environmental protection:
Programmatic Environmental Impact Statements: Rather than reviewing each project individually, agencies can conduct broad environmental reviews for entire categories of projects. The BLM’s 2024 Solar Programmatic EIS already covers over 30 million acres and could be extended to data center development.
Categorical exclusions: For projects that fit within pre-approved environmental parameters, agencies can use streamlined categorical exclusions rather than full environmental impact statements.
Existing environmental documents: Leveraging previous environmental reviews rather than starting from scratch for each project.
Time limits: Setting clear deadlines for agency decision-making rather than allowing open-ended review processes.
The report emphasizes that this isn’t about weakening environmental protection but about making the process more efficient and predictable. Environmental review would still occur, but within defined timelines and using existing analysis where appropriate.
Key principles for NEPA reform include:
- Front-load analysis: Comprehensive upfront environmental review that covers multiple future projects
- Agency coordination: Single environmental review shared across federal agencies rather than duplicative reviews
- Public engagement: Maintaining meaningful public input while avoiding indefinite delays
- Adaptive management: Ability to modify projects based on monitoring and new information
Fixing America’s Broken Transmission Grid for the AI Era
Even with faster permitting and better siting, America’s transmission grid remains a fundamental constraint on AI infrastructure development. The numbers are stark: only 55 miles of high-voltage transmission lines were built in 2023, versus 1,700 miles annually from 2010-2014.
The transmission challenge has several dimensions:
Aging infrastructure: Much of America’s grid was built decades ago for a different energy system with centralized fossil fuel plants rather than distributed renewables and massive data centers.
Regional planning gaps: Utilities plan transmission within their service territories, but AI infrastructure needs regional and national coordination.
Cost allocation disputes: Who pays for transmission upgrades – ratepayers, developers, or taxpayers – remains contentious and slows projects.
Technology gaps: Grid infrastructure designed for predictable baseload power may struggle with variable renewable generation and massive, concentrated AI loads.
Anthropic proposes several transmission reforms:
- DOE Section 1222 authority: Using existing federal authority to build transmission in National Interest Electric Transmission Corridors
- Regional planning mandates: Requiring utilities to plan transmission regionally rather than individually
- Cost allocation clarity: Clear rules that transmission costs should be allocated to developers rather than ratepayers
- Technology upgrades: Investing in high-capacity transmission technologies and smart grid infrastructure
The report notes that DOE already has $3.25 billion in Section 402 authority and $2.5 billion in Section 40106 authority for transmission lending, providing financial tools to accelerate critical projects.
The Interconnection Bottleneck: Getting Power to Data Centers Faster
Perhaps nowhere is the dysfunction of America’s energy system more apparent than in interconnection queues – the process by which new generation resources connect to the grid. Current first-come-first-served systems create perverse incentives that reward speculative projects over serious developers.
The problems are systemic:
- Queue clogging: Speculative projects hold positions in interconnection queues without serious development plans
- Study delays: Interconnection studies that should take months can stretch to years
- Serial processing: Projects studied individually rather than in clusters, multiplying time and cost
- Upgrade costs: Unexpected grid upgrade requirements that can make projects uneconomical
Anthropic proposes several reforms that utilities can implement:
Queue auctions: Rather than first-come-first-served, use competitive auctions that reward projects with serious development timelines and financial commitments.
Automated testing: Streamlined technical studies that can be completed quickly for straightforward projects.
Peak-use agreements: Allow data centers to connect with agreements to curtail usage during peak demand periods, reducing grid upgrade requirements.
Cluster studies: Evaluate multiple projects together to identify shared infrastructure needs and costs.
The report even suggests that if utilities won’t reform interconnection processes voluntarily, the President could invoke Defense Production Act authorities to require critical AI infrastructure connections. This signals how seriously Anthropic views interconnection as a national security bottleneck.
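To make the queue-auction idea concrete, here is a minimal sketch of ranking projects by demonstrated commitment rather than arrival order. The project names, scoring metrics, and weights are hypothetical illustrations, not details from the report:

```python
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    arrival_year: int     # when the project entered the queue
    deposit_musd: float   # at-risk financial commitment in $M (hypothetical metric)
    milestones_met: int   # development milestones completed (hypothetical metric)

def auction_rank(projects):
    """Rank by commitment signals instead of first-come-first-served."""
    return sorted(projects,
                  key=lambda p: (p.deposit_musd, p.milestones_met),
                  reverse=True)

queue = [
    Project("Speculative LLC", arrival_year=2021, deposit_musd=0.5,  milestones_met=0),
    Project("Serious Solar",   arrival_year=2024, deposit_musd=25.0, milestones_met=4),
    Project("Midway Wind",     arrival_year=2022, deposit_musd=10.0, milestones_met=2),
]

for p in auction_rank(queue):
    print(p.name)
```

Under first-come-first-served, the 2021 speculative entrant would be studied first despite having no serious development plan; commitment-based ranking moves the well-capitalized 2024 project to the front of the queue.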
From Geothermal to Nuclear: The Energy Mix Powering America’s AI Future
One of the most pragmatic aspects of Anthropic’s report is its “all of the above” approach to energy procurement. Rather than betting on a single technology, the report advocates for diversified energy sources that can deliver power quickly and reliably.
Solar and batteries are positioned as likely to be most economically efficient in the near term, particularly on federal lands with good solar resources and existing environmental review.
Geothermal emerges as a particularly promising option, with roughly 40 GW of hydrothermal power accessible on western BLM lands. Geothermal provides constant baseload power without intermittency issues, making it ideal for data centers.
Natural gas is acknowledged as likely necessary for near-term deployment, though the report emphasizes the importance of environmental review and community engagement.
Advanced nuclear is seen as potentially transformative in the longer term but likely not available at scale by 2028. The report notes that America’s largest nuclear plants max out at 4-4.5 GW capacity, smaller than the 5 GW single-site requirements Anthropic anticipates by 2028-2030.
The key principle is technology-neutral procurement that optimizes for speed, reliability, and cost rather than specific energy sources. Anthropic explicitly warns against restricting power procurement to particular technologies, noting this could slow deployment and increase costs.
The report also addresses the question of behind-the-meter generation – data centers generating their own power without connecting to the grid. While this might work for smaller facilities, physical and regulatory constraints make fully behind-the-meter generation impractical for 5 GW frontier training facilities.
Securing the Supply Chain: Transformers, Cybersecurity, and Strategic Reserves
America’s dependence on imported energy infrastructure components creates both supply chain vulnerabilities and cybersecurity risks that could undermine AI infrastructure development. The report identifies several critical concerns:
Manufacturing lead times: Domestic transformer and circuit breaker production has lead times of approximately 3 years, creating bottlenecks for rapid infrastructure deployment.
Import dependence: Much critical grid infrastructure is manufactured overseas, creating supply chain vulnerabilities and potential cybersecurity risks.
Cybersecurity threats: “Backdoors” in Chinese-made inverters and substations could create vulnerabilities in the physical infrastructure underlying AI systems.
Workforce shortages: Critical shortages of electricians and other skilled workers needed for infrastructure construction and maintenance.
Anthropic proposes several supply chain security measures:
- Strategic reserves: Government-maintained strategic reserves of transformers and circuit breakers to respond to emergencies or supply disruptions
- Domestic manufacturing incentives: Loan guarantees and other support for domestic manufacturers of critical grid components
- Cybersecurity requirements: Mandatory security standards for imported energy infrastructure components
- Workforce development: Training programs for electricians and other critical infrastructure workers
The cybersecurity dimension is particularly important given the national security framing of AI infrastructure. Compromised grid components could potentially be used to disrupt AI training or steal sensitive data, making supply chain security a critical component of overall AI security strategy.
A Timeline for Action: What Needs to Happen by When to Meet 2028 Targets
The most striking aspect of Anthropic’s report is its detailed timeline showing how little time remains to enable 2028 AI infrastructure targets. The window for action is closing rapidly:
2025 Actions:
- Data center site selection (July-December 2025)
- Begin NEPA programmatic review for data centers (July 2025)
- Launch interconnection queue reforms
- Initiate federal lands leasing processes
2026 Deadlines:
- Complete NEPA programmatic review (by June 2026)
- Clear all preconstruction approvals (by end of 2026)
- Begin data center and power infrastructure construction
- Complete transmission planning and permitting
2027-2028 Construction:
- Solar construction (July 2026-January 2028)
- Data center construction complete (early 2028)
- Grid interconnection and testing
- System commissioning and operation
The report emphasizes that preconstruction approvals must be cleared by 2026 for systems to operate by 2028. With construction requiring up to two years, the effective deadline for starting the approval process is now.
This timeline creates artificial but useful urgency. By anchoring policy recommendations to specific AI scaling trajectories and infrastructure lead times, the report transforms abstract policy debates into concrete deadlines with measurable milestones.
The report acknowledges that some recommendations would be “difficult or costly to exercise” and frames itself as informing deliberation rather than prescribing a single path. However, the timeline makes clear that delaying action effectively means accepting that America may not have the infrastructure needed for frontier AI development by 2028.
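The backward-scheduling logic behind these deadlines can be sketched in a few lines. Durations come from the report's timeline; rounding to whole years is a simplification for illustration:

```python
# Work backward from the 2028 operation target using the report's lead times.
operation_year = 2028
construction_years = 2   # construction can take up to two years
review_years = 1         # NEPA programmatic review: roughly July 2025 to June 2026

approvals_deadline = operation_year - construction_years   # all approvals cleared
review_start = approvals_deadline - review_years           # programmatic review begins

print(f"Clear preconstruction approvals by end of {approvals_deadline}")
print(f"Begin programmatic environmental review in {review_start}")
```

The arithmetic shows why the report treats 2025 as the effective start deadline: with two years of construction and a year of review, there is no slack left before the 2028 target.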
Frequently Asked Questions
How much energy will America’s AI sector need by 2028?
According to Anthropic’s report, the U.S. AI sector is on track to require at least 50 gigawatts of electric capacity by 2028. This includes 20-25 GW for frontier AI training alone, with inference requiring roughly the same amount or more. To put this in perspective, that’s equivalent to about 50 large nuclear power plants.
What is the federal lands strategy for AI data centers?
The strategy involves siting data centers on federal land (DOD/DOE) near BLM land available for power generation. This bypasses state and local zoning (which the federal government can’t control) while using NEPA review (which it can accelerate). BLM lands contain roughly 40 GW of accessible geothermal power.
How does China’s energy infrastructure development compare to the U.S.?
China brought over 400 GW of new generation capacity online last year versus roughly 40 GW in the U.S. – a 10:1 gap. China’s construction permitting takes 3-6 months compared to years in the U.S., and they invested $6.1 billion in data centers through their Eastern Data Western Computing Plan.
What are the main bottlenecks in U.S. AI infrastructure development?
Key bottlenecks include: interconnection approvals taking 4-6 years, transmission projects averaging 10+ years since 2005, NEPA environmental reviews taking multiple years, 3-year lead times for transformers and circuit breakers, and overlapping federal, state, and local permitting processes.
Can the executive branch solve this without new legislation?
Yes, according to Anthropic’s analysis, the executive branch already has legal authority to enable most of the needed buildout. Tools include DOE’s Section 1222 transmission authority, NEPA reform, federal lands leasing, Defense Production Act for critical bottlenecks, and interconnection queue reforms.
What energy sources does Anthropic recommend for AI data centers?
Anthropic advocates for an “all of the above” energy approach, explicitly warning against restricting procurement to specific sources. They suggest solar, batteries, and geothermal may be most economically efficient before advanced nuclear comes online, but emphasize technology-neutral procurement.