Introduction
Welcome back to Laboratory. This week we look at a new report from Epoch AI, written by David Owen and titled “What will AI look like in 2030?”. Published on Sep 16, 2025, it sketches a striking picture: if AI scaling holds through 2030, we are looking at $100B training clusters, gigawatt power draws, and 10^29 FLOP runs that turn today’s demos into working scientific assistants. The bet is simple and radical: capex buys capability, capability buys revenue, and revenue funds the next rung up the scale ladder.
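To get a feel for how gigawatt draws and 10^29 FLOP fit together, here is a rough back-of-envelope sketch. Every number below is an illustrative assumption of ours, not a figure from the report:

```python
# Back-of-envelope: how much training compute can a ~2 GW campus deliver?
# All numbers are illustrative assumptions, not figures from the Epoch AI report.

site_power_w = 2e9          # assumed: a 2 GW campus
power_to_chips = 0.5        # assumed: fraction of site power reaching accelerators
flop_per_joule = 4e12       # assumed: near-future accelerator energy efficiency
utilization = 0.4           # assumed: realized model FLOP utilization
run_seconds = 180 * 86400   # assumed: a ~6-month training run

effective_flops = site_power_w * power_to_chips * flop_per_joule * utilization
total_flop = effective_flops * run_seconds
print(f"{total_flop:.2e} FLOP")  # → 2.49e+28 FLOP
```

Under these assumptions a single 2 GW site lands around 2.5 × 10^28 FLOP, still short of 10^29, which is one way to see why the report treats multi-gigawatt supply and distributed training across sites as central rather than optional.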
In this briefing, we’ll chart:
Scale-as-Strategy: Why continued revenue growth and benchmark trends make hundred-billion-dollar training investments rational, and why training and inference compute are likely to co-scale rather than crowd each other out.
Power is Product: How securing gigawatt-class supply, distributing training across sites, and building solar-plus-batteries or off-grid gas turn energy procurement into a core moat.
Data Engines, Not Datasets: Why human-generated corpora carry training to roughly 2027, and how synthetic data and reasoning-aligned curricula become the new flywheel for model quality after that.


