The Models Layer
The economics and growth trajectories of the frontier AI model companies (OpenAI, Anthropic, xAI), and how investors can gain exposure.
The companies building frontier AI models — OpenAI, Anthropic, xAI — are mostly private and not directly investable through public markets. Understanding their economics matters because it shapes demand across the rest of the AI stack.
Key Observations
Revenue Is Real
Anthropic reached a $30B+ annualised revenue run rate, up from $9B at the end of 2025. This growth reflects genuine enterprise willingness to pay for AI capabilities, not speculative hype. The trajectory is steep and accelerating as more organisations integrate AI into their workflows.
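As a quick sanity check on the figures above, the jump from a $9B to a $30B+ run rate can be put into perspective with simple arithmetic. A run rate here is taken to mean the latest month's revenue annualised; the assumption is a convention, not something stated in the text:

```python
# Run-rate arithmetic using the figures cited above.
start_run_rate = 9e9    # $9B annualised run rate
end_run_rate = 30e9     # $30B+ annualised run rate

# How many times larger the business became over the period.
growth_multiple = end_run_rate / start_run_rate

# If run rate = latest monthly revenue x 12 (an assumed convention),
# the implied current monthly revenue is:
implied_monthly = end_run_rate / 12

print(f"growth multiple: {growth_multiple:.1f}x")
print(f"implied monthly revenue: ${implied_monthly / 1e9:.1f}B")
```

A roughly 3.3x jump in run rate, implying about $2.5B of revenue per month at the top end.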
Lean Operating Structures
Frontier model companies operate with small headcounts — typically 5,000 to 8,000 employees. This means manageable fixed operating costs. Current losses are not driven by bloated headcount but by server costs for training, research, and inference. This cost structure is fundamentally different from previous technology waves where losses came from unsustainable hiring.
The Inference Shift
As models improve and inference scales, the serving side of the business will grow much larger than the training side. Training is a periodic, large upfront cost. Inference is a continuous, recurring revenue stream that scales with user adoption.
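The contrast between the two cost structures can be sketched with a toy model: training as a fixed periodic outlay, inference revenue and serving cost both scaling with adoption. Every number below (price per user, serving cost per user, training budget) is an illustrative assumption, not company data:

```python
# Toy model of the training-vs-inference economics described above.
# All figures are illustrative assumptions, not actual company data.

def annual_profit(users: float,
                  price_per_user: float = 240.0,        # assumed $/user/year
                  serving_cost_per_user: float = 80.0,  # assumed inference $/user/year
                  training_cost: float = 5e9) -> float: # assumed fixed annual outlay
    """Training is a fixed cost; inference revenue and serving cost
    both scale linearly with the user base."""
    revenue = users * price_per_user
    inference_cost = users * serving_cost_per_user
    return revenue - inference_cost - training_cost

# As adoption grows, recurring inference margin overtakes the fixed training bill.
for users in (10e6, 50e6, 100e6):
    print(f"{users / 1e6:.0f}M users -> profit ${annual_profit(users) / 1e9:.1f}B")
```

Under these assumed numbers the business loses money at low adoption and turns profitable somewhere between 10M and 50M users, which is the structural point: the fixed training cost is amortised over a recurring, per-user inference margin.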
Since most frontier model companies are private, public market investors gain exposure through the infrastructure companies that provide the compute these models run on. Every dollar of model company revenue flows partly into hyperscaler and chip company revenue.
The Broadband Comparison Is Wrong
Sceptics draw parallels to the dot-com bubble, comparing AI infrastructure spending to the overbuild of broadband capacity in the late 1990s. This comparison misses a structural difference: the bull case for AI does not rest on speculative future demand. It rests on inference becoming a large, profitable, recurring revenue business — and the revenue is already materialising at scale.
The most important metric for the models layer is not training cost or benchmark performance. It is revenue growth and the trajectory of inference demand. These indicate whether AI is crossing from research curiosity to essential business tool.
Related
- AI Stack Framework — Where the models layer sits in the broader value chain
- The Infrastructure Layer — The hyperscalers providing compute for model training and inference
- Inference Scaling — The economics of inference and why it drives recurring revenue