If you have been following Anthropic’s growth and wondering why its models sometimes go down, or why response times slow under heavy load, Dario Amodei gave the clearest answer yet on Wednesday.
Speaking at Anthropic’s developer conference in San Francisco on May 6, the CEO said the company had planned for roughly ten times growth in the first quarter of 2026. Instead, usage and revenue grew eightyfold on an annualized basis. “That is the reason we have had difficulties with compute,” Amodei told the audience. “You can’t plan for 80x.”
It is a remarkable problem to have — and it tells us something important about where enterprise AI actually is right now.
The Numbers Behind the Crunch
Anthropic’s compute shortfall is not a story about bad planning. It is a story about how fast enterprise adoption of AI has accelerated beyond what even the best-positioned insiders expected.
The growth trajectory tells the story: Anthropic was running at roughly $1 billion in annualized revenue at the start of 2025. By the end of 2025, that had reached $9 billion. By April 2026, the company crossed $30 billion in annualized run-rate revenue — surpassing OpenAI for the first time in both companies’ histories.
More than 1,000 businesses are now spending at least $1 million a year with Anthropic. That number doubled in under two months after the company’s Series G announcement in February. Roughly 70 percent of the Fortune 100 use Claude in some capacity. And Claude Code, the AI coding tool, has grown to roughly $2.5 billion in annualized revenue within months of general availability.
These are not incremental adoption numbers. These are structural shifts in how businesses are spending on software and AI.
Why Compute Is Now the Constraint
The AI industry has spent three years talking about model quality as the key differentiator. The conversation is shifting.
When every major AI vendor has models that are genuinely capable, the question businesses now ask is: can you actually deliver at enterprise scale, consistently, with predictable pricing?
Anthropic’s compute crunch highlights a tension that will shape the market over the next twelve to eighteen months. Enterprise demand has accelerated faster than infrastructure can follow. The company is moving aggressively to address this — signing a deal with SpaceX to access all available compute at the Colossus 1 data center in Memphis (more than 300 megawatts of capacity), and locking in a multi-billion dollar infrastructure arrangement with Amazon. These are not small bets. They are the kinds of capital commitments that signal Anthropic expects demand to keep running ahead of supply for the foreseeable future.
What This Means for Businesses Choosing AI Partners
If you are a business leader evaluating AI vendors or expanding your current usage, the Anthropic compute story carries a few practical implications.
Demand is real, not hype. An 80x growth figure at this scale is not marketing language. It reflects actual enterprise contracts, actual API usage, and actual spending decisions by real companies. The AI adoption wave is past the “pilot project” stage for most enterprise buyers.
Supply constraints will affect pricing and availability. When demand outpaces infrastructure, prices tend to stay elevated or increase. Businesses that lock in enterprise agreements now are better positioned than those that wait until compute economics stabilize.
Enterprise-first design matters. Anthropic’s revenue split is roughly 80 percent enterprise, 20 percent consumer. That composition reflects a deliberate choice to build for reliability, security, and governance requirements that enterprise buyers demand. For businesses looking for AI that will actually work in production workflows — not just demos — that enterprise-first orientation is worth understanding when choosing a vendor.
The enterprise AI race is not over, but it is concentrating. The large majority of Fortune 100 companies are now Claude customers. That installed base creates switching costs and integration depth that compound over time. A few large-scale vendors are building meaningful separation from the rest of the field.
The EDNA Perspective
At Enterprise DNA, we track AI adoption closely because our work sits at the intersection of the two things the current market values most: deep data expertise and AI deployment capability.
What the Anthropic numbers confirm is something we have been telling business clients for months. The window where companies can adopt AI on a casual, exploratory basis and still catch up is narrowing. Enterprise AI spend has moved from discretionary line items to core infrastructure budgets at companies like JPMorgan, which recently reclassified its AI investments alongside cybersecurity and payment systems.
The businesses that treated AI seriously in 2024 are seeing compounding returns in 2025 and 2026. The businesses still waiting for the “right time” are watching the gap widen.
If you are not sure where to start, or you have started but are not seeing the outcomes you expected, that is exactly what Omni Advisory exists to help with. We work with business leaders to build AI strategies that are grounded in real operational context — not vendor demos and conference slides.
The compute crunch at Anthropic is not a problem for most businesses to solve directly. But it is a signal worth paying attention to: the companies deploying AI at scale are running out of capacity because they built something that works. That is the bar your AI strategy needs to reach.
Want to understand how AI can actually work inside your business operations? Book a discovery call with the Enterprise DNA team and we will walk through what a practical AI deployment looks like for your context.
Source
CNBC