Anthropic has signed a $1.8 billion computing deal with Akamai Technologies spanning seven years, according to Bloomberg. The deal gives Anthropic access to Akamai’s distributed infrastructure across 4,300 locations in 700 cities and 130 countries for AI inference workloads. Akamai stock surged 20% on the announcement.
What the Deal Covers
Akamai CEO Tom Leighton confirmed the commitment in the company’s Q1 earnings press release on Thursday, describing the customer as a “leading frontier model provider” without naming Anthropic directly. Bloomberg identified the provider as Anthropic, citing people familiar with the matter.
“We operate the world’s most distributed platform, and we have our infrastructure in 4,300 places, 700 cities in 130 countries,” Leighton told CNBC. “Now we’re using it to support AI so our customers’ agents and AI apps can run right near their users and the data, and provide a much faster experience.”
Akamai’s cloud infrastructure revenue grew 40% year over year to $95 million in Q1, according to CNBC. Total quarterly revenue rose 6% to over $1 billion. The company projected Q2 revenue between $1.08 billion and $1.1 billion.
Third Major Infrastructure Deal in Two Weeks
The Akamai commitment marks Anthropic’s third major infrastructure deal disclosed this month. On May 7, Anthropic confirmed a partnership with SpaceX to use all compute capacity at the Colossus 1 data center, a 300MW facility with 220,000 GPUs. That same day, reporting confirmed Anthropic had committed to spending $200 billion on Google Cloud services and TPU capacity over five years.
The Akamai deal serves a different function than the SpaceX and Google commitments. Where Colossus and Google Cloud provide raw training and large-scale inference compute, Akamai’s distributed edge network positions Claude inference closer to end users globally. Akamai CTO Robert Blumofe told CNBC last week that the company already operates an AI inference cloud and plans to expand further.
Why Edge Compute Matters for Agents
Latency is a binding constraint for autonomous agents that execute multi-step workflows. An agent running a 15-step task accumulates round-trip delays on every inference call. Distributing inference to 130 countries reduces that overhead for enterprise customers operating globally.
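The arithmetic here is straightforward and worth making concrete. The sketch below uses entirely hypothetical round-trip times (a distant centralized region vs. a nearby edge location); the specific numbers are illustrative assumptions, not measured figures for any provider:

```python
# Back-of-the-envelope: network overhead for a multi-step agent.
# All latency figures below are hypothetical, for illustration only.

def agent_network_overhead(steps: int, rtt_ms: float) -> float:
    """Total round-trip network overhead in milliseconds,
    assuming one inference call per agent step."""
    return steps * rtt_ms

# Assumed RTTs: ~180 ms to a distant centralized region,
# ~30 ms to a nearby edge point of presence.
central = agent_network_overhead(steps=15, rtt_ms=180)  # 2700 ms
edge = agent_network_overhead(steps=15, rtt_ms=30)      # 450 ms

print(f"central: {central:.0f} ms, edge: {edge:.0f} ms, "
      f"saved per task: {central - edge:.0f} ms")
```

Under these assumed numbers, a single 15-step task sheds over two seconds of pure network wait; an agent fleet running thousands of such tasks compounds the difference accordingly. Model inference time itself is unchanged; only the per-call transit cost shrinks.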
“I think we’ve been undervalued for a while, and investors have been looking for some real validation that our different approach is going to pay off,” Leighton told CNBC. “We’re going to be in a great position to enable and secure the new AI economy.”
Anthropic’s Compute Appetite
Combined, Anthropic’s disclosed infrastructure commitments now exceed $200 billion across Google Cloud, SpaceX, and Akamai. The company is also reportedly exploring a $50 billion fundraising round that could value it near $1 trillion, according to the Financial Times. Much of that capital would fund further compute expansion.
For Akamai, the deal validates a strategic pivot from content delivery into AI infrastructure. Shares are now up 65% over the past 12 months.