Aria Networks raised $125 million in its first funding round to build networking infrastructure purpose-built for data centers running AI and agent workloads. The round was backed by Sutter Hill Ventures, Atreides Management, Valor Equity Partners, and Eclipse Ventures, according to Tech Startups and confirmed by Reuters.

The Palo Alto startup, founded in 2025, reached production deployments within 15 months of its founding.

The Problem: Networks as Bottleneck

The pitch centers on a specific bottleneck. GPU-dense data centers are scaling compute capacity rapidly, but the networking layer connecting those GPUs was designed for general-purpose traffic, not the burst patterns and multi-hop communication that AI inference and agent orchestration produce.

“The race to scale AI has exposed a quiet bottleneck: the networks that move data inside modern data centers,” Tech Startups reported.

Aria’s “Deep Networking platform” treats the network as an active participant in workload optimization rather than a passive pipeline. The system is designed to work across AI chips from multiple vendors, including Nvidia and Google, allowing operators to upgrade hardware or switch vendors without rebuilding their network stack, according to the Tech Startups report.

Token Efficiency as a Metric

Aria frames its value proposition around “token efficiency,” measuring how much useful AI output a data center produces relative to the cost of running it. In practical terms, the company optimizes how data moves between model inference endpoints, reduces latency for multi-hop agent calls, and adapts routing in real time as workloads shift.
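The article does not define the metric formally, but the idea can be sketched as useful output per dollar of operating cost. The function and figures below are hypothetical illustrations, not Aria's actual formula:

```python
# Illustrative sketch of a "token efficiency" style metric: useful AI output
# relative to the cost of running the data center. The definition and the
# example numbers are assumptions for illustration, not Aria's metric.

def token_efficiency(useful_output_tokens: int, total_cost_usd: float) -> float:
    """Tokens of useful model output produced per dollar of operating cost."""
    if total_cost_usd <= 0:
        raise ValueError("cost must be positive")
    return useful_output_tokens / total_cost_usd

# Hypothetical cluster: 2 billion useful tokens at $50,000 operating cost.
print(token_efficiency(2_000_000_000, 50_000.0))  # tokens per dollar
```

Under a metric like this, reducing network-induced stalls raises efficiency even with no change to the GPUs themselves, which is the lever Aria claims to pull.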

The concept matters as agent workloads are increasingly distributed across multiple GPU nodes and cloud providers like CoreWeave, Lambda, and Crusoe. A single agent request might trigger inference calls across several endpoints, making the networking layer between them a performance-critical path.
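Why the network becomes the critical path is easy to see with a back-of-the-envelope model: when an agent request chains sequential inference calls across endpoints, it pays one network round trip per hop, and those hops compound. The numbers below are hypothetical:

```python
# Hedged sketch of multi-hop agent latency. A chained agent request pays
# per-hop network latency on top of inference time; the values here are
# made-up illustrations, not measurements of any real deployment.

def request_latency_ms(inference_ms: list[float], network_hop_ms: float) -> float:
    """Total latency for sequential inference calls, one network hop per call."""
    return sum(inference_ms) + network_hop_ms * len(inference_ms)

# Four chained inference calls at 200 ms each:
compute_only = request_latency_ms([200] * 4, 0)    # 800 ms of pure inference
with_network = request_latency_ms([200] * 4, 50)   # 1000 ms: hops add 25%
```

Even modest per-hop latency inflates end-to-end response time once requests fan out across several endpoints, which is the overhead adaptive routing aims to shrink.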

Speed to Market

The company emphasized its pace. “In just over a year, we have gone from founding to funding to fielding customers in production,” Aria wrote in a blog post cited by Tech Startups. Atreides managing partner Gavin Baker joined the board alongside Stefan Dyckerhoff of Sutter Hill and Aria’s founding team.

Infrastructure Stack Deepening

The $125 million round fits a broader pattern of capital flowing into every layer of the agent infrastructure stack. In the past week alone, CoreWeave signed a multi-year GPU capacity deal with Anthropic, Sarvam AI neared closing $350 million for sovereign AI infrastructure in India, and Trent AI raised $13 million for agent runtime security. The signal is consistent: building and running agents at scale requires specialized infrastructure across models, deployment, security, and now networking. For teams operating agent fleets across distributed compute, the networking layer is becoming the next optimization surface.