London: Aria Networks has raised USD 125 million to build networking infrastructure designed specifically for AI workloads. The Palo Alto startup launched its Deep Networking platform alongside the funding, marking its debut after just 15 months in operation.
The round was backed by Sutter Hill Ventures, Atreides Management, Valor Equity Partners, and Eclipse Ventures. Gavin Baker of Atreides and Stefan Dyckerhoff of Sutter Hill joined the board. Founder Mansour Karam previously built Apstra, an intent-based networking startup sold to Juniper Networks for approximately USD 190 million in 2021.
Aria’s pitch is straightforward. Today’s data centre networks were not built for AI. General-purpose infrastructure creates bottlenecks that limit how efficiently AI clusters process workloads. Aria builds hardware and software designed from the ground up for this problem.
The platform works across AI chips from Nvidia, Google, AMD, and others. That hardware-agnostic approach lets operators upgrade or switch chips without rebuilding their network stack, a meaningful promise at a time when chip supply and model demands shift constantly. Deep Networking is already live and serving customers.
The company centres its performance claims on a metric it calls token efficiency: how much useful AI output a data centre produces relative to its cost. Karam told Network World that the network directly affects every component in an AI cluster.
Aria’s own modelling suggests that a 3% improvement in model utilisation across a 10,000-accelerator cluster translates to roughly USD 49.8 million in annual revenue gains.
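Working backwards from Aria's own figures gives a sense of the assumptions behind that claim. The sketch below is our inference, not Aria's published model: it assumes revenue scales linearly with utilisation and that the cluster runs continuously (8,760 hours per year).

```python
# Back-of-envelope check on Aria's claim, using only the figures in the
# article: a 3% utilisation gain on a 10,000-accelerator cluster is said
# to be worth ~USD 49.8M per year. Linearity and 24/7 operation are
# assumptions, not stated by Aria.
revenue_gain_usd = 49.8e6      # annual gain attributed to the improvement
utilisation_uplift = 0.03      # 3% improvement in model utilisation
accelerators = 10_000          # cluster size
hours_per_year = 8_760         # assumes continuous operation

# Implied total annual revenue at full utilisation for the whole cluster.
implied_cluster_revenue = revenue_gain_usd / utilisation_uplift  # ~USD 1.66B

# Implied revenue per accelerator-hour.
per_accelerator_hour = implied_cluster_revenue / (accelerators * hours_per_year)

print(f"Implied revenue per accelerator-hour: ${per_accelerator_hour:.2f}")
# → roughly $18.95/hour, i.e. the claim presumes each accelerator earns
# about that much when fully utilised.
```

Whether $19 per accelerator-hour is realistic depends entirely on the hardware and pricing model, which the article does not specify; the point of the exercise is only to make the claim's implicit assumptions visible.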
The platform collects telemetry at microsecond granularity directly from switch ASICs, far higher resolution than traditional network monitoring. That data feeds adaptive tuning of load balancing and congestion logic in real time. Aria also exposes an MCP (Model Context Protocol) server, letting external systems query network state and fold it into their own decision-making.
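Aria has not published how its tuning works, but the general pattern the article describes, high-frequency queue telemetry driving live adjustment of congestion parameters, can be sketched generically. Everything below is illustrative: the class, parameter names, and thresholds are hypothetical and do not represent Aria's implementation.

```python
from collections import deque

class CongestionTuner:
    """Illustrative sketch (not Aria's algorithm): adapt an ECN-style
    marking threshold from microsecond-granularity queue-depth samples.

    All names and constants here are hypothetical, chosen only to show
    the telemetry -> retune feedback loop described in the article."""

    def __init__(self, threshold_cells: int = 1000,
                 lo: int = 200, hi: int = 4000, window: int = 1000):
        self.threshold = threshold_cells          # current marking threshold
        self.lo, self.hi = lo, hi                 # clamp bounds
        self.samples = deque(maxlen=window)       # recent queue-depth samples

    def ingest(self, queue_depth: int) -> None:
        """Record one telemetry sample (e.g. read from a switch ASIC)."""
        self.samples.append(queue_depth)

    def retune(self) -> int:
        """Nudge the threshold toward current load: lower it when queues
        run deep (signal congestion earlier), raise it when they sit
        shallow (avoid throttling bursty AI collectives needlessly)."""
        if not self.samples:
            return self.threshold
        avg = sum(self.samples) / len(self.samples)
        if avg > 0.8 * self.threshold:
            self.threshold = max(self.lo, int(self.threshold * 0.9))
        elif avg < 0.2 * self.threshold:
            self.threshold = min(self.hi, int(self.threshold * 1.1))
        return self.threshold

# Usage: feed in deep-queue samples and watch the threshold tighten.
tuner = CongestionTuner()
for _ in range(100):
    tuner.ingest(900)        # sustained deep queues
print(tuner.retune())        # threshold drops below its initial 1000
```

A real control loop would run per-port at line rate inside or alongside the switch, but the shape, ingest telemetry, compare against a target, adjust a knob, is the same.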
