Santa Clara: Every conversation about AI infrastructure starts with chips. It rarely ends there. Behind every GPU cluster training a frontier model sits a web of networking hardware that determines how fast those chips actually talk to each other.
That layer, unglamorous, invisible to most users, and increasingly mission-critical, is what Nexthop AI is building. The company has just closed a $500 million Series B at a $4.2 billion valuation, with the round oversubscribed before it closed.
Lightspeed Venture Partners led the round. Andreessen Horowitz joined as a major new investor. Altimeter Capital and all existing backers also participated.
The raise is a sharp jump from Nexthop's $110 million launch round in 2025. Lightspeed led that round too, alongside Kleiner Perkins and Battery Ventures. That both firms returned, and that a16z chose this moment to enter, signals strong conviction.
Nexthop was founded in 2024 by Anshul Sadana. He spent 17 years at Arista Networks as COO, scaling its revenue from zero to over $5 billion. That pedigree shapes everything about how the company builds.
Nexthop does not make general-purpose networking gear. It builds switching systems for large AI clusters. Thousands of GPUs must exchange data at low latency, continuously, without dropping packets. Its products run on open-source operating systems and are co-developed directly with hyperscale customers.
The timing is not accidental. Hyperscalers are projected to spend $650 billion on AI data centers and related infrastructure in 2026 alone. That spending wave is creating a generation of infrastructure startups that are not building AI models but the physical systems those models cannot run without.
The AI networking market is projected to reach $100 billion by 2031, according to SemiAnalysis, and the companies building its plumbing are only beginning to attract the attention they deserve.
