Antioch, a New York-based startup building simulation tools for robotics and autonomous systems developers, has raised an $8.5 million seed round at a $60 million valuation, TechCrunch reported today. The round was co-led by venture firms A* and Category Ventures, with participation from MaC Venture Capital, Abstract, Box Group, and Icehouse Ventures.
The company addresses what the robotics industry calls the sim-to-real gap: making virtual environments realistic enough that autonomous agents trained in simulation can operate reliably in the physical world.
“It’s the first time you can have autonomous agents iterate on a physical autonomy system, and actually close the feedback loop,” CEO Harry Mellsop told TechCrunch.
The Product
Antioch lets robot builders spin up multiple digital instances of their hardware and connect them to simulated sensors that replicate the data a robot’s software would receive in the real world. Developers can test edge cases, run reinforcement learning, or generate new training data, all without building physical test environments, according to TechCrunch.
The platform starts with physics models built by NVIDIA, World Labs, and others, then builds domain-specific libraries to make them accessible to robotics teams. The company’s current focus is on sensor and perception systems, which account for the bulk of simulation needs in automated vehicles, farm and construction machinery, and aerial drones, per TechCrunch’s reporting.
The fidelity challenge is core to the product: if the simulated physics doesn't match reality, models trained in simulation fail when deployed on physical hardware. Working across multiple customers gives Antioch a cross-domain refinement advantage that no single robotics company could match by building simulation in-house, according to the company.
The Team and Market Context
Mellsop founded Antioch in May 2025 with four co-founders. Two, Alex Langshur and Michael Calvey, previously co-founded Transpose, a security and intelligence startup acquired by Chainalysis. The other two, Collin Schlager and Colton Swingle, came from Meta Reality Labs and Google DeepMind respectively, according to TechCrunch.
The simulation need is already validated at the top of the market. Waymo uses Google DeepMind’s world model to test its driving system in simulation, reducing the data collection required to deploy in new cities. But hyperscaler-built simulation tooling isn’t available to the broader market. “The vast majority of the industry doesn’t use simulation whatsoever,” Mellsop told TechCrunch.
Angel investor Adrian Macneil, who built Cruise’s data infrastructure before founding robotics data company Foxglove, backed the round. “Simulation is really important when you’re trying to build a safety case or dealing with very high-accuracy tasks,” Macneil said at the Ride.AI conference in San Francisco this week, per TechCrunch. “It’s not possible to drive enough miles in the real world.”
“What happened with software engineering and LLMs is just starting to happen with physical AI,” said Çağla Kaymaz, a partner at Category Ventures, per TechCrunch. “In the physical world, the stakes are much higher.”
Early Use Cases
David Mayo, a researcher at MIT's Computer Science and Artificial Intelligence Laboratory, is using Antioch's platform to evaluate LLMs in physical contexts. In one experiment described by TechCrunch, Mayo has AI models design robots, then tests the designs in Antioch's simulator, including pitting models against each other in simulated physical contests. The approach offers a new benchmarking paradigm for AI models operating in physical environments.
While the startup pitches primarily to other startups, some of its earliest engagements have been with large multinationals already investing heavily in robotics, according to TechCrunch. Mellsop projects that within two to three years, “anyone building an autonomous system for the real world is going to do so in software primarily.”