Kevin Peterson spent years working on self-driving cars before becoming CTO of Bedrock Robotics, where he's now building autonomous systems for construction equipment. The jump from highways to building sites isn't as strange as it sounds - and the lessons he brought with him are reshaping how industrial robotics actually gets deployed.
The core insight: self-driving cars and autonomous bulldozers share more constraints than differences. Both operate in unstructured environments where things change constantly. Both need to avoid hitting people and objects. And both face the same fundamental challenge - you can't test every scenario in the real world, so simulation becomes essential.
The Simulation Problem
Here's what makes industrial robotics harder than it looks. A construction site has vehicles, workers, moving equipment, changing terrain, and weather conditions. You can't program for every possibility. You need systems that learn from experience - but getting that experience safely is the bottleneck.
Peterson's approach combines real-world data collection with high-fidelity simulation. Real sites provide the edge cases and unexpected scenarios. Simulation provides the volume - thousands of hours of training in conditions that would be too dangerous, expensive, or slow to test physically.
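That division of labour can be sketched as a data-mixing step. Everything below is illustrative - the function names, the 25% real-data fraction, and the batch size are assumptions for the sketch, not Bedrock's actual pipeline:

```python
import random

def training_batch(real_logs, sim_generator, size=32, real_frac=0.25, seed=0):
    """Hypothetical mixing scheme: a fraction of each training batch comes
    from logged real-site edge cases; the rest comes from a simulator that
    supplies volume. The ratio is an assumed knob, not a known value."""
    rng = random.Random(seed)
    n_real = int(size * real_frac)
    # Replay logged real-world scenarios (the rare, surprising cases)...
    batch = [rng.choice(real_logs) for _ in range(n_real)]
    # ...and fill the rest with cheap simulated scenarios (the volume).
    batch += [sim_generator(rng) for _ in range(size - n_real)]
    rng.shuffle(batch)
    return batch
```

The point of the knob is that neither source alone suffices: real logs anchor the distribution, simulation provides the hours.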
The self-driving car industry learned this the hard way. Early systems relied too heavily on real-world testing, which was slow and risky. Later systems swung too far towards simulation, producing brittle models that failed in real conditions - the sim-to-real gap. The balance - real data informing realistic simulations - is what actually works.
Why Construction Equipment is Different
Autonomous cars need to navigate roads and avoid collisions. Autonomous construction equipment needs to build things while doing the same. That's a more complex task - the machine isn't just moving through space, it's manipulating that space according to plans that change during the project.
Peterson points out another critical difference: construction sites are GPS-denied or GPS-degraded environments. Buildings, excavations, and equipment block signals. That means systems can't rely on external positioning the way cars do. They need sensor fusion - combining cameras, lidar, IMUs (inertial measurement units), and local positioning systems to maintain accuracy.
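A minimal sketch of that fusion idea, reduced to one dimension: velocity from odometry/IMU drives the prediction, and a position fix (say, from a local beacon), when one arrives, corrects it. This is a textbook scalar Kalman filter, not Bedrock's stack; all noise values are assumed:

```python
def fuse(fixes, velocities, dt=0.1, drift_var=0.5, fix_var=4.0):
    """Scalar Kalman filter: dead-reckon position from velocity, then
    correct with a position fix whenever one is available (None = no fix)."""
    x, p = 0.0, 1.0                  # position estimate and its variance
    track = []
    for v, z in zip(velocities, fixes):
        x += v * dt                  # predict from odometry/IMU velocity
        p += drift_var * dt          # uncertainty grows while dead-reckoning
        if z is not None:            # a fix arrived: blend it in
            k = p / (p + fix_var)    # Kalman gain, weighted by variance
            x += k * (z - x)
            p *= 1 - k               # correction shrinks uncertainty
        track.append((x, p))
    return track
```

The key behaviour is in the `if`: when fixes stop arriving, the filter keeps running on prediction alone, and the variance term honestly reports the growing uncertainty.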
This is where simulation becomes even more valuable. You can model GPS dropout, test recovery strategies, and train systems to operate with degraded inputs - all without risking actual equipment or people on site.
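One thing a simulator can quantify cheaply, for instance, is how fast position error grows while dead-reckoning through a GPS outage. A toy Monte Carlo under assumed noise levels - nothing here reflects real equipment:

```python
import random

def drift_during_dropout(dropout_s, vel_noise_std=0.05, dt=0.1,
                         trials=1000, seed=0):
    """Monte Carlo estimate of mean absolute position drift while
    dead-reckoning on noisy odometry/IMU velocity, with no external
    fixes to correct it. Noise level and timestep are assumed values."""
    rng = random.Random(seed)
    steps = int(dropout_s / dt)
    errors = []
    for _ in range(trials):
        err = 0.0
        for _ in range(steps):
            # Each step's velocity noise integrates into position drift.
            err += rng.gauss(0.0, vel_noise_std) * dt
        errors.append(abs(err))
    return sum(errors) / trials
```

Running this for a range of dropout durations tells you how long a machine can safely operate blind before it must stop or re-localise - exactly the kind of recovery-strategy question that's dangerous to answer on a live site.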
Real-World Application
Bedrock's systems aren't fully autonomous in the "set and forget" sense. They're designed for supervised autonomy - operators monitor multiple machines, intervening when needed, but not manually controlling every movement. Think air traffic control rather than piloting.
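At its core, that air-traffic-control pattern reduces to a triage step. A hypothetical sketch - the field names and the confidence threshold are invented for illustration, not drawn from Bedrock's software:

```python
from dataclasses import dataclass

@dataclass
class MachineStatus:
    machine_id: str
    confidence: float      # model's self-reported confidence, 0..1 (assumed)
    anomaly: bool          # e.g. unexpected obstacle or sensor fault

def triage(statuses, confidence_floor=0.7):
    """Split machine reports into 'proceed autonomously' and 'flag for
    the operator': humans handle the exceptions, not every movement."""
    autonomous, needs_operator = [], []
    for s in statuses:
        if s.anomaly or s.confidence < confidence_floor:
            needs_operator.append(s.machine_id)
        else:
            autonomous.append(s.machine_id)
    return autonomous, needs_operator
```

The economics of supervised autonomy live in that split: the shorter the `needs_operator` list per hour, the more machines one person can watch.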
This model reflects a practical reality: construction work is too variable and high-stakes for unsupervised automation with current technology. But it's also too repetitive and labour-intensive not to automate. The middle ground - intelligent assistance with human oversight - is where the actual value lives.
The Scaling Challenge
What makes this work at scale is the feedback loop. Real-world deployments generate data that improves simulations. Better simulations produce more robust models. More robust models reduce the need for human intervention. That cycle compounds - slowly at first, then faster as the system accumulates experience.
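The shape of that compounding can be shown with a toy model. Every number here is made up purely to illustrate the dynamic - more deployment hours lower the intervention rate, which lets each operator supervise a bigger fleet, which collects hours faster:

```python
def flywheel(weeks=10, base_rate=1.0, learn=0.9):
    """Toy data-flywheel model. All constants are illustrative assumptions:
    rate decays with accumulated machine-hours, and the supervised fleet
    size scales inversely with the intervention rate."""
    fleet, hours = 3.0, 0.0
    history = []
    for _ in range(weeks):
        hours += fleet * 40                         # this week's machine-hours
        rate = base_rate * learn ** (hours / 100)   # more data, fewer interventions
        fleet = 3.0 / max(rate, 0.1)                # lower rate -> bigger fleet
        history.append((hours, rate))
    return history
```

The qualitative output matches the claim in the text: the weekly gain in machine-hours starts small and accelerates, because each improvement feeds the next.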
Peterson's background in self-driving cars gave him a head start on this loop. The automotive industry spent billions learning how to collect, label, and use real-world driving data. Construction robotics is following a similar path, but compressed - learning from those lessons rather than repeating them.
What This Means Practically
The construction industry faces a labour shortage and rising costs. Autonomous equipment won't solve that overnight, but it changes what's possible. One operator supervising three machines does more work than one operator controlling one machine. That's not sci-fi - it's happening on sites now.
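The arithmetic behind that claim, with deliberately made-up numbers: even if each supervised machine runs at only a fraction of an expert operator's pace, and supervision carries a per-machine cost, one person still gets more done:

```python
def operator_throughput(machines, autonomy_eff, supervision_overhead=0.1):
    """Illustrative arithmetic (all numbers assumed): per-operator output
    in units of 'one expert manually operating one machine', where each
    supervised machine runs at a fraction of that pace minus a small
    per-machine supervision cost."""
    return machines * autonomy_eff - machines * supervision_overhead

# e.g. 3 machines at 70% effectiveness -> 3*0.7 - 3*0.1 = 1.8x one manual machine
```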
The bigger shift is in capability. Work that's currently too dangerous for humans, or demands more precision than manual operation can deliver - grading slopes to millimetre tolerance on unstable ground - becomes routine. That's not about replacing workers; it's about doing work that wasn't previously practical.
Peterson's point throughout is that industrial robotics isn't about perfect autonomy. It's about practical augmentation - systems that combine machine precision with human judgement. That's the approach that scales. And judging by Bedrock's deployments, it's the approach that works.