Robots That Work With People, Not Instead of Them

Today's Overview

This week reveals a quiet shift in how we're thinking about robots and AI systems. It's no longer about replacing humans; it's about building tools that humans actually want to work alongside.

Warehouse Robots That Don't Alienate Workers

Robust.AI's Anthony Jules is speaking at the Robotics Summit on something most automation companies avoid: what makes a robot that people actually want to use? The conversation isn't about throughput or cost savings. It's about designing systems that reduce strain on workers, integrate with existing infrastructure without requiring expensive overhauls, and feel like partners rather than replacements. Carter, the company's mobile robot, was built around this principle: it's designed to work *with* warehouse staff on picking and putaway tasks, not to eliminate those jobs entirely.

This matters because most automation failures happen not because the technology doesn't work, but because workers resent it. The companies that figure out human-centred automation will win. The ones that don't will watch their robots gather dust.

The Hidden Architecture Behind AI Coding Tools

Sebastian Raschka published something genuinely useful this week: a breakdown of how coding agents actually work. When you use Claude Code or Cursor, you're not just talking to a model. You're using a sophisticated system with six core components: live repo context, prompt caching, structured tool use, context reduction, session memory, and bounded delegation. Each piece matters. Most of the apparent "intelligence" of these systems comes not from the model itself, but from how the harness manages context, keeps files in scope, and prevents the model from hallucinating nonsense. A good harness can make a mid-tier model outperform a raw, more powerful one.
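To make the six components concrete, here is a minimal sketch of what such a harness loop might look like. This is an illustration, not any vendor's actual implementation: every name in it (`run_agent`, `call_model`, `reduce_context`, the tool registry) is hypothetical, and real harnesses are far more elaborate.

```python
import json

# Hypothetical harness sketch. The comments map each piece to one of the
# six components named above; none of these names are a real vendor API.

MAX_CONTEXT_CHARS = 8_000  # context reduction: hard budget on what we send


def reduce_context(files: dict[str, str]) -> str:
    """Live repo context, kept under budget: include only in-scope files,
    truncating each so the total stays below MAX_CONTEXT_CHARS."""
    budget = MAX_CONTEXT_CHARS // max(len(files), 1)
    return "\n\n".join(
        f"=== {path} ===\n{text[:budget]}" for path, text in files.items()
    )


def run_agent(task: str, files: dict[str, str], call_model, tools: dict,
              max_steps: int = 10) -> str:
    """Session memory lives in `history`; tool use is structured JSON;
    `max_steps` bounds delegation so the agent cannot loop forever."""
    history = [
        # A stable system prefix like this is what prompt caching exploits:
        # identical leading messages can be cached across calls.
        {"role": "system", "content": reduce_context(files)},
        {"role": "user", "content": task},
    ]
    for _ in range(max_steps):          # bounded delegation
        reply = call_model(history)     # model sees memory + repo context
        history.append({"role": "assistant", "content": reply})
        try:
            call = json.loads(reply)    # structured tool use, not free text
        except json.JSONDecodeError:
            return reply                # plain text => final answer
        if not isinstance(call, dict) or "tool" not in call:
            return reply
        result = tools[call["tool"]](**call.get("args", {}))
        history.append({"role": "user", "content": f"tool result: {result}"})
    return "step budget exhausted"
```

The point of the sketch is that the loop, not the model, enforces the guarantees: the model only ever sees a bounded, curated slice of the repo, and it can only act through a fixed tool registry.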

The implication: if you're building with AI, you're not just choosing a model; you're building a system around it. The teams that get this right will ship faster and more reliably than those that don't.

The Compute Rationing Has Begun

Azeem Azhar's note this week is sobering: AI labs are turning down business because they don't have the compute. OpenAI's CFO said so directly. Anthropic has tightened session limits. H100 rental prices hit an 18-month high. This isn't a bubble; it's a constraint. The companies with the most GPUs are turning away money, and that changes the dynamics of the entire industry. Smaller builders will need to be smarter about efficiency. Larger players will consolidate. And the narrative shifts from "AI will revolutionise everything" to "AI will revolutionise everything we can actually afford to run."

The practical lesson: optimisation matters now in a way it didn't six months ago.