World models vs language: the AI fork in the road

Today's Overview

There's a fascinating tension building in AI right now, and it's crystallising around a fundamental question: are we building intelligence the right way? On one side, the current consensus says scaling language models and teaching them to use tools gets us closer to AGI. On the other side, Yann LeCun just walked away from Meta with over a billion dollars to bet on a different approach entirely.

The AMI Labs bet on understanding the physical world

LeCun's new startup, Advanced Machine Intelligence (AMI Labs), raised €890 million (roughly $1.03 billion) in what's being called one of the largest seed rounds ever. The company is explicitly positioning itself as a counterweight to the current AI orthodoxy. Instead of next-token prediction, AMI is building world models - systems that learn to understand and predict how the physical world actually works, rather than how people describe it in text.

This isn't new thinking from LeCun. He's been arguing for years that human-level AI will come from systems that learn by observing and acting in the real world, not just predicting the next word. But now he has capital, a team of world-model researchers, and explicit backing from the French government to test whether this thesis can actually deliver products. The founding team includes Saining Xie as chief scientist, plus researchers who've spent their careers on vision, representation learning, and robotics. They're not trying to build a better chatbot - they're trying to build systems that truly perceive, reason about, and act in the world.
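The conceptual difference is worth making concrete. A toy sketch of the idea, in Python - this is an illustration of what "predicting the world rather than the next word" means, not AMI's actual method or architecture. The environment, dynamics, and fitting procedure here are all hypothetical stand-ins: a world model observes state transitions (a ball falling under gravity) and learns to predict the next physical state.

```python
import numpy as np

# Hypothetical toy sketch (not AMI's method): a "world model" learns to
# predict the next *state* of an environment from observed transitions,
# rather than the next word in a text stream.

dt, g = 0.1, -9.81  # timestep (s) and gravity (m/s^2)

def true_step(s):
    """Ground-truth dynamics: a ball falling under gravity."""
    pos, vel = s
    return np.array([pos + vel * dt, vel + g * dt])

# Observe transitions from random states - pure observation, no labels.
rng = np.random.default_rng(0)
states = rng.uniform(-10, 10, size=(200, 2))
nexts = np.array([true_step(s) for s in states])

# Fit an affine model s_next ~ W @ [s, 1] by least squares.
X = np.hstack([states, np.ones((len(states), 1))])
W, *_ = np.linalg.lstsq(X, nexts, rcond=None)

# The learned model now predicts an unseen physical outcome.
s = np.array([10.0, 0.0])      # ball 10 m up, at rest
pred = np.append(s, 1.0) @ W   # predicted next state: position and velocity
```

Because the observed dynamics are exactly affine, the fitted model recovers them and predicts the next state almost perfectly; a language model trained on descriptions of falling balls has no such grounded state to roll forward.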

Humanoids moving from demos to real work

Meanwhile, the robotics world is watching humanoid robots transition from spectacular demos to actual industrial deployments. At the upcoming Robotics Summit in May, leaders from Agility Robotics, Boston Dynamics, and ASTM are holding a keynote specifically titled "The State of Humanoid Robotics" - and the framing is deliberately grounded. They're cutting through the hype to ask what humanoids can actually accomplish in factories and warehouses right now. Agility's Digit, Boston Dynamics' Atlas, and others are no longer just YouTube moments - they're being tested in real logistics and manufacturing settings. The progress is real, but the honest conversation is about current limitations, safety standards, and what still needs to improve for broader adoption.

In parallel, surgeons are getting their sense of touch back. A European research project called PALPABLE is developing soft robotic fingertips with fibre-optic sensing that let surgeons feel tissue stiffness during minimally invasive surgery. For years, the shift to robotic surgery gained precision but lost tactile feedback. This work restores that vital feedback - translating physical pressure into visual maps of tissue firmness. A prototype is expected to be in surgeons' hands for testing soon.
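To see how pressure readings can become a visual firmness map, here is a minimal sketch of the general idea - the sensor geometry, numbers, and threshold below are all hypothetical, not PALPABLE's actual pipeline. Under a linear-spring approximation, local stiffness is reaction force divided by indentation depth, and the per-point estimates form a grid a surgeon could view as a heatmap:

```python
import numpy as np

# Illustrative sketch only (values and method are hypothetical, not the
# PALPABLE project's pipeline): a palpation probe measures reaction force
# at a known indentation depth; stiffness ~ force / depth, and the
# per-point estimates form a visual map of tissue firmness.

def stiffness(force_n, depth_m):
    """Estimate local tissue stiffness (N/m) from one palpation reading."""
    return force_n / depth_m

# Simulated 4x4 palpation grid: uniform 2 mm indentation, measured forces (N).
depth = 0.002
forces = np.array([
    [0.10, 0.11, 0.10, 0.12],
    [0.10, 0.45, 0.50, 0.11],   # stiffer patch: a possible hard inclusion
    [0.11, 0.48, 0.52, 0.10],
    [0.12, 0.10, 0.11, 0.10],
])

k_map = stiffness(forces, depth)            # stiffness map in N/m
suspicious = k_map > 3 * np.median(k_map)   # flag unusually firm points
```

Rendered as a colour map, `k_map` gives the surgeon exactly the kind of at-a-glance firmness picture the article describes, with the four flagged cells standing out against soft surrounding tissue.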

The harder conversation about AI and people

Underneath all this technological momentum is a messier reality. Public sentiment toward AI has turned sharply negative - not out of simple Luddism, but because of timing, trust erosion, and genuine anxiety about creative work and economic security. One thoughtful analysis points out that AI arrived at the worst possible moment: after social media scandals eroded trust in tech, during economic uncertainty when jobs matter more, and precisely when it started threatening creative work. Previous technological waves - automobiles, television, the internet - faced scepticism too, but AI is different because it targets the top of Maslow's hierarchy: it encroaches on identity and self-expression in ways that earlier automation simply didn't.

The real challenge for the AI industry isn't technical anymore - it's cultural and institutional. LeCun and others understand that building systems people actually trust to work in hospitals, factories, and homes requires a fundamental shift: from showcasing what AI can do, to demonstrating what problems it solves for the people actually using it. From VCs talking about capabilities, to surgeons, nurses, and factory workers showing how the technology genuinely changes their work. That's where the conversation needs to happen next.