Boston Dynamics’ Atlas humanoid robot just took a leap forward—literally and figuratively. In a newly released video, Atlas performs a long sequence of packing, sorting, and organizing tasks powered by a Large Behavior Model (LBM), a neural network that treats locomotion and manipulation as one unified system.
The demo, created with the Toyota Research Institute (TRI), highlights why this matters: for the first time, Atlas doesn’t rely on separately programmed systems for walking and object handling. Instead, the LBM controls the whole robot directly—hands, feet, balance, and body posture—without a single new line of hand-coded instructions. That means adding new capabilities is now less about painstaking coding and more about showing the robot what to do.
A Single Brain, Whole-Body Control
Previous humanoid demos often stitched together walking algorithms with distinct manipulation programs, creating robots that could either move or use their hands well, but not seamlessly combine the two.
The new Atlas demo changes that. Using whole-body movements—walking, crouching, lifting, and even adapting when researchers deliberately throw obstacles in its way mid-task—the humanoid shows off something closer to human-like fluidity.
“Training a single neural network to perform many long-horizon manipulation tasks will lead to better generalization,” said Scott Kuindersma, VP of Robotics Research at Boston Dynamics. “Highly capable robots like Atlas present the fewest barriers to data collection for tasks requiring whole-body precision, dexterity, and strength.”
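Boston Dynamics hasn't published what that single network looks like inside, but the core idea, one policy that reads the whole robot's state and emits commands for every joint at once instead of splitting locomotion and manipulation into separate controllers, can be sketched in a few lines. Everything below (the plain feed-forward stack, the class name, the input and output sizes) is an illustrative assumption, not the actual Atlas system:

```python
# Conceptual sketch only: Boston Dynamics has not published the LBM's
# architecture. Names, dimensions, and the simple MLP are illustrative.
import torch
import torch.nn as nn

class WholeBodyPolicy(nn.Module):
    """One network maps full-robot observations to full-robot actions,
    rather than pairing a walking controller with a separate manipulation stack."""

    def __init__(self, obs_dim: int = 256, act_dim: int = 56, hidden: int = 512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, act_dim),  # targets for arms, hands, legs, and torso together
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        # obs: proprioception plus perception features for the entire body
        # output: one action vector covering every actuated joint
        return self.net(obs)

policy = WholeBodyPolicy()
obs = torch.randn(1, 256)   # stand-in for a single observation
action = policy(obs)        # commands for the whole robot at once
print(action.shape)         # torch.Size([1, 56])
```

The point of the sketch is the shape of the problem: balance, footwork, and grasping all come out of the same forward pass, so there is no hand-off between a "walking mode" and a "manipulation mode."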
Why Large Behavior Models Are a Breakthrough
LBMs are to robot behavior what large language models (LLMs) are to text: scalable, adaptable systems that learn from demonstrations rather than hand-coded instructions.
Russ Tedrake, TRI’s SVP of Large Behavior Models, put it bluntly: “The previous approaches to programming these tasks simply could not scale. Large Behavior Models address this opportunity in a fundamentally new way—skills are added quickly via demonstrations from humans, and as the LBMs get stronger, they require fewer demonstrations to achieve more robust behaviors.”
That’s critical because humanoids promise utility in existing environments, from warehouses and factories to hospitals and homes. The problem has always been scale: hand-programming each new task was a bottleneck, and LBMs could finally break it.
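To see why demonstrations scale where hand-written controllers don't, here is a minimal behavior-cloning loop. It is purely illustrative and not TRI's actual training recipe; the tiny stand-in policy, the placeholder data, the mean-squared-error loss, and the learning rate are all assumptions made for the sketch. The policy is simply fit to recorded (observation, action) pairs from human demonstrations, so adding a skill means collecting more data rather than writing new control code:

```python
# Illustrative behavior cloning, not TRI's or Boston Dynamics' training pipeline.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for the whole-body policy sketched earlier; sizes are made up.
policy = nn.Sequential(
    nn.Linear(256, 512), nn.ReLU(),
    nn.Linear(512, 56),
)

# Placeholder "demonstrations": (observation, action) pairs that would come
# from humans guiding the robot through a task.
demo_obs = torch.randn(10_000, 256)
demo_acts = torch.randn(10_000, 56)
loader = DataLoader(TensorDataset(demo_obs, demo_acts), batch_size=64, shuffle=True)

optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)

for epoch in range(10):
    for obs, act in loader:
        pred = policy(obs)            # what the network would currently do
        loss = F.mse_loss(pred, act)  # distance from what the demonstrator did
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# Adding a new skill means collecting new demonstrations and re-running this
# loop; no task-specific controller code gets written.
```

This is the scaling argument in miniature: the engineering effort shifts from writing and tuning controllers per task to gathering demonstrations, which is exactly the bottleneck Tedrake says LBMs are meant to remove.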
The Bigger Picture: Humanoids Are Heating Up
Boston Dynamics and TRI’s partnership, formed in October 2024, comes amid a humanoid arms race. Tesla’s Optimus, Figure AI’s Figure 01, and Agility Robotics’ Digit are all vying for a spot in warehouses and workplaces. While many competitors lean on LLM-based control for reasoning, Boston Dynamics and TRI are betting on LBMs for whole-body intelligence—arguably a more direct path to robots that move and work like humans.
If LBMs prove scalable, this could accelerate the march toward general-purpose humanoids—machines that don’t just do one task in a controlled demo, but adapt on the fly in messy, real-world environments. And Atlas, once a parkour stunt-bot for YouTube views, may finally be on track to become a serious platform for next-gen robotics research.