Morning Overview

AI evolved robot designs in simulation, then researchers built them

A team led by Northwestern University’s Sam Kriegman built an AI system that designed a walking robot from scratch in seconds, then manufactured the result and watched it move across a lab bench. The algorithm started with a featureless block of virtual material and, through rapid iterative optimization, produced a small soft-bodied machine with three legs and rear fins that no human engineer would have conceived. The work, described in a 2023 paper in the Proceedings of the National Academy of Sciences, represents one of the clearest demonstrations so far of AI closing the loop between simulated evolution and physical fabrication.

From Jiggling Blob to Walking Machine

The system began with a simple simulated block that could jiggle but not walk. As the algorithm evaluated this initial shape, it recognized that the motion fell short of its locomotion objective and began to modify the virtual material. According to a Northwestern engineering summary, the computer repeatedly reshaped the blob, each time testing how far it could move and feeding those results back into the next iteration.

In the simulation, the robot’s body was represented as a collection of discrete particles, internal voids, and muscle-like patches that could expand and contract. This simplified but expressive model gave the optimization process a flexible vocabulary of structural elements to rearrange. Each generation reshuffled those elements, ran a physics-based test of locomotion performance, and preserved the best-performing variants for further refinement. Over many such cycles, the clumsy jiggling block evolved into a design that could translate its internal actuation into forward motion.
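The generate-test-keep loop described above can be sketched in a few lines. Everything below is invented for illustration, not taken from the paper: the grid size, the three element types, and especially the fitness function, which in the real system is a physics simulation scoring how far each candidate body moves.

```python
import random

GRID = 8  # hypothetical resolution of the voxel design space

# Each cell is one of three building blocks, loosely mirroring the paper's
# vocabulary: 0 = void, 1 = passive material, 2 = muscle-like (actuated) patch.
def random_body():
    return [[random.choice((0, 1, 2)) for _ in range(GRID)] for _ in range(GRID)]

def mutate(body, rate=0.05):
    """Flip a small fraction of cells to a different element type."""
    child = [row[:] for row in body]
    for r in range(GRID):
        for c in range(GRID):
            if random.random() < rate:
                child[r][c] = random.choice((0, 1, 2))
    return child

def simulate_distance(body):
    """Stand-in for the physics test: crudely reward bodies that mix muscle
    and passive material rather than remaining a uniform block. A real
    simulator would return the distance the body walked."""
    muscle = sum(cell == 2 for row in body for cell in row)
    passive = sum(cell == 1 for row in body for cell in row)
    return min(muscle, passive)

def evolve(generations=200):
    best = random_body()
    best_fit = simulate_distance(best)
    for _ in range(generations):
        child = mutate(best)          # reshuffle the structural elements
        fit = simulate_distance(child)  # run the locomotion test
        if fit >= best_fit:           # preserve the best-performing variant
            best, best_fit = child, fit
    return best, best_fit
```

The speed claim in the paper comes from the fact that each simulated evaluation is cheap, so many such generations can run in seconds on one machine.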

What emerged was a body plan unlike anything in nature or conventional robotics. The final design featured three stubby legs and fin-like protrusions at the rear, an asymmetric arrangement that would be unlikely to appear on any engineer’s whiteboard. While nature took billions of years to evolve the first walking species, this algorithm compressed a comparable search process into seconds on a single computer. The PNAS authors emphasize that this speed is the central technical achievement: not just that the robot walked, but that the design pipeline ran fast enough to be practically useful for iterative experimentation.

How the Simulation Bridges to Reality

Designing a robot in simulation is only half the challenge. The harder question is whether simulated behavior survives contact with the real world, where friction, gravity, and material imperfections behave differently from their digital approximations. To narrow this “reality gap,” the researchers kept their internal model deliberately simple. Their technical description of the simulator explains how particles, voids, and muscle patches were chosen to map cleanly onto materials that can actually be cast and assembled in a lab.

Once the AI settled on a promising morphology, the team manufactured the design using soft silicone and air-powered actuators that could inflate and deflate like muscles. The casting molds approximated the particle layout, and channels were embedded to route pressurized air. When the finished robot was placed on a flat surface and the actuators were cycled, it lurched forward with slow but steady locomotion. The motion was not fast or graceful, but crucially, it matched the gait predicted by the simulation. That alignment between virtual and physical performance is what validates the pipeline, even though the authors acknowledge that translating particle-level designs into castable molds still introduces compromises and small deviations.
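One way to make the sim-to-real comparison concrete is to measure displacement per actuation cycle in both settings and compute a normalized deviation. The numbers below are invented placeholders; the paper reports only that the physical gait matched the simulated one, not these figures.

```python
# Hypothetical per-cycle displacements (cm). These values are illustrative
# only -- they do not come from the PNAS paper.
sim_cycle_cm  = [1.2, 1.1, 1.3, 1.2, 1.2]   # predicted in simulation
real_cycle_cm = [0.9, 1.0, 1.1, 0.8, 1.0]   # measured on the lab bench

def reality_gap(sim, real):
    """Mean absolute deviation between predicted and measured motion,
    normalized by the simulated magnitude."""
    errs = [abs(s - r) / s for s, r in zip(sim, real)]
    return sum(errs) / len(errs)

gap = reality_gap(sim_cycle_cm, real_cycle_cm)
```

A small value of `gap` would indicate that the simplified simulator transferred well; a large one would point to the casting compromises the authors acknowledge.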

Roots in the Xenobot Experiments

This work did not appear in isolation. Kriegman and several collaborators had already demonstrated a version of the same idea using living cells instead of silicone. In earlier xenobot research, they used an in-silico evolutionary design pipeline to shape clusters of frog skin and heart cells into tiny biological machines. The computer proposed candidate body plans, and microsurgeons then assembled these designs from real cells. Lab experiments confirmed that the resulting constructs could move, corral particles, and exhibit behaviors that matched their simulated counterparts.

A follow-on study pushed the xenobot line further, showing that under specific conditions these reconfigurable organisms could engage in kinematic self-replication by sweeping up loose cells into new aggregates that inherited their basic form and behavior. The constraints and approximations used to model biological tissue in those simulations informed the design choices in the newer soft-robot project. Where the xenobot experiments proved that AI-designed body plans could function when built from living material, the 2023 PNAS work demonstrated that the same conceptual pipeline could operate with synthetic components that are easier to scale, standardize, and deploy outside specialized bio-labs.

A Growing Pipeline Beyond One Lab

The Northwestern team is not alone in pursuing AI-driven morphology search. A related effort, described in an arXiv preprint on differentiable robotics, combines evolutionary search over body shapes with gradient-based learning inside simulations whose physics are made differentiable. In that project, the optimizer can compute exact gradients through the dynamics engine, allowing it to adjust both the robot’s morphology and its control policy in ways that directly improve task performance.
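The core idea of optimizing body and controller together through a differentiable rollout can be illustrated with a toy model. The one-line “simulator” below, its parameters (leg length `L`, drive frequency `f`), and the use of finite differences in place of true autodiff are all invented for illustration; a real differentiable physics engine computes exact gradients through the full dynamics.

```python
# Toy differentiable "simulator": distance travelled as a smooth function of
# one morphology parameter (leg length L) and one control parameter (drive
# frequency f). The functional form is made up; the point is only that the
# rollout is differentiable, so body and controller can be updated jointly.
def distance(L, f):
    return L * f - 0.5 * (L * f) ** 2 - 0.1 * (L - 1.0) ** 2

def grad_distance(L, f, eps=1e-6):
    """Central finite differences, standing in for automatic differentiation
    through the dynamics engine."""
    dL = (distance(L + eps, f) - distance(L - eps, f)) / (2 * eps)
    df = (distance(L, f + eps) - distance(L, f - eps)) / (2 * eps)
    return dL, df

L, f = 0.5, 0.5        # initial guess for morphology and control
lr = 0.1
for _ in range(500):   # joint gradient ascent on body and controller
    dL, df = grad_distance(L, f)
    L += lr * dL
    f += lr * df
```

Because each step follows gradient information instead of sampling random mutations, far fewer simulator evaluations are needed to reach a good design, which is the sample-efficiency advantage the preprint emphasizes.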

As in the Northwestern study, at least one morphology from the differentiable pipeline was physically built and shown to retain its intended behavior in the real world. The use of differentiable simulation is significant because it promises more sample-efficient search: instead of blindly trying many random variations, the algorithm can follow gradient information toward better designs. Together, these parallel efforts suggest a broader shift in how robots might be conceived. Traditional robotics starts with human intuition about what a machine should look like, then refines control software to make that form functional. The emerging alternative inverts that sequence: define a task, let an algorithm explore the space of possible bodies, and build whatever strange shape scores highest on performance metrics.

Capabilities and Current Limits

For all its speed and novelty, the current system has clear boundaries. The robots it produces are small, slow, and limited to basic locomotion on flat, controlled surfaces. The PNAS article on instant soft-robot evolution reports only initial physical tests, with no long-term durability data or demonstrations in cluttered, outdoor, or highly variable environments. The silicone bodies and air-driven actuators are also constrained in the forces they can exert, which limits the range of tasks they can perform.

Scaling the approach to more complex behaviors (such as manipulation, climbing, or operating in unstructured terrain) will likely require richer material models, more sophisticated actuation strategies, and tighter integration between morphology and control. There is also the question of interpretability: the evolved designs work, but it is not always obvious why they work, or how small modifications might affect performance. That opacity can make it harder for human engineers to trust or adapt AI-generated morphologies for safety-critical applications.

Nonetheless, the core achievement stands: an AI system can now generate a viable robot body from a blank slate in seconds, and that body can be built and shown to function in the real world with minimal hand-tuning. By linking fast evolutionary search, a carefully simplified simulator, and practical fabrication methods, the Northwestern team and others have sketched a new workflow for robotics, one where the shapes of future machines are discovered more than they are designed.

*This article was researched with the help of AI, with human editors creating the final content.