Scientists have built a working simulation of an entire adult fruit fly brain and connected it to a physics-based virtual body, producing what amounts to a digital insect that can process sensory input and generate movement. The effort, spread across a package of nine papers published in Nature, represents the first time researchers have taken a complete brain wiring diagram, turned it into a functioning spiking neural model, and let it drive a simulated organism. The result is not just a map of the fly’s 139,255 neurons but a testable, dynamic system that generates predictions about how real flies see, walk, and fly.
Mapping Every Synapse in a Fly’s Brain
The foundation for the entire project is a synapse-resolution wiring diagram of an adult female Drosophila brain containing 139,255 neurons and roughly 50 million chemical synapses. Reconstructed from electron microscopy data, this connectome traces every neural connection in the brain at a level of detail that was, until recently, only available for far simpler organisms like the roundworm C. elegans, which has just 302 neurons. Getting from raw imaging data to a usable map required years of collaborative annotation, grouping neurons into classes and cell types so that each node in the network could be interpreted biologically.
A companion study systematized the cell classes across the FlyWire whole-brain connectome, organizing thousands of neuron types, many of them newly defined. That classification work matters because a wiring diagram alone is like a circuit board with no labels. Without knowing which neurons are excitatory and which are inhibitory, researchers cannot predict how signals will propagate. A separate paper published in Cell tackled exactly that problem, demonstrating that neurotransmitter identity can be inferred directly from electron microscopy images at synaptic sites. That technique assigned chemical “signs” to connections across both the FlyWire and Hemibrain datasets, converting a static structural map into something that could drive a dynamic simulation.
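The step from transmitter labels to connection signs can be pictured with a small sketch. Everything below is illustrative: the neuron names, synapse counts, and edge list are invented, and the sign convention (acetylcholine excitatory; GABA and glutamate largely inhibitory, as is common in fly circuits) is a simplification of what the actual datasets encode.

```python
# Illustrative sketch: turning neurotransmitter labels into signed
# synaptic weights. All neuron names and counts here are made up;
# only the general idea mirrors the connectome-to-model pipeline.

# Hypothetical edge list: (presynaptic, postsynaptic, synapse count, transmitter)
edges = [
    ("A", "B", 120, "acetylcholine"),
    ("B", "C", 45,  "GABA"),
    ("C", "A", 30,  "glutamate"),
]

# Simplified sign convention for fly circuits
SIGN = {"acetylcholine": +1, "GABA": -1, "glutamate": -1}

# Signed weight = sign * synapse count: the step that converts a static
# wiring diagram into something a dynamical model can propagate signals through.
signed_weights = {
    (pre, post): SIGN[transmitter] * count
    for pre, post, count, transmitter in edges
}

print(signed_weights)
# {('A', 'B'): 120, ('B', 'C'): -45, ('C', 'A'): -30}
```

Without the sign assignment, the same edge list would say only that A talks to B, not whether A excites or suppresses it, which is exactly the gap the neurotransmitter-inference work closed.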
From Wiring Diagram to Spiking Brain
With the connectome mapped and neurotransmitter identities assigned, a team built a whole-brain computational model using leaky integrate-and-fire spiking neurons, a standard approach in computational neuroscience that balances biological realism against computational cost. Each of the brain’s 139,255 neurons was represented as a node whose activity depends on incoming signals from connected neurons, weighted by the synapse counts and neurotransmitter types extracted from the connectome. The model was implemented in the Brian2 simulator, an open-source tool designed for spiking neural networks that can scale to networks of this size while still running on commodity hardware.
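The leaky integrate-and-fire rule at the core of such a model is simple enough to sketch directly. The published model runs in Brian2 at whole-brain scale; the toy below is a three-neuron, pure-Python caricature with invented parameters, meant only to show the update rule: voltage leaks toward rest, jumps when presynaptic neurons spike, and resets after crossing threshold.

```python
# Minimal leaky integrate-and-fire sketch. Parameters and weights are
# illustrative, not those of the published model.

V_REST, V_THRESH, V_RESET = -52.0, -45.0, -52.0  # membrane voltages, mV (made up)
TAU = 20.0   # membrane time constant, ms
DT = 0.1     # integration step, ms

# Toy signed weight matrix: w[i][j] = jump in neuron j's voltage when i spikes.
# Neuron 0 excites 1, 1 excites 2, and 2 weakly inhibits 0.
w = [[0.0,  4.0, 0.0],
     [0.0,  0.0, 4.0],
     [-2.0, 0.0, 0.0]]

def step(v, spiked, external):
    """Advance all neurons one time step; return (voltages, spike flags)."""
    new_v, new_spiked = [], []
    for j in range(len(v)):
        # Leak toward rest, plus external drive
        dv = (V_REST - v[j]) / TAU * DT + external[j]
        # Synaptic input from neurons that spiked on the previous step
        dv += sum(w[i][j] for i in range(len(v)) if spiked[i])
        vj = v[j] + dv
        if vj >= V_THRESH:          # threshold crossing: spike and reset
            new_spiked.append(True)
            vj = V_RESET
        else:
            new_spiked.append(False)
        new_v.append(vj)
    return new_v, new_spiked

# Drive neuron 0 with constant input and count spikes along the chain
v, spiked, counts = [V_REST] * 3, [False] * 3, [0, 0, 0]
for _ in range(2000):  # 200 ms of simulated time
    v, spiked = step(v, spiked, external=[0.5, 0.0, 0.0])
    counts = [c + s for c, s in zip(counts, spiked)]
print(counts)
```

Activity attenuates along the chain (each downstream neuron needs several input spikes, against the leak, to fire once), which is the kind of propagation behavior the signed, synapse-count-weighted connectome lets the full model predict.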
What makes this simulation different from earlier brain models is completeness. Previous fly brain simulations worked with partial circuits or isolated neural pathways, such as visual motion detectors or olfactory networks. This model includes the entire brain, which means signals can flow through feedback loops and cross-brain connections that partial models necessarily miss. The team used the simulation to generate predictions about sensorimotor processing, testing how the virtual brain responds to visual and other sensory inputs and what motor commands it produces. Those predictions are testable against real fly behavior, giving experimentalists specific hypotheses to confirm or reject in the lab.
According to a summary from Berkeley researchers, the model is efficient enough to run on a laptop, rather than requiring a supercomputer. That accessibility could democratize large-scale brain simulations, allowing smaller labs to explore how perturbations to specific neurons or synapses ripple through the network to alter behavior. It also opens the door to systematic “in silico” experiments that would be impractical or impossible to perform in living animals, such as silencing thousands of neurons at once or rewiring entire circuits to test alternative architectures.
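One way to picture such an in silico lesion experiment is as a before-and-after comparison on a connectivity graph: stimulate a neuron, see what responds, silence a node, and see what goes dark. The sketch below is a reachability caricature of that workflow; the graph, neuron names, and propagation rule are all invented stand-ins for the actual spiking simulation.

```python
# Toy "in silico lesion" sketch: propagate activity through a made-up
# directed connectome with and without a silenced neuron. This is a
# reachability caricature, not the published model.
from collections import deque

# Hypothetical connectome: neuron -> downstream targets
graph = {
    "sensory": ["interneuron_a", "interneuron_b"],
    "interneuron_a": ["motor_1"],
    "interneuron_b": ["motor_2"],
    "motor_1": [],
    "motor_2": [],
}

def active_neurons(graph, stimulus, silenced=frozenset()):
    """Return every neuron reachable from `stimulus`, skipping silenced ones."""
    seen, queue = set(), deque([stimulus])
    while queue:
        n = queue.popleft()
        if n in seen or n in silenced:
            continue
        seen.add(n)
        queue.extend(graph[n])
    return seen

intact = active_neurons(graph, "sensory")
lesioned = active_neurons(graph, "sensory", silenced={"interneuron_a"})

# Which activity is lost when interneuron_a is silenced?
print(sorted(intact - lesioned))
# ['interneuron_a', 'motor_1']
```

Scaling this comparison up, with spiking dynamics in place of simple reachability and thousands of silenced neurons at a time, is exactly the class of perturbation screen that a laptop-scale whole-brain model makes routine.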
Giving the Brain a Body
A brain simulation that floats in mathematical space can only tell researchers so much. Real brains evolved to control bodies, and the relationship between neural output and physical movement is deeply shaped by biomechanics. That is why a separate team developed NeuroMechFly v2, an anatomically detailed neuromechanical model of an adult Drosophila described in Nature Methods. The virtual fly body includes articulated joints, realistic limb segments, and sensor and actuation models that mimic how a real fly’s muscles and sensory organs interact with the physical world, from ground reaction forces under its feet to the inertia of its wings.
Connecting the spiking brain model to this virtual body closes a loop that neuroscientists have long wanted to study: the brain sends motor commands to the body, the body moves through its environment, and the resulting sensory feedback flows back into the brain, which adjusts its commands accordingly. This closed-loop setup can reveal emergent behaviors that static brain models overlook entirely. A brain model might predict that a certain pattern of neural activity should produce forward walking, but only when that signal passes through a physically realistic body can researchers see whether the gait is stable, efficient, or even functional. Because both the neural and biomechanical components are grounded in measured anatomy and connectivity, the resulting behaviors provide concrete targets for experimental verification.
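The closed loop described above reduces to a simulate-sense-act cycle. In the sketch below, the "brain" is just a proportional controller and the "body" a one-dimensional point mass with drag; every parameter is invented for illustration and has nothing to do with NeuroMechFly's actual dynamics.

```python
# Minimal closed sensorimotor loop. The "brain" (a proportional controller)
# drives the "body" (a 1-D point mass with drag), and the body's measured
# speed feeds back into the next motor command. All values are illustrative.

DT, DRAG, GAIN, TARGET_SPEED = 0.01, 0.5, 2.0, 1.0

def brain(sensed_speed):
    """Motor command from sensory feedback: push harder when too slow."""
    return GAIN * (TARGET_SPEED - sensed_speed)

def body(speed, command):
    """Physics step: the command accelerates the body, drag slows it."""
    return speed + (command - DRAG * speed) * DT

speed = 0.0
for _ in range(5000):             # 50 s of simulated time
    command = brain(speed)        # brain -> motor command
    speed = body(speed, command)  # body -> movement
    # the new speed is the sensory feedback for the next cycle

print(round(speed, 3))
# prints 0.8 -- steady state below the 1.0 target
```

The shortfall is the point: the controller alone does not determine the behavior, because the body's drag pulls the equilibrium away from the target. That is a miniature version of the argument for embodiment, where a motor program that looks correct in isolation can behave quite differently once it passes through real physics.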
The embodied simulation also allows researchers to explore how different sensory modalities interact. For example, visual motion signals can be combined with mechanosensory feedback from the legs to stabilize walking, or with airflow cues to adjust wingbeats during flight. By systematically turning these feedback channels on and off in the virtual fly, scientists can probe which pathways are essential for particular behaviors and which are redundant or modulatory. That level of control is difficult to achieve in vivo, where compensatory mechanisms often mask the effects of targeted lesions or silencing.
What Static Models Miss
Most coverage of this work has focused on the sheer scale of the connectome and the technical achievement of simulating it. But the more consequential advance may be the decision to pair brain simulation with physical embodiment. Neuroscience has spent decades debating whether understanding the brain requires understanding the body it controls. This project offers a concrete testing ground for that question, one where hypotheses about embodied cognition can be translated into explicit models and then challenged with behavioral data.
Consider the difference between analyzing a flight controller’s software in isolation and watching it actually fly a plane. The software might look correct on paper, but aerodynamic forces, sensor noise, and mechanical lag all shape the real outcome. The same logic applies here. The fly brain did not evolve to solve abstract computational problems. It evolved to keep a milligram-scale insect alive in a turbulent, unpredictable world. Studying its neural circuits without accounting for wing aerodynamics, leg mechanics, and sensory delays strips away the very pressures that shaped those circuits. The embodied simulation restores that context, at least in part, by embedding the brain in a world where gravity, friction, and inertia matter.
A news analysis in Nature frames the connectome and its associated tools as enabling resources for the broader neuroscience community. That framing is accurate but incomplete. The real test is whether the predictions generated by the embodied simulation hold up when checked against living flies. If they do, the approach validates a new way of doing neuroscience: build the brain in software, give it a body, and see what behaviors emerge without needing to run every experiment on animals. If they do not, the discrepancies will still be valuable, pointing to missing ingredients such as neuromodulators, developmental changes, or plasticity rules that are not yet captured in the model.
For now, the digital fly is best understood as a hypothesis generator rather than a finished replica. Its neurons are simplified, its synapses lack many molecular details, and its virtual environment is far less complex than a real fruit bowl or forest floor. Yet even in this stripped-down form, the system demonstrates how far bottom-up neuroscience has come. A complete wiring diagram, enriched with neurotransmitter identities and cell-type classifications, can be transformed into a functioning brain model that controls a body and navigates a world. Each new dataset, whether a refined connectome, better biomechanical measurements, or more realistic sensory inputs, can be slotted into the framework, gradually tightening the link between simulation and reality.
As researchers iterate on this digital insect, they are also testing the limits of a broader ambition: using detailed simulations to bridge levels of explanation from synapses to behavior. Fruit flies offer a tractable starting point, small enough to map exhaustively yet complex enough to display rich behaviors like courtship, navigation, and learning. Whether similar strategies can eventually scale to larger brains remains an open question. But the fly work shows that, at least for one tiny animal, it is now possible to watch a brain made of equations perceive a world made of numbers, and to learn something real about biology in the process.
This article was researched with the help of AI, with human editors creating the final content.