Morning Overview

Bat-inspired microdrones show promise for search-and-rescue in darkness

When a building collapses at night or a wildfire fills a structure with blinding smoke, the drones that rescuers send in to find survivors share a critical weakness with the human eye: they need light to see. A research team at Worcester Polytechnic Institute is trying to eliminate that vulnerability by borrowing a trick from bats. Their palm-sized aircraft navigate using ultrasound rather than cameras, bouncing high-frequency chirps off walls and obstacles to build a picture of spaces where visibility is zero.
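The core of any echolocation scheme is time-of-flight ranging: the drone emits a chirp, times the returning echo, and converts that delay into distance using the speed of sound. A minimal sketch of that arithmetic (illustrative only, not the WPI team's code):

```python
# Time-of-flight ranging: convert a round-trip echo delay into distance.
# Illustrative sketch only -- not drawn from the WPI papers.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C


def echo_distance(delay_s: float) -> float:
    """Distance to a reflector given the round-trip echo delay."""
    # The chirp travels out and back, so halve the total path.
    return SPEED_OF_SOUND * delay_s / 2.0


# An echo returning 11.66 ms after the chirp implies an obstacle ~2 m away.
print(round(echo_distance(0.01166), 2))
```

Repeating that measurement across many chirp directions is what lets the drone assemble a picture of walls and obstacles without any light at all.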

Early lab results, published in preprints in 2024 and 2026, show the approach works even when signal quality drops far below the level at which a conventional camera-based drone would be blind. The question now is whether performance proven in a controlled lab can survive the chaos of an actual disaster site.

How the technology works

The WPI team, led by robotics professor Nitin Sanket, has developed two complementary systems. The first, called Saranga, is a low-power ultrasound perception stack designed for tiny aerial robots operating in visually degraded environments like fog, darkness, and snow. In testing, Saranga maintained its ability to navigate at a peak signal-to-noise ratio as low as negative 4.9 decibels, a level at which camera-based systems would register nothing useful. That metric matters because real disaster sites, choked with dust, smoke, and electrical interference, routinely push sensor signals well below usable levels.
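To make the decibel figure concrete: SNR in dB is 10·log10 of the ratio of signal power to noise power, so a negative value means the echo is weaker than the surrounding noise. A quick conversion (standard formula, not specific to Saranga):

```python
# Convert a decibel SNR to a linear power ratio:
#   SNR_dB = 10 * log10(P_signal / P_noise)
# At -4.9 dB, the echo carries roughly a third of the noise power --
# the useful signal is literally buried under the noise floor.


def snr_db_to_ratio(snr_db: float) -> float:
    """Linear signal-to-noise power ratio from a decibel value."""
    return 10 ** (snr_db / 10.0)


print(round(snr_db_to_ratio(-4.9), 3))  # ~0.324
```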

The second system, BatDeck, is an ultrasound sensor board built specifically for nano-drone navigation in complete darkness. While Saranga focuses on perceiving degraded environments, BatDeck emphasizes obstacle avoidance and adds ego-velocity estimation, which lets a drone track its own speed without GPS or visual landmarks. That capability is designed for confined-space rescue missions where GPS signals cannot penetrate rubble and where visual tracking fails because there is simply nothing to see.
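One standard acoustic route to ego-velocity, and a plausible reading of what such a sensor exploits, is the Doppler shift of a drone's own echo: a chirp reflected off a stationary surface comes back at a higher frequency when the drone is closing on it. A sketch of the physics (the actual BatDeck estimator is not public in this form):

```python
# Doppler-based ego-velocity sketch. A drone moving toward a stationary
# wall at speed v hears its chirp reflected back shifted from f0 to
#   f_echo = f0 * (c + v) / (c - v)
# Inverting gives v from the measured shift. Illustrative only; not the
# published BatDeck algorithm.

SPEED_OF_SOUND = 343.0  # m/s


def ego_velocity(f_emitted: float, f_echo: float) -> float:
    """Closing speed toward a stationary reflector, in m/s."""
    return SPEED_OF_SOUND * (f_echo - f_emitted) / (f_echo + f_emitted)


# A 40 kHz chirp returning at about 40.47 kHz implies ~2 m/s of closure.
print(round(ego_velocity(40_000.0, 40_470.0), 2))
```

Because the reference signal is the drone's own emission, this kind of estimate needs neither GPS nor visual landmarks, which is exactly the regime the article describes.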

One persistent engineering problem is propeller noise. Rotor frequencies overlap with the ultrasound bands the drones use to echolocate, threatening to drown out the very echoes the system depends on. Sanket’s team addressed this with 3D-printed acoustic shells fitted around the rotors, acting as sound baffles that dampen mechanical noise enough for the onboard microphones to pick up environmental echoes during flight.
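The overlap problem is easy to see with back-of-envelope numbers. A small rotor's blade-pass frequency is low, but its harmonics climb well into ultrasound, so some inevitably land inside a sonar band. The figures below are illustrative assumptions, not WPI measurements:

```python
# Why rotor noise collides with echolocation: blade-pass harmonics from a
# fast-spinning small rotor extend into ultrasound. Numbers are
# illustrative assumptions, not measurements from the WPI drones.


def blade_pass_harmonics(rpm: float, blades: int, f_max: float) -> list:
    """Blade-pass fundamental and its harmonics up to f_max, in Hz."""
    fundamental = rpm / 60.0 * blades
    harmonics = []
    f = fundamental
    while f <= f_max:
        harmonics.append(f)
        f += fundamental
    return harmonics


# A hypothetical 30,000 RPM two-blade rotor: 1 kHz fundamental, harmonics
# every 1 kHz -- several fall squarely inside a 38-42 kHz sonar band.
in_band = [f for f in blade_pass_harmonics(30_000, 2, 45_000)
           if 38_000 <= f <= 42_000]
print(in_band)
```

Physical baffling of the kind Sanket's team printed attenuates that mechanical noise at the source, before it ever reaches the microphones.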

“When there’s a power outage at night, or when conditions are smoky or foggy, conventional drones become useless precisely when they’re needed most,” Sanket told the Associated Press, explaining why his lab chose ultrasound over the cameras and lidar that dominate commercial drone design.

Funding and the road ahead

The National Science Foundation signaled confidence in the approach in late 2025, awarding Sanket $704,908 over three years under the project title “Sound Navigation: Enabling Tiny Robots to Find Their Way Through Smoke, Dust, and Darkness.” The grant, part of the NSF’s Foundational Research in Robotics program, highlights bio-inspired sensing, low-power embedded processing, and navigation in cluttered, GPS-denied spaces as its core research themes. Funding runs through 2028.

On the standards side, the infrastructure for evaluating these drones already exists in outline. The National Institute of Standards and Technology maintains test methods for response robots that include dark-condition protocols, standardized obstacle courses, and quantitative measures for mobility, search capability, and sensing. Those NIST performance tests for aerial response robots have been incorporated into the national standard NFPA 2400, which covers drone maneuvers and functionality and was developed with input from standards bodies including the ASTM E54.09 committee. Adam Jacoff, a key figure in NIST's robotics standardization work, has helped translate research benchmarks into criteria that fire departments and emergency agencies can use when certifying equipment.


NIST has also published work connecting emergency-response drone testing to low-visibility hazards such as wildfire smoke, offering a reproducible framework that could eventually be applied to ultrasound-based systems. In principle, the same maze-like test arenas and obscurants used today to evaluate camera and lidar performance could be adapted to probe how well echolocating microdrones perceive walls, doorways, and trapped people in the dark.

What still needs to be proven

As of May 2026, every published result for Saranga and BatDeck comes from controlled laboratory settings. Neither system has been tested in an actual building collapse, a live wildfire, or a formal search-and-rescue exercise. The negative 4.9 dB signal-to-noise figure is impressive on paper, but real rubble fields generate unpredictable acoustic environments: shifting debris, wind gusts, and competing noise sources that are difficult to replicate in a lab. Whether the drones’ performance holds under those conditions is an open question.

Battery life is another gap in the public record. Ultrasound processing draws milliwatts of power, but sustained flight in a disaster zone demands energy for propulsion, radio communication, and onboard computation simultaneously. None of the available sources provide data on continuous flight time under realistic mission loads. For a technology aimed at time-critical emergencies, endurance will be a deciding factor. A sensor suite that works flawlessly is of limited value if the platform can only stay airborne for a few minutes once loaded with radios and safety redundancies.
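The endurance concern can be made concrete with back-of-envelope arithmetic. Every number below is hypothetical, since no flight-time data has been published for either platform, but the shape of the problem is clear: nano-drone batteries are tiny, and propulsion dominates the power budget.

```python
# Back-of-envelope flight-time estimate for a nano-drone. All numbers
# are hypothetical; no endurance data has been published for Saranga
# or BatDeck.


def flight_time_min(capacity_mah: float, voltage_v: float,
                    power_draw_w: float,
                    usable_fraction: float = 0.8) -> float:
    """Estimated flight time in minutes from battery capacity and load."""
    energy_wh = capacity_mah / 1000.0 * voltage_v * usable_fraction
    return energy_wh / power_draw_w * 60.0


# A hypothetical 250 mAh single-cell (3.7 V) pack under a ~7 W combined
# load (propulsion plus radio and compute) yields only a few minutes.
print(round(flight_time_min(250, 3.7, 7.0), 1))
```

Even generous assumptions leave single-digit minutes of flight, which is why the sensing stack's milliwatt power draw, however impressive, does not by itself settle the endurance question.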

Durability raises similar concerns. The current designs rely on 3D-printed acoustic shells and carefully tuned ultrasound transducers. How those components perform after impacts, water exposure, dust infiltration, or repeated thermal cycling has not been documented. The algorithms that interpret echo patterns have been validated against known obstacle layouts, but their behavior in chaotic interiors with moving people and falling debris has not been independently assessed.

There is also the question of how echolocation stacks up against existing alternatives. Thermal imaging cameras, the current workhorse for dark-condition search-and-rescue, can detect body heat through smoke and darkness. Ultrasound offers a different advantage: it maps physical geometry rather than heat signatures, potentially revealing structural hazards that thermal imaging cannot see. A mature rescue drone might eventually combine both. But for now, no head-to-head comparison between echolocating microdrones and thermal-equipped platforms has been published.

Finally, no connection between these specific drones and the NIST or NFPA 2400 test standards has been formally established. The standardization framework exists and includes dark-condition protocols, but whether Saranga or BatDeck has been evaluated against those benchmarks is not publicly documented. Without such testing, emergency agencies have no standardized way to compare echolocating drones against conventional platforms.

Where the science stands now

The strongest evidence supporting this technology comes from the Saranga and BatDeck preprints on arXiv, which provide specific experimental metrics, design details, and performance boundaries open to scrutiny by other researchers. The BatDeck paper dates to March 2024, representing earlier-stage work; the Saranga preprint is more recent. Neither has completed formal peer review, though the NSF grant and WPI institutional backing lend additional credibility. Press coverage, including reporting from the Associated Press, adds on-the-record context from Sanket but does not substitute for independent validation.

The most telling signal in the months ahead will be whether the WPI team publishes results from realistic disaster simulations or, better yet, from joint exercises with fire departments or urban search-and-rescue teams. NIST’s established obstacle courses and dark-condition trials offer a ready-made proving ground. Until echolocating nano-drones pass those kinds of independent, standardized tests, the best-supported conclusion is that this is an inventive, well-funded research direction with genuine potential to change how rescuers operate in the dark, but one that has not yet been tested where it matters most: in the rubble.

*This article was researched with the help of AI, with human editors creating the final content.*