Inspired by the way birds and bats navigate complex environments, researchers at Worcester Polytechnic Institute (WPI) in Massachusetts are developing sound-based navigation for small aerial robots. The project aims to enable drones to operate in smoke, dust, and darkness, where traditional cameras and light sensors fail, advancing resilient robotic perception for search, rescue, and hazardous-environment missions.

Professor Nitin Sanket, the lead researcher at WPI, has now secured a National Science Foundation (NSF) Foundational Research in Robotics grant. The $704,908 award will fund a three-year project, starting in September 2025, to develop sound-based navigation systems for tiny robots operating in challenging environments.

“I was determined to pursue it independently because the project fit so perfectly. Receiving this grant, I feel very accomplished and re-energized to push the boundaries of bio-inspired robot perception forward,” said Sanket in a statement.

Resilient robot navigation

For over a decade, research at WPI has focused on vision-based autonomy for aerial robots, emulating how humans rely on sight. In challenging conditions such as fog, smoke, or total darkness, however, light-based sensors lose effectiveness. To overcome these limitations, the team is now exploring bio-inspired echolocation, drawing on how bats use ultrasonic sound waves to sense their surroundings.

The goal of the project is to develop tiny airborne robots that navigate independently using sound rather than visual cues. These robots will measure less than 100 millimeters across and weigh under 100 grams. Achieving this capability will require advances in several fields.

Researchers are tackling the twin challenges of noisy propellers and limited ultrasound resolution with specially designed metamaterials that minimize sound interference. By altering a material's geometry, these structures control how sound waves reflect, much as foam absorbs noise. Inspired by how humans cup their ears and bats adjust the shape of theirs, the team is developing systems that better capture and emit low-power sound for navigation. The researchers are also exploring alternative propulsion methods, including flapping-wing mechanisms, to improve performance and reduce acoustic interference in compact aerial robots.

By integrating these innovations, the team aims to build compact, affordable, and energy-efficient drones that can operate in environments where traditional vision systems are ineffective.

“This work will enable rapid deployment of robots in challenging environments such as disaster zones or smoke-filled areas. It’s about creating tools that support protection, prevention, and preservation in a cost-effective, scalable, and deployable way,” Sanket added.

Ultrasonic drone vision

On the software front, the team is applying physics-informed deep learning to process and interpret ultrasonic signals for autonomous aerial navigation. A hierarchical reinforcement learning system enables the robots to move toward defined goals while dynamically avoiding obstacles, and the neural networks are optimized to run entirely onboard, so no external infrastructure is needed.

According to The Robot Report (TRR), the project aims to develop small, energy-efficient drone swarms that can operate in environments where conventional vision-based systems are ineffective by integrating bio-inspired artificial intelligence, robot perception, and adaptive learning.
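To make the underlying echolocation principle concrete, here is a minimal time-of-flight sketch: a pulse travels to an obstacle and back, so the distance is half the round-trip time multiplied by the speed of sound. This illustrates only the physics, not the WPI team's signal-processing pipeline; the function name, blanking window, and threshold are all assumptions for illustration.

```python
# Minimal time-of-flight sketch of the echolocation principle
# (illustrative only; not the WPI team's actual pipeline).
import numpy as np

SPEED_OF_SOUND = 343.0  # meters per second in air at about 20 C

def distance_from_echo(signal, sample_rate, emit_sample, threshold=0.5):
    """Estimate obstacle distance from a recorded ultrasonic echo.

    signal      -- array of microphone samples containing the reflected pulse
    sample_rate -- samples per second
    emit_sample -- sample index at which the pulse was emitted
    threshold   -- fraction of the peak amplitude that counts as the echo
    """
    envelope = np.abs(np.asarray(signal, dtype=float))
    # Skip a short blanking window so the outgoing pulse itself is not
    # mistaken for its own echo.
    blank = int(0.001 * sample_rate)  # 1 ms
    search = envelope[emit_sample + blank:]
    if search.size == 0 or search.max() == 0.0:
        return None  # no echo detected
    echo_offset = int(np.argmax(search >= threshold * search.max()))
    round_trip_time = (blank + echo_offset) / sample_rate
    # The pulse travels to the obstacle and back, so halve the round trip.
    return SPEED_OF_SOUND * round_trip_time / 2.0
```

At 343 meters per second, an echo arriving about 5.8 milliseconds after the pulse corresponds to an obstacle roughly one meter away.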
These drones can combine echolocation with inertial and other sensor data through sensor fusion, which improves situational awareness and navigation reliability in challenging conditions. Future versions might use ultrasound to detect the heartbeats of survivors, improving search and rescue operations. While existing obstacle-avoidance systems already work effectively, research is underway to push flight speeds above two meters per second, allowing faster response in real-world operations, according to TRR. The team anticipates that these systems will transition from laboratory testing to field deployment within three to five years. Beyond rescue applications, the technology could play a vital role in disaster monitoring, hazardous-environment inspection, and environmental protection, where traditional vision-based navigation is unreliable.
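To make the sensor-fusion idea concrete, below is a minimal complementary-filter sketch that blends a drifting inertial estimate with an absolute but noisy ultrasonic range reading. The article does not describe the project's actual estimator (a Kalman filter or a learned model would be common alternatives); the class name, the alpha weight, and the altitude framing are illustrative assumptions.

```python
# Illustrative complementary filter fusing a noisy ultrasonic range with
# integrated IMU acceleration. The project's actual estimator is not
# described in the article; this only demonstrates the fusion principle.

class AltitudeFusion:
    """Blend a drifting inertial altitude estimate with absolute sonar range."""

    def __init__(self, alpha: float = 0.98):
        self.alpha = alpha            # weight on the inertial prediction (0..1)
        self.altitude = 0.0           # fused altitude estimate in meters
        self.vertical_velocity = 0.0  # meters per second

    def update(self, vertical_accel: float, sonar_altitude: float,
               dt: float) -> float:
        # Short term: integrate IMU acceleration (responsive, but drifts).
        self.vertical_velocity += vertical_accel * dt
        predicted = self.altitude + self.vertical_velocity * dt
        # Long term: nudge the prediction toward the absolute sonar reading.
        self.altitude = (self.alpha * predicted
                         + (1.0 - self.alpha) * sonar_altitude)
        return self.altitude
```

The intent is that the inertial integration tracks fast motion between slow sonar pings, while the absolute ultrasonic reading keeps the integrated estimate from drifting over time.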