MilliWatt Ultrasound Enables Palm Drone Navigation in Dense Fog
Saranga uses a dual sonar array with deep learning denoising to handle low-SNR conditions, letting palm-sized aerial robots navigate fog, darkness, and snow and detect thin obstacles using on-board milliwatt computation.
TL;DR
Researchers have developed Saranga, an ultrasound-based navigation system for palm-sized drones that operates in dense fog, darkness, and snow where vision systems fail. The system uses a dual sonar array with deep learning denoising to overcome low signal-to-noise ratio conditions, achieving milliWatt power consumption suitable for micro aerial robots.
Key Facts
- Who: Research team presenting Saranga ultrasound navigation system
- What: MilliWatt ultrasound navigation, dual sonar array with deep learning denoising
- When: March 2026, paper released on arXiv (2603.24699)
- Impact: Enables micro drone operations in conditions where GPS and cameras are unreliable
What Happened
A research team has demonstrated Saranga, an ultrasound-based navigation system designed for palm-sized aerial robots operating in degraded visual environments. The system addresses a fundamental limitation of current micro drones: reliance on cameras and GPS that fail in fog, darkness, snow, or other low-visibility conditions.
The technical innovation combines a dual sonar array with deep learning-based denoising to handle low signal-to-noise ratio (SNR) conditions. Traditional ultrasound ranging struggles with noise from multiple reflections and environmental interference. Saranga’s deep learning component filters these artifacts, extracting reliable distance measurements from noisy signals.
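Saranga's learned denoiser is not reproduced here, but the classical baseline it improves on illustrates the core ranging problem: recovering the echo's time of flight from a noisy received signal. The sketch below (all parameters are illustrative assumptions, e.g. a 40 kHz pulse sampled at 200 kHz) simulates a weak echo buried in noise and estimates distance by matched filtering, the cross-correlation approach that low-SNR conditions eventually defeat and that a learned denoiser would augment:

```python
import numpy as np

def simulate_echo(distance_m, fs=200_000, f0=40_000, pulse_ms=1.0,
                  noise_std=0.5, c=343.0, rng=None):
    """Simulate a noisy ultrasound echo: a windowed 40 kHz pulse delayed
    by the round-trip time of flight, buried in Gaussian noise."""
    rng = rng or np.random.default_rng(0)
    n_pulse = int(fs * pulse_ms / 1000)
    t = np.arange(n_pulse) / fs
    pulse = np.sin(2 * np.pi * f0 * t) * np.hanning(n_pulse)
    delay = int(2 * distance_m / c * fs)           # round-trip delay in samples
    rx = rng.normal(0.0, noise_std, delay + n_pulse + 2000)
    rx[delay:delay + n_pulse] += pulse             # weak echo in the noise floor
    return pulse, rx

def matched_filter_range(pulse, rx, fs=200_000, c=343.0):
    """Estimate distance by cross-correlating rx with the known pulse
    and converting the peak lag (samples) back to metres."""
    corr = np.correlate(rx, pulse, mode="valid")
    delay = int(np.argmax(np.abs(corr)))
    return delay * c / (2 * fs)

pulse, rx = simulate_echo(distance_m=1.5)
print(round(matched_filter_range(pulse, rx), 2))   # → 1.5
```

As the noise level rises relative to the pulse, spurious correlation peaks appear and this estimator breaks down; that is the regime where Saranga's neural denoising reportedly extracts reliable measurements.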
The power consumption—milliwatts rather than watts—makes the system viable for palm-sized drones with limited battery capacity. Demonstrations showed successful navigation in dense fog, complete darkness, and snow conditions, including detection of thin and transparent obstacles that would be invisible to standard sensors.
Key Details
Saranga introduces several innovations for micro drone navigation:
- Dual Sonar Array: Two ultrasound sensors provide spatial coverage while maintaining the minimal weight and power profile required for palm-sized platforms
- Deep Learning Denoising: Neural network-based signal processing extracts reliable range measurements from low-SNR conditions, filtering out noise from environmental interference
- MilliWatt Power: Total system power consumption remains within the milliwatt range, critical for micro aerial robots with gram-scale payload budgets
- All-Weather Operation: Demonstrated navigation in fog, darkness, and snow—conditions where vision systems degrade or fail entirely
- Thin Obstacle Detection: Capable of detecting thin and transparent obstacles that standard sensors miss
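The paper's array geometry is not detailed here, but the value of a second sensor can be sketched with basic trilateration: two rangefinders at a known baseline turn a pair of distance readings into a 2D obstacle position by intersecting the two range circles. The baseline and range values below are hypothetical, not Saranga's:

```python
import math

def trilaterate_2d(r1, r2, baseline):
    """Locate an obstacle in the sensor plane from two range readings.
    Sensors sit at (0, 0) and (baseline, 0); returns (x, y) with y >= 0,
    the circle intersection in front of the array."""
    x = (r1**2 - r2**2 + baseline**2) / (2 * baseline)
    y_sq = r1**2 - x**2
    if y_sq < 0:
        raise ValueError("inconsistent ranges: circles do not intersect")
    return x, math.sqrt(y_sq)

# Obstacle roughly 0.5 m ahead, centred over a 10 cm baseline:
x, y = trilaterate_2d(r1=0.5025, r2=0.5025, baseline=0.10)
print(round(x, 3), round(y, 3))   # → 0.05 0.5
```

A single rangefinder would report only "something at 0.5 m"; the dual array resolves where in the plane it sits, which is what makes obstacle avoidance (rather than just obstacle detection) possible.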
| Capability | Traditional Sensors | Saranga |
|---|---|---|
| Fog Navigation | Degraded/Fails | Operational |
| Dark Navigation | Fails | Operational |
| Snow Navigation | Degraded | Operational |
| Thin Obstacles | Often missed | Detected |
| Power Consumption | Watts | MilliWatts |
🔺 Scout Intel: What Others Missed
Confidence: high | Novelty Score: 75/100
Vision-based navigation dominates the drone industry, but SAR operations, agricultural spraying, and infrastructure inspection frequently encounter fog, dust, or darkness. The milliWatt power figure is the constraint that matters: existing ultrasound systems require too much power for gram-scale drones, forcing operators to choose between sensor payload and flight time. Saranga demonstrates that edge computing on micro platforms can handle the denoising that previously required ground station processing. For warehouse inventory drones navigating between shelving in dusty environments, or search-and-rescue in smoke-filled buildings, this extends operational envelopes without the weight penalty of LiDAR or the GPS dependency of outdoor systems. The thin obstacle detection also addresses a known failure mode for optical sensors: wire fences, glass walls, and netting that cameras miss.
Key Implication: Operators planning micro drone deployments in degraded visual environments should evaluate ultrasound-based navigation as an alternative to multi-sensor fusion, potentially reducing system complexity while extending operational range.
What This Means
For Micro Drone Manufacturers
The milliWatt power profile enables ultrasound navigation on platforms where LiDAR or complex sensor fusion would be prohibitively heavy. Palm-sized drones targeting warehouse, agricultural, or inspection applications can now claim all-weather capability without significant payload penalty.
For Industrial Applications
Facilities with dusty, foggy, or low-light conditions—grain elevators, mining operations, chemical plants—have limited autonomous drone options. Saranga demonstrates that micro drones can operate reliably in these environments, expanding the addressable market for industrial aerial robotics.
What to Watch
- Commercial integration: Monitor whether established drone manufacturers announce ultrasound navigation add-ons or native integration
- Flight time impact: Watch for battery life comparisons between ultrasound and vision-only navigation under field conditions
- Regulatory approval: Track whether aviation authorities accept ultrasound navigation as sufficient for beyond-visual-line-of-sight (BVLOS) operations in restricted visibility
Sources
- Saranga: Ultrasound Navigation for Micro Drones — arXiv cs.RO, March 2026
Related Intel
FODMP Generates Robot Trajectories 10x Faster Than MPD Baseline
FODMP distills diffusion models into ProDMP trajectory space, generating motion in a single step. Runs 10x faster than MPD, 7x faster than action-chunking, enabling real-time ball interception.
Roadrunner Bipedal Robot Switches Between Wheel and Step Modes
Roadrunner (15kg) seamlessly switches between side-by-side and in-line wheel configurations. Single control policy handles both driving modes with symmetric legs pointing knees forward or backward.
ElliQ Becomes First AI Companion with Medicaid Coverage
Washington becomes the first US state to offer ElliQ AI companion statewide through Medicaid, marking a regulatory milestone for AI healthcare devices serving elderly populations.