An artistic rendering of the airborne photoacoustic sonar system that works from a drone to capture and image underwater objects. Photo credit: Kindea Labs
The “Photoacoustic Airborne Sonar System” could be mounted on drones to enable aerial imaging of underwater objects and high-resolution mapping of the deep sea.
Stanford University engineers have developed an airborne method for imaging underwater objects, combining light and sound to breach the seemingly impassable barrier at the interface between air and water.
The researchers envision that one day their hybrid optical-acoustic system will be used to conduct drone-based aerial surveys of marine life, carry out large-scale aerial searches for sunken ships and aircraft, and map the ocean depths with a speed and level of detail similar to that achieved for Earth’s landscapes. Their “Photoacoustic Airborne Sonar System” is described in detail in a study recently published in the journal IEEE Access.
“Airborne and spaceborne radar and laser (LIDAR) systems have been able to map Earth’s landscapes for decades. Radar signals can even penetrate cloud cover and forest canopy. However, seawater is much too absorbent for imaging into the water,” said study leader Amin Arbabian, associate professor of electrical engineering at the Stanford School of Engineering. “Our goal is to develop a more robust system that can image even through murky water.”
Loss of energy
Oceans cover about 70 percent of the earth’s surface, but only a small part of their depths has been subjected to high-resolution imaging and mapping.
The main barrier has to do with physics: sound waves, for example, cannot pass from air into water, or vice versa, without losing most of their energy – more than 99.9 percent – through reflection at the boundary between the two media. A system that tries to see underwater using sound waves traveling from air into water and back into air is subject to this loss twice – resulting in an energy loss of 99.9999 percent.
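The arithmetic behind that 99.9999 percent figure is simple compounding of the one-way loss; a quick sketch:

```python
# Illustrative arithmetic for the round-trip loss described above.
# At an air-water boundary, only about 0.1% of a sound wave's energy
# crosses; the rest (~99.9%) is reflected. A round trip from air to
# water and back crosses the boundary twice.
one_way_transmitted = 0.001               # ~0.1% of energy crosses once
round_trip = one_way_transmitted ** 2     # crossing happens twice

print(f"Round-trip energy transmitted: {round_trip:.6%}")  # 0.000100%
print(f"Round-trip energy lost: {1 - round_trip:.4%}")     # 99.9999%
```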
Similarly, electromagnetic radiation – an umbrella term that includes light, microwave, and radar signals – also loses energy when passing from one physical medium into another, although the mechanism differs from that of sound. “Light also loses some energy through reflection, but most of the energy loss is due to absorption by the water,” said the study’s lead author, Aidan Fitzpatrick, a Stanford PhD student in electrical engineering. Incidentally, this absorption is also the reason sunlight cannot penetrate to the ocean depths and why your smartphone – which relies on cellular signals, a form of electromagnetic radiation – cannot receive calls underwater.

The experimental setup of the photoacoustic airborne sonar system in the laboratory (left). A Stanford “S” submerged under water (center) is reconstructed in 3D with reflected ultrasound waves (right). Photo credit: Aidan Fitzpatrick
The upshot of all this is that oceans cannot be mapped from the air and from space the way land can. To date, most underwater mapping has been done by attaching sonar systems to ships that trawl across a region of interest. However, this technique is slow, costly, and inefficient for covering large areas.
An invisible puzzle
Enter the Photoacoustic Airborne Sonar System (PASS), which combines light and sound to breach the air-water interface. The idea grew out of another project that used microwaves for non-contact imaging and characterization of underground plant roots. Some of PASS’s instruments were originally developed for that purpose in collaboration with the laboratory of Stanford electrical engineering professor Butrus Khuri-Yakub.
At its core, PASS plays to the individual strengths of light and sound. “If we use light in the air, where light travels well, and sound in the water, where sound travels well, we get the best of both worlds,” said Fitzpatrick.
To do this, the system first fires a laser from the air that is absorbed at the surface of the water. As the laser is absorbed, it generates ultrasonic waves that travel down through the water column and reflect off underwater objects before racing back toward the surface.
The returning sound waves still lose most of their energy when they break through the water’s surface. But by generating the sound waves underwater with lasers, the researchers keep that energy loss from happening twice.
“We have developed a system that is sensitive enough to make up for a loss of this magnitude and still enable signal detection and imaging,” said Arbabian.

An animation showing the 3D image of the submerged object recreated using reflected ultrasonic waves. Photo credit: Aidan Fitzpatrick
The reflected ultrasonic waves are recorded by instruments called transducers. Software is then used to piece the acoustic signals back together, like an invisible jigsaw puzzle, and reconstruct a three-dimensional image of the submerged feature or object.
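The article does not name the reconstruction method, but delay-and-sum beamforming is a standard way transducer recordings are assembled into a 3D image, and the sketch below illustrates the idea. All function names, positions, and values here are illustrative assumptions, not the study’s actual (more sophisticated) algorithms:

```python
import numpy as np

# Approximate speed of sound in water (textbook value, m/s).
SPEED_OF_SOUND_WATER = 1480.0

def delay_and_sum(recordings, times, sensor_positions, voxels):
    """Illustrative delay-and-sum reconstruction.

    recordings: (n_sensors, n_samples) transducer signals
    times: (n_samples,) sample times in seconds
    sensor_positions, voxels: (n, 3) coordinates in meters
    Returns one brightness value per voxel.
    """
    image = np.zeros(len(voxels))
    for i, voxel in enumerate(voxels):
        # Expected round-trip travel time from the surface down to the
        # voxel and back up to each sensor.
        dists = np.linalg.norm(sensor_positions - voxel, axis=1)
        delays = 2 * dists / SPEED_OF_SOUND_WATER
        # Sample each sensor's recording at its expected arrival time;
        # echoes from a real reflector add up coherently.
        for rec, delay in zip(recordings, delays):
            image[i] += np.interp(delay, times, rec)
    return image
```

Brightness peaks where the delayed signals from all sensors reinforce one another, which is how the scattered recordings become a coherent picture.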
“Just as light refracts, or ‘bends,’ when it passes through water or any medium denser than air, so does ultrasound,” Arbabian explained. “Our image reconstruction algorithms correct for this bending that occurs when the ultrasonic waves pass from the water into the air.”
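The bending the quote describes follows Snell’s law for sound, sin(θ_air)/c_air = sin(θ_water)/c_water; a minimal sketch of that correction, using textbook sound speeds rather than values from the study:

```python
import math

# Approximate sound speeds (textbook values, m/s).
C_WATER = 1480.0
C_AIR = 343.0

def refracted_angle_into_air(theta_water_deg):
    """Angle (measured from the surface normal) of an ultrasonic ray
    after it crosses from water into air, via Snell's law for sound:
    sin(theta_air)/c_air = sin(theta_water)/c_water."""
    sin_air = math.sin(math.radians(theta_water_deg)) * C_AIR / C_WATER
    return math.degrees(math.asin(sin_air))

# Sound travels from a fast medium (water) into a slow one (air), so
# rays bend toward the normal:
print(refracted_angle_into_air(30.0))  # ≈ 6.65 degrees
```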
Drone ocean surveys
Conventional sonar systems can penetrate to depths of hundreds to thousands of meters, and the researchers expect that their system should eventually be able to reach similar depths.
So far, PASS has been tested only in the laboratory, in a container the size of a large fish tank. “Current experiments use static water, but we are now working on dealing with water waves,” said Fitzpatrick. “This is a challenging but, we think, feasible problem.”
The next step, the researchers say, will be to conduct tests in a larger setting, and eventually in an open-water setting.
“Our vision for this technology is on board a helicopter or a drone,” said Fitzpatrick. “We expect the system to be able to fly ten meters above the water.”
Reference: “An Airborne Sonar System for Underwater Remote Sensing and Imaging” by Aidan Fitzpatrick, Ajay Singhvi and Amin Arbabian, October 16, 2020, IEEE Access.
DOI: 10.1109/ACCESS.2020.3031808