Stanford Engineers Combine Light and Sound to See Underwater

Stanford University engineers have developed an airborne method for imaging underwater objects by combining light and sound to break through the seemingly impassable barrier at the interface of air and water.

The researchers envision their hybrid optical-acoustic system one day being used to conduct drone-based biological marine surveys from the air, carry out large-scale aerial searches of sunken ships and planes, and map the ocean depths with a similar speed and level of detail as Earth’s landscapes. Their “Photoacoustic Airborne Sonar System” is detailed in a recent study published in the journal IEEE Access.

“Airborne and spaceborne radar and laser-based, or LIDAR, systems have been able to map Earth’s landscapes for decades. Radar signals are even able to penetrate cloud coverage and canopy coverage. However, seawater is much too absorptive for imaging into the water,” said study leader Amin Arbabian, an associate professor of electrical engineering in Stanford’s School of Engineering. “Our goal is to develop a more robust system which can image even through murky water.”

Energy loss

Oceans cover about 70 percent of the Earth’s surface, yet only a small fraction of their depths have been subjected to high-resolution imaging and mapping.

Read more at Stanford School of Engineering

Image: An artist's rendition of the photoacoustic airborne sonar system operating from a drone to sense and image underwater objects. (Credit: Kindea Labs)