We’re not quite at Superman’s level of X-ray vision yet, but researchers at Stanford University have taken us one step closer with laser light and an algorithm that’s able to leap foamy barriers in a single bound (well, maybe multiple calculations, but that doesn’t sound as good).
In creating the penetrating vision system, the researchers gave a slight upgrade to hardware similar to what already guides autonomous cars. Those systems rely on a technology known as LiDAR, or light detection and ranging, which sends laser pulses into the area surrounding the vehicle and measures the time each pulse takes to return, building a picture of the world around it. The problem is that in heavy fog those pulses scatter, and the system becomes less effective.
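The ranging principle behind LiDAR comes down to one formula: distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the 100-nanosecond pulse timing below is illustrative, not a figure from the paper):

```python
# Illustrative sketch of LiDAR time-of-flight ranging (not the paper's code).
# A pulse travels out to a surface and back, so one-way distance is
# (speed of light * round-trip time) / 2.

C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Convert a pulse's measured round-trip time into a one-way distance in meters."""
    return C * t_seconds / 2.0

# A pulse that returns after 100 nanoseconds bounced off something ~15 m away.
print(distance_from_round_trip(100e-9))  # ~14.99 m
```

Fog breaks this picture because scattered photons take longer, indirect paths back to the detector, smearing out the timing signal the calculation depends on.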
Seeking to overcome the problem of scattering, the researchers paired a laser with a sensor made up of single-photon avalanche diodes—detectors sensitive enough to register individual photons. They then linked the system to a processor armed with an algorithm that can take just a few photons and reconstruct the shape of the object they bounced off.
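The paper's reconstruction algorithm solves a full 3D inverse problem and is far more involved, but the core intuition—pulling signal out of a handful of photon timestamps—can be shown with a toy example. Everything below, including the bin width and photon times, is a hypothetical sketch, not the authors' method:

```python
# Toy sketch (NOT the paper's algorithm): estimate an object's depth from a
# handful of photon arrival times. Photons that bounced off the object
# cluster at one round-trip time; stray scattered photons land elsewhere.
from collections import Counter

C = 299_792_458.0   # speed of light in m/s
BIN = 1e-10         # 100-picosecond timing bins (illustrative SPAD-scale resolution)

def estimate_depth(arrival_times_s):
    """Bin sparse photon timestamps and read the depth off the most-hit bin."""
    counts = Counter(int(t / BIN) for t in arrival_times_s)
    peak_bin, _ = counts.most_common(1)[0]   # bin with the most photon hits
    peak_time = peak_bin * BIN               # round-trip time of that cluster
    return C * peak_time / 2.0               # convert to one-way depth in meters

# Four photons clustered near a 20 ns round trip, plus one stray noise photon.
photons = [20.02e-9, 20.05e-9, 19.97e-9, 20.03e-9, 7.3e-9]
print(estimate_depth(photons))  # roughly 3 m
```

The real system faces a much harder problem—the foam scatters every photon on the way in and out—which is why a dedicated reconstruction algorithm, rather than simple peak-finding, is needed.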
“These sensing systems are devices with lasers, detectors, and advanced algorithms, which puts them in an interdisciplinary research area between hardware and physics and applied math,” said Gordon Wetzstein, assistant professor of electrical engineering at Stanford and senior author of the paper. “All of those are critical, core fields in this work, and that’s what’s the most exciting for me.”
To test the system, the researchers put a mannequin and various letter shapes behind a one-inch-thick foam wall and fired their laser at it. A few photons made it through the foam, bounced off the hidden objects, and traveled back out through the wall; the sensor captured them, and the algorithm successfully rebuilt the objects, essentially seeing through the foam.
“You couldn’t see through the foam with your own eyes, and even just looking at the photon measurements from the detector, you don’t see anything,” said David Lindell, a graduate student in electrical engineering and lead author of the paper. “But, with just a handful of photons, the reconstruction algorithm can expose these objects—and you can see not only what they look like, but where they are in 3D space.”
Perhaps a View of a Krypton-Like Planet One Day?
The researchers hope that their invention can someday be scaled up to not only help autonomous cars find their way home on a foggy night but perhaps to help spacecraft see through dense planetary atmospheres to reveal what lies beneath them. As a next step, they plan to simulate other environments that scatter light to explore the technology’s potential.
“We’re excited to push this further with other types of scattering geometries,” said Lindell. “So, not just objects hidden behind a thick slab of material but objects that are embedded in densely scattering material, which would be like seeing an object that’s surrounded by fog.”
The work has been published in the peer-reviewed journal Nature Communications.