Open Access. Powered by Scholars. Published by Universities.®
Physical Sciences and Mathematics Commons™
Articles 1 - 2 of 2
Full-Text Articles in Physical Sciences and Mathematics
Frequency-Modulated Continuous-Wave Lidar Compressive Depth-Mapping, Daniel J. Lum, Samuel H. Knarr, John C. Howell
Mathematics, Physics, and Computer Science Faculty Articles and Research
We present an inexpensive architecture for converting a frequency-modulated continuous-wave LiDAR system into a compressive-sensing-based depth-mapping camera. Instead of raster scanning to obtain depth-maps, compressive sensing is used to significantly reduce the number of measurements. Ideally, our approach requires two difference detectors. Due to the large flux entering the detectors, the signal amplification from heterodyne detection, and the effects of background subtraction from compressive sensing, the system can obtain higher signal-to-noise ratios than detector-array-based schemes while scanning a scene faster than is possible through raster scanning. Moreover, by efficiently storing only 2m data points from m < n measurements of an n-pixel scene, we can easily extract depths by solving only two linear equations with efficient convex-optimization methods.
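The core idea above — recovering an n-pixel scene from m < n linear projections via convex optimization — can be sketched with a minimal sparse-recovery demo. This is not the authors' solver; it is a generic iterative shrinkage-thresholding (ISTA) routine on a toy 1-D signal, with all sizes (n = 256 pixels, m = 96 measurements, 5 nonzero entries) chosen for illustration only.

```python
import numpy as np

def ista(A, y, lam=0.01, iters=2000):
    """Solve min_x ||Ax - y||^2 / 2 + lam * ||x||_1 by proximal gradient descent."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - A.T @ (A @ x - y) / L          # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(0)
n, m, k = 256, 96, 5                           # n-pixel scene, m < n measurements
x_true = np.zeros(n)                           # toy "scene": k bright pixels
x_true[rng.choice(n, k, replace=False)] = rng.normal(3.0, 0.5, k)

A = rng.normal(size=(m, n)) / np.sqrt(m)       # random sensing patterns
y = A @ x_true                                 # m linear projections of the scene
x_hat = ista(A, y)                             # recover all n pixels from m values
```

The same mechanics extend to the depth-mapping setting: each measurement is one structured projection of the scene, and a convex solver fills in the pixels that were never individually sampled.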
Photon Counting Compressive Depth Mapping, Gregory A. Howland, Daniel J. Lum, Matthew R. Ware, John C. Howell
Mathematics, Physics, and Computer Science Faculty Articles and Research
We demonstrate a compressed-sensing, photon-counting lidar system based on the single-pixel camera. Our technique recovers both depth and intensity maps from a single under-sampled set of incoherent, linear projections of a scene of interest at ultra-low light levels around 0.5 picowatts. Only two-dimensional reconstructions are required to image a three-dimensional scene. We demonstrate intensity imaging and depth mapping at 256 × 256 pixel transverse resolution with acquisition times as short as 3 seconds. We also show novelty filtering, reconstructing only the difference between two instances of a scene. Finally, we acquire 32 × 32 pixel real-time video for …
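The claim that "only two-dimensional reconstructions are required to image a three-dimensional scene" rests on a ratio trick: reconstruct an intensity map and an intensity-weighted time-of-flight map, then divide to get depth per pixel. The sketch below illustrates that trick on a toy 64-pixel scene with full Hadamard patterns (so the reconstruction is an exact inverse transform); the paper's system would instead use m < n patterns and a compressive solver. All scene values and pattern choices here are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Sylvester-construction ±1 Hadamard patterns, n = 2^6 = 64 pixels
H = np.array([[1.0]])
for _ in range(6):
    H = np.kron(H, np.array([[1.0, 1.0], [1.0, -1.0]]))
n = H.shape[0]

refl = np.zeros(n); refl[10:20] = 1.0         # toy reflectivity (intensity) map
depth = np.zeros(n); depth[10:20] = 5.0       # toy depth map (metres)

# Per pattern, the detector yields total counts and time-weighted counts:
y_int = H @ refl                              # ∝ intensity projections
y_tof = H @ (refl * depth)                    # ∝ (intensity × depth) projections

# Two independent 2-D-style reconstructions (exact here since H.T @ H = n*I;
# with m < n measurements a sparse/TV solver would replace this step)
refl_hat = H.T @ y_int / n
prod_hat = H.T @ y_tof / n

mask = refl_hat > 0.5                         # depth is defined only where light returns
depth_hat = np.zeros(n)
depth_hat[mask] = prod_hat[mask] / refl_hat[mask]
```

Dividing the two reconstructions pixel-by-pixel yields depth without ever reconstructing a 3-D volume, which is what keeps the per-frame computation to two linear inverse problems.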