Physical Sciences and Mathematics Commons

Physics

Chapman University

Computational imaging

Articles 1 - 4 of 4

Full-Text Articles in Physical Sciences and Mathematics

Frequency-Modulated Continuous-Wave Lidar Compressive Depth-Mapping, Daniel J. Lum, Samuel H. Knarr, John C. Howell Jun 2018

Mathematics, Physics, and Computer Science Faculty Articles and Research

We present an inexpensive architecture for converting a frequency-modulated continuous-wave LiDAR system into a compressive-sensing-based depth-mapping camera. Instead of raster scanning to obtain depth-maps, compressive sensing is used to significantly reduce the number of measurements. Ideally, our approach requires two difference detectors. Due to the large flux entering the detectors, the signal amplification from heterodyne detection, and the effects of background subtraction from compressive sensing, the system can obtain higher signal-to-noise ratios than detector-array-based schemes while scanning a scene faster than is possible through raster-scanning. Moreover, by efficiently storing only 2m data points from m < n measurements of an n-pixel scene, we can easily extract depths by solving only two linear equations with efficient convex-optimization methods.
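
As a rough illustration of the recovery step described above, the sketch below simulates compressive depth extraction on a toy one-dimensional scene. Everything here is an assumption made for illustration: the ±1 sensing matrix, the pixel-sparse scene, and the FISTA-plus-debias solver standing in for the paper's convex-optimization methods. Dividing a depth-weighted reconstruction by an intensity reconstruction is one plausible reading of the "two linear equations" mentioned above, not necessarily the authors' exact formulation.

```python
# Minimal sketch (not the authors' code): compressive depth extraction on a toy scene.
import numpy as np

rng = np.random.default_rng(0)
n, m = 256, 96                                 # n-pixel scene, m < n measurements

# Toy scene: three reflective patches (reflectivity r) at depths d (metres).
r = np.zeros(n)
d = np.zeros(n)
r[[20, 100, 180]] = [1.0, 0.8, 0.6]
d[[20, 100, 180]] = [2.0, 5.0, 9.0]

A = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)
y1 = A @ r                                     # projections of the intensity image
y2 = A @ (r * d)                               # projections of the depth-weighted image

def fista(A, y, lam=0.01, iters=2000):
    """Solve min_x 0.5*||Ax - y||^2 + lam*||x||_1 with FISTA."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    z = x.copy()
    t = 1.0
    for _ in range(iters):
        w = z - A.T @ (A @ z - y) / L                                # gradient step
        x_new = np.sign(w) * np.maximum(np.abs(w) - lam / L, 0.0)    # soft threshold
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = x_new + (t - 1.0) / t_new * (x_new - x)                  # momentum step
        x, t = x_new, t_new
    return x

def recover(A, y):
    """l1 recovery (FISTA) followed by a least-squares refit on the detected support."""
    x = fista(A, y)
    support = np.flatnonzero(np.abs(x) > 0.1 * np.abs(x).max())
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    out = np.zeros(A.shape[1])
    out[support] = coef
    return out

x1, x2 = recover(A, y1), recover(A, y2)
mask = x1 > 0.1 * x1.max()                     # only trust pixels with real signal
depth = np.zeros(n)
depth[mask] = x2[mask] / x1[mask]              # depth = depth-weighted / intensity
print(np.round(depth[[20, 100, 180]], 2))      # should be close to [2, 5, 9]
```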


Fast Hadamard Transforms For Compressive Sensing Of Joint Systems: Measurement Of A 3.2 Million-Dimensional Bi-Photon Probability Distribution, Daniel J. Lum, Samuel H. Knarr, John C. Howell Oct 2015

Mathematics, Physics, and Computer Science Faculty Articles and Research

We demonstrate how to efficiently implement extremely high-dimensional compressive imaging of a bi-photon probability distribution. Our method uses fast-Hadamard-transform Kronecker-based compressive sensing to acquire the joint space distribution. We list, in detail, the operations necessary to enable fast-transform-based matrix-vector operations in the joint space to reconstruct a 16.8 million-dimensional image in less than 10 minutes. Within a subspace of that image exists a 3.2 million-dimensional bi-photon probability distribution. In addition, we demonstrate how the marginal distributions can improve the accuracy of joint-space distribution reconstructions.
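
The abstract above hinges on applying a Kronecker-structured joint-space sensing operator through fast transforms rather than ever forming the full matrix. The sketch below shows that trick at toy scale using the standard identity (H ⊗ H) vec(X) = vec(H X Hᵀ); the hand-rolled fast Walsh-Hadamard transform, the 32 × 32 joint-space size, and the single-point "bi-photon" distribution are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch: joint-space Hadamard sensing via two fast transforms.
import numpy as np
from scipy.linalg import hadamard

def fwht(x):
    """Fast Walsh-Hadamard transform (Sylvester ordering); length must be a power of 2."""
    x = x.copy()
    h = 1
    while h < len(x):
        for i in range(0, len(x), 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b
            x[i + h:i + 2 * h] = a - b
        h *= 2
    return x

n = 32                                   # size of each one-dimensional subspace (toy scale)
X = np.zeros((n, n))
X[10, 20] = 1.0                          # toy joint-space "bi-photon" distribution

# (H kron H) vec(X) = vec(H X H^T), and H is symmetric, so two fast transforms,
# one per subspace, stand in for the n^2 x n^2 joint-space operator.
Y = np.apply_along_axis(fwht, 0, X)      # transform every column (first subspace)
Y = np.apply_along_axis(fwht, 1, Y)      # transform every row (second subspace)

# Sanity check against the explicit Kronecker operator, affordable only at toy scale.
H = hadamard(n).astype(float)
assert np.allclose(np.kron(H, H) @ X.ravel(), (H @ X @ H.T).ravel())
assert np.allclose(Y, H @ X @ H.T)
print("fast joint-space transform matches the explicit Kronecker operator")
```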


Compressive Wavefront Sensing With Weak Values, Gregory A. Howland, Daniel J. Lum, John C. Howell Jun 2014

Mathematics, Physics, and Computer Science Faculty Articles and Research

We demonstrate a wavefront sensor that unites weak measurement and the compressive-sensing, single-pixel camera. Using a high-resolution spatial light modulator (SLM) as a variable waveplate, we weakly couple an optical field’s transverse-position and polarization degrees of freedom. By placing random, binary patterns on the SLM, polarization serves as a meter for directly measuring random projections of the wavefront’s real and imaginary components. Compressive-sensing optimization techniques can then recover the wavefront. We acquire high-quality, 256 × 256 pixel images of the wavefront from only 10,000 projections. Photon-counting detectors give sub-picowatt sensitivity.
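
A minimal toy of the measurement model sketched in this abstract: binary patterns act on a complex field, and two meter readouts supply the real and imaginary parts of each projection, which are then combined and inverted. The one-dimensional Gaussian test field, the Hadamard-derived binary patterns (standing in for the paper's random patterns), and the full-sampling matrix inversion are all assumptions; in the compressive regime described above, the inversion step would be replaced by an l1- or TV-based optimization.

```python
# Minimal sketch (toy measurement model, not the authors' apparatus).
import numpy as np
from scipy.linalg import hadamard

n = 256                                               # pixels in a toy one-dimensional wavefront
u = np.linspace(-1.0, 1.0, n)
psi = np.exp(-u**2 / 0.2) * np.exp(1j * 4 * np.pi * u**2)   # Gaussian amplitude, quadratic phase

H = hadamard(n).astype(float)
P = (H + 1.0) / 2.0                                   # 0/1 binary SLM patterns (Hadamard-derived stand-in)

re_meter = P @ psi.real                               # polarization readout proportional to Re of each projection
im_meter = P @ psi.imag                               # second readout proportional to Im of each projection
proj = re_meter + 1j * im_meter                       # complex projections of the wavefront

# Full sampling here, so a direct inversion suffices; with m < n patterns the
# same data would instead feed a compressive-sensing optimization.
psi_rec = np.linalg.solve(P, proj)
print(np.allclose(psi_rec, psi))                      # expect True
```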


Photon Counting Compressive Depth Mapping, Gregory A. Howland, Daniel J. Lum, Matthew R. Ware, John C. Howell Sep 2013

Mathematics, Physics, and Computer Science Faculty Articles and Research

We demonstrate a compressed sensing, photon counting lidar system based on the single-pixel camera. Our technique recovers both depth and intensity maps from a single under-sampled set of incoherent, linear projections of a scene of interest at ultra-low light levels around 0.5 picowatts. Only two-dimensional reconstructions are required to image a three-dimensional scene. We demonstrate intensity imaging and depth mapping at 256 × 256 pixel transverse resolution with acquisition times as short as 3 seconds. We also show novelty filtering, reconstructing only the difference between two instances of a scene. Finally, we acquire 32 × 32 pixel real-time video for …
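
As a hedged illustration of the novelty-filtering idea mentioned in this abstract (not the paper's pipeline), the sketch below exploits the linearity of single-pixel measurements: subtracting two measurement vectors yields projections of only the change in the scene, which is far sparser than either frame. Orthogonal matching pursuit, the ±1 differential patterns, and the noiseless toy scene are assumptions standing in for the paper's photon-counting hardware and convex-optimization recovery.

```python
# Minimal sketch: novelty filtering with a single-pixel measurement model.
import numpy as np

rng = np.random.default_rng(1)
n, m = 256, 64                                # n-pixel scene, m < n projections

scene1 = rng.random(n)                        # first instance of the scene
scene2 = scene1.copy()
scene2[[30, 200]] += [0.7, -0.4]              # something changed between frames

# Differential +/-1 patterns (a pattern minus its complement), an assumption here.
A = rng.choice([-1.0, 1.0], size=(m, n))
y1, y2 = A @ scene1, A @ scene2               # measurement vectors for each frame

# Linearity is what makes novelty filtering possible:
assert np.allclose(y2 - y1, A @ (scene2 - scene1))

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily build a k-sparse solution to Ax = y."""
    x, r, support = np.zeros(A.shape[1]), y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ r))))
        cols = sorted(set(support))
        coef, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
        x[:] = 0.0
        x[cols] = coef
        r = y - A @ x
    return x

novelty = omp(A, y2 - y1, k=2)                # reconstruct only the difference image
print(np.flatnonzero(np.abs(novelty) > 1e-6)) # expected to flag pixels 30 and 200
```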