Abstract

Among advanced electronic designs, vision systems are perhaps those that have undergone the most dramatic technological revolutions of the last decades. Recently, a new family of vision systems aimed at three-dimensional (3D) imaging has emerged. Many applications that can use 3D image sensors already exist or will soon appear, and the image sensor community is responding with both conventional and unconventional solutions. This thesis presents a new concept for imaging and, in particular, for 3D imaging, based on single-photon detection. In our approach, single-photon detection is performed by a device known as the single-photon avalanche diode (SPAD). Large arrays of SPADs were demonstrated for the first time in this thesis, with time resolutions consistently in the picosecond range. It has thus become feasible to design solid-state 3D imagers with millimeter depth perception based on the time-of-flight principle.

Large arrays of SPAD devices and associated circuitry were initially investigated in a 0.8µm CMOS technology, in which extremely low-noise detectors were implemented. In arrays of 64x48 pixels, the median dark count rate was 0.5Hz per square micrometer of active area, and the maximum photon detection probability was 26% at an optical wavelength of 550nm. Subsequently, a technology migration of SPADs towards deep-submicron CMOS was successfully achieved: for the first time, SPADs in 0.35µm and 0.13µm CMOS technologies were designed and characterized. Outstanding photon detection probabilities of up to 35.4% and 34.5% were measured in devices manufactured in 0.35µm and 0.13µm CMOS, respectively. The median dark count rate, however, increased with technology scaling: it was 18Hz/µm² over an array of 128x128 pixels in 0.35µm CMOS, and 385Hz/µm² for a single device in 0.13µm CMOS.

To enable high-performance system-on-a-chip implementations using single-photon detectors, appropriate front-end and ancillary circuits for the detection, measurement, and storage of time-of-flight data were introduced. Passive quenching circuits using a single MOS transistor were investigated for the first time; with the proposed biasing regime, the single-MOS quenching circuit provides a well-defined dead time and leads to a high pixel fill factor. Active recharge is achieved in this thesis by means of a new dual-threshold quenching and recharge circuit, which implements a hold-off time with a low in-pixel transistor count, thus reducing the afterpulsing probability with little or no penalty in fill factor.

3D image sensors based on a pulsed detection technique known as time-correlated single-photon counting (TCSPC) were investigated. In addition, a technique for range imaging with SPADs operating on continuously modulated optical signals was studied; this technique, known as single-photon synchronous detection (SPSD), was invented in the course of this thesis and is introduced here for the first time. To demonstrate the potential of TCSPC in solid-state 3D imaging, a complete theoretical and experimental investigation was conducted: an analytical model for the evaluation of TCSPC-based ranging performance was introduced, and a fully integrated TCSPC system for single-photon time-of-flight evaluation was implemented in CMOS for the first time.
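As a purely illustrative aside (not material from the thesis), the following Python sketch shows how a TCSPC histogram of photon arrival times could be turned into a distance estimate: each detected photon is time-stamped relative to the laser pulse, a histogram of these time stamps is accumulated, and the histogram peak marks the round-trip time of flight, from which distance follows as d = c·t/2. The function name, bin width, peak-picking strategy, and example numbers are assumptions made for illustration.

import numpy as np

C = 299_792_458.0  # speed of light in vacuum (m/s)

def tcspc_distance(arrival_times_s, bin_width_s=100e-12):
    # Accumulate a histogram of photon arrival times measured relative
    # to each laser pulse; the peak bin marks the round-trip time of
    # flight of the reflected pulse.
    n_bins = int(np.ceil(np.max(arrival_times_s) / bin_width_s)) + 1
    hist, edges = np.histogram(
        arrival_times_s, bins=n_bins, range=(0.0, n_bins * bin_width_s))
    peak = int(np.argmax(hist))
    t_flight = 0.5 * (edges[peak] + edges[peak + 1])  # bin center (s)
    return C * t_flight / 2.0  # halve: the light travels out and back

# Example: photon time stamps clustered around 6.67ns correspond to
# roughly 1m of distance.
times = np.random.normal(6.67e-9, 100e-12, size=2000)
print(tcspc_distance(times))  # ~1.0 (meters)

In a real TCSPC rangefinder the histogram also contains a uniform background of dark counts and ambient photons, which is one reason the thesis develops a dedicated analytical model of ranging performance rather than relying on a simple peak search.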
With this system, depth maps of 3D scenes were acquired with millimeter precision using only 1mW of laser power and an integration time of 50ms. Thanks to SPAD technology, accurate distance measurements are possible even at extremely low photon count rates of a few thousand counts per second. The maximum non-linearity in the distance measurement was 9mm over the full measurement range, and the measurement uncertainty (1σ) at the farthest distance was 5.2mm.

The SPSD approach was also investigated theoretically and experimentally. The design of the first fully parallel single-photon image sensor in CMOS was introduced; this sensor enabled, for the first time, the acquisition of real-time 3D images based on CMOS SPADs. A 3D camera prototype with a 50° field of view was designed and built around the SPSD image sensor. Experimental results confirmed the effectiveness of the SPSD rangefinder: over a range of a few meters, the maximum non-linearity error was 11cm and the maximum repeatability error was 3.8cm.
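For readers unfamiliar with continuous-modulation ranging, the following sketch shows one common four-bin demodulation used in continuous-wave time-of-flight systems, which is the kind of computation the photon counts of an SPSD pixel feed into. The bin convention, function name, and 30MHz modulation frequency are assumptions chosen for illustration, not details taken from the thesis.

import math

C = 299_792_458.0  # speed of light in vacuum (m/s)

def spsd_distance(c0, c1, c2, c3, f_mod_hz):
    # c0..c3: photon counts accumulated in four windows, each spanning
    # a quarter of the modulation period and synchronous with the
    # modulated light source. The phase of the reflected signal follows
    # from standard four-bin demodulation, and phase maps to distance
    # through the modulation frequency.
    phase = math.atan2(c3 - c1, c0 - c2)       # radians
    if phase < 0.0:
        phase += 2.0 * math.pi                 # wrap into [0, 2*pi)
    return C * phase / (4.0 * math.pi * f_mod_hz)

# Example with an assumed 30MHz modulation (unambiguous range ~5m).
print(spsd_distance(900, 400, 100, 600, 30e6))  # ~0.2 (meters)

Because the phase wraps every 2π, the unambiguous range of such a scheme is c/(2·f_mod), which is consistent with the few-meter operating range reported above.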
