Abstract

Calibration of time-interleaved analog-to-digital converters (TI-ADCs) becomes both more necessary and more complex as the number of interleaved channels grows. In this study, we develop a generic representation of the referenceless timing-mismatch calibration scheme for N-channel TI-ADCs. We compare cross-correlation-based and mean-absolute-difference-based approaches and investigate how increasing the number of channels affects performance. Using both mathematical analysis and simulations, we reveal the degradation mechanisms and discuss the extent to which this scheme remains applicable.
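To make the comparison concrete, the following is a minimal, hypothetical sketch (not the paper's actual algorithm) of the two skew-sensitive statistics the abstract names. It simulates an N-channel TI-ADC sampling a sinusoid, injects a timing skew on one channel, and evaluates the cross-correlation and the mean absolute difference between adjacent channel outputs; both statistics shift monotonically with a small relative skew, which is what a referenceless calibration loop can exploit. The sampling model, frequency, and skew values are illustrative assumptions.

```python
import numpy as np

def ti_adc_samples(n_ch=4, n_per_ch=4096, f0=0.0123, skews=None):
    """Simulate an N-channel TI-ADC digitizing a sinusoid.

    skews: per-channel timing offsets in units of the aggregate sample
    period (assumed model: channel k nominally samples at t = m*N + k).
    """
    if skews is None:
        skews = np.zeros(n_ch)
    m = np.arange(n_per_ch)
    chans = []
    for k in range(n_ch):
        t = m * n_ch + k + skews[k]   # skewed sampling instants
        chans.append(np.sin(2 * np.pi * f0 * t))
    return chans

def adjacent_metrics(chans):
    """Cross-correlation and mean absolute difference of adjacent channels."""
    xcorr, mad = [], []
    for k in range(len(chans) - 1):
        a, b = chans[k], chans[k + 1]
        xcorr.append(np.mean(a * b))        # correlation-based statistic
        mad.append(np.mean(np.abs(a - b)))  # MAD-based statistic
    return np.array(xcorr), np.array(mad)

# Ideal converter vs. one with an (exaggerated) skew on channel 1.
ideal = ti_adc_samples(skews=[0.0, 0.0, 0.0, 0.0])
skewed = ti_adc_samples(skews=[0.0, 0.5, 0.0, 0.0])
xc0, mad0 = adjacent_metrics(ideal)
xc1, mad1 = adjacent_metrics(skewed)

# Skew on channel 1 widens the (0,1) gap and narrows the (1,2) gap,
# so the pairwise statistics move in opposite directions.
print("ideal  MAD per pair:", mad0)
print("skewed MAD per pair:", mad1)
```

A calibration loop would adjust each channel's delay until the pairwise statistics equalize; note that with more channels the number of pairwise statistics grows, which is one way increasing N complicates the scheme.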
