Article ID: 19.20210571
This paper presents a digital background calibration technique that deliberately injects multiple dither signals to correct conversion errors caused by interstage gain error and nonlinearity in high-speed pipelined analog-to-digital converters (ADCs). Two independent, zero-mean pseudo-random signals are injected alternately into the multiplying digital-to-analog converter (MDAC) to estimate these errors. Least-mean-square (LMS) algorithms are adopted to rapidly acquire and track the calibration parameters. Simulation results are presented for an 800 MS/s 12-bit pipelined ADC in a 40 nm CMOS technology that, like the designs described by Murmann and Boser [1], Keane et al. [2], and Nan Sun [3], uses low-gain amplifiers. With calibration, the signal-to-noise-and-distortion ratio (SNDR) and spurious-free dynamic range (SFDR) improve from 32.74 dB and 43.61 dB to 70.54 dB and 89.8 dB, respectively. The proposed calibration technique offers simple implementation, tolerates pseudo-random dither of arbitrary amplitude, and places no restriction on the ADC input signal.
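To make the calibration principle concrete, the following minimal Python sketch illustrates the LMS correlation idea for the interstage gain error alone, using a single ±1 dither and an idealized single-stage behavioral model. All numeric parameters (G_IDEAL, G_ACTUAL, V_D, MU) are illustrative assumptions, not values from the paper, and the paper's full method additionally alternates a second dither to estimate amplifier nonlinearity, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) parameters -- not values from the paper.
G_IDEAL = 4.0        # nominal interstage gain used by the digital correction
G_ACTUAL = 3.70      # actual MDAC gain (low-gain amplifier -> gain error)
V_D = 0.1            # dither amplitude (arbitrary, per the paper's claim)
MU = 1e-3            # LMS step size
N = 200_000          # number of background samples

vin = rng.uniform(-0.5, 0.5, N)      # unrestricted ADC input
d = rng.choice((-1.0, 1.0), N)       # zero-mean pseudo-random dither

g_est = G_IDEAL                      # calibration parameter adapted in the background
for k in range(N):
    residue = G_ACTUAL * (vin[k] + V_D * d[k])    # MDAC residue (sub-ADC decision omitted)
    corrected = residue / g_est - V_D * d[k]      # digital gain correction + dither removal
    # Any dither leakage left in 'corrected' correlates with d[k];
    # the LMS update nulls that correlation, driving g_est toward G_ACTUAL.
    g_est += MU * corrected * d[k]

print(f"estimated interstage gain: {g_est:.3f} (actual {G_ACTUAL})")
```

Because the dither is generated independently of the input, the input-dependent term in the correlation averages to zero, so the estimate converges regardless of the applied signal; this is the mechanism behind the claim that the calibration places no restriction on the ADC input.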