IEICE Electronics Express
Online ISSN : 1349-2543
ISSN-L : 1349-2543

This article has now been updated. Please use the final version.

A Digital Background Calibration Technique for Interstage Gain Nonlinearity in Pipelined ADCs
Bowen Ding, Peng Miao, Fei Li, Weiqi Gu
Advance online publication

Article ID: 19.20210571

Abstract

This paper presents a digital background calibration technique that exploits multiple dithers to correct conversion errors caused by interstage gain error and gain nonlinearity in high-speed pipelined analog-to-digital converters (ADCs). Two independent, zero-mean pseudo-random signals are injected alternately into the multiplying digital-to-analog converter (MDAC) to estimate these errors. Least-mean-square (LMS) algorithms are adopted to quickly estimate and track the calibration parameters. Simulation results are presented for an 800 MSps 12-bit pipelined ADC in a 40 nm CMOS technology, similar to the low-gain-amplifier designs described by Murmann and Boser [1], Keane et al. [2], and Sun [3]. With calibration, the signal-to-noise-and-distortion ratio (SNDR) and spurious-free dynamic range (SFDR) are improved from 32.74 dB and 43.61 dB to 70.54 dB and 89.8 dB, respectively. The proposed calibration technique has the advantages of simple implementation, arbitrary pseudo-random dither amplitudes, and no restriction on the ADC input signal.
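The abstract describes the scheme only at a high level. As a rough, non-authoritative illustration of the underlying idea, the Python sketch below models a single pipeline stage whose residue amplifier has both a gain error and a third-order nonlinearity, injects two zero-mean pseudo-random dithers of different (assumed) amplitudes alternately at the MDAC, and runs LMS-style correlation loops on the backend data to adapt a two-term digital inverse (a linear and a cubic coefficient). All numerical values here (gains, dither amplitudes, step sizes, sample counts) are illustrative assumptions, not figures from the paper, and the backend ADC, thermal noise, and quantization are idealized away.

import numpy as np

rng = np.random.default_rng(1)

# ---- Behavioral model of the first pipeline stage (assumed values) ----
G_IDEAL, G_REAL = 4.0, 3.7      # nominal vs. actual interstage gain
ALPHA3 = -0.5                   # third-order residue-amplifier nonlinearity
A_SMALL, A_LARGE = 0.06, 0.14   # the two dither amplitudes (alternated)

def stage1(vin, dither):
    """Sub-ADC decision, dither injection at the MDAC summing node,
    and a nonlinear residue amplifier.  Backend ADC assumed ideal."""
    d1 = np.clip(np.round(vin / 0.25), -4, 4) * 0.25   # coarse code (DAC level)
    u = (vin - d1) + dither                            # residue plus injected dither
    return d1, G_REAL * (u + ALPHA3 * u**3)            # amplified (distorted) residue

def run(n, b1, b3, mu1, mu3):
    """Digitize n random samples; adapt (b1, b3) by correlating the digitally
    corrected residue with the known PN dithers.  mu1 = mu3 = 0 freezes them."""
    vin = rng.uniform(-0.95, 0.95, n)
    pn = rng.choice([-1.0, 1.0], n)                    # zero-mean PN sequence
    amp = np.where(np.arange(n) % 2 == 0, A_SMALL, A_LARGE)
    d1, dbe = stage1(vin, pn * amp)
    err = np.empty(n)
    for i in range(n):
        u_hat = b1 * dbe[i] + b3 * dbe[i] ** 3         # two-term inverse correction
        err[i] = d1[i] + (u_hat - pn[i] * amp[i]) - vin[i]   # error after dither removal
        e = pn[i] * u_hat - amp[i]                     # dither-correlation error
        b1 -= mu1 * e                                  # gain loop (uses both dithers)
        # Nonlinearity loop: a residual cubic error compresses the large dither
        # more than the small one, so the two normalized recoveries disagree.
        b3 -= mu3 * (e / amp[i]) * (1.0 if amp[i] == A_LARGE else -1.0)
    return b1, b3, float(np.sqrt(np.mean(err**2)))

b1, b3 = 1.0 / G_IDEAL, 0.0                            # start from the nominal gain
_, _, rms_raw = run(50_000, b1, b3, 0.0, 0.0)          # uncalibrated reference
b1, b3, _ = run(200_000, b1, b3, 1e-3, 1e-4)           # coarse LMS convergence
b1, b3, _ = run(800_000, b1, b3, 5e-5, 5e-6)           # fine LMS (gear-shifted steps)
_, _, rms_cal = run(50_000, b1, b3, 0.0, 0.0)          # calibrated, coefficients frozen

print(f"RMS conversion error: {rms_raw:.5f} (uncal) -> {rms_cal:.5f} (cal)")
print(f"b1 = {b1:.4f} (1/G = {1/G_REAL:.4f}),  b3 = {b3:.5f} (~{-ALPHA3/G_REAL**3:.5f})")

In this sketch the gain loop drives the correlation between the recovered residue and the active PN sequence toward the injected amplitude, while the nonlinearity loop compares how faithfully the two amplitudes are recovered; the step sizes are gear-shifted because the correlation estimates must average the uncorrelated input signal out over many samples.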

© 2022 by The Institute of Electronics, Information and Communication Engineers