Article ID: 16.20190218
This study presents the design and theoretical analysis of a clock jitter reduction circuit that exploits phase blending between uncorrelated clock edges, obtained by self-delaying the clock by integer multiples of the clock cycle, nT. By blending uncorrelated clock edges, the output edges approach the ideal timing, so timing jitter is reduced by a factor of √2 per stage. Realizing this entails three technical challenges: 1) generating uncorrelated clock edges, 2) phase averaging with a small time offset from the ideal center position, and 3) minimizing the deviation of the nT-delay from the ideal value. The proposed circuit addresses these challenges with an nT-delay, gated phase blending, and self-calibrated nT-delay elements, respectively. Measurements of a 180-nm CMOS prototype chip demonstrated an approximately four-fold reduction in timing jitter, from 30.2 ps to 8.8 ps, for a 500-MHz clock by cascading four stages of the proposed circuit. A theoretical analysis of the limit of jitter reduction is also presented.
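The √2-per-stage scaling follows from averaging two edges carrying equal, uncorrelated jitter: the variance of the midpoint is half the variance of either edge. The minimal Python sketch below (an illustration, not the paper's implementation; the i.i.d. Gaussian jitter model and all parameters are assumptions) checks this scaling by Monte Carlo across a four-stage cascade.

```python
import numpy as np

# Illustrative sketch (not from the paper): check the sqrt(2)-per-stage
# claim by Monte Carlo.  Each clock edge's timing error is modeled as
# i.i.d. zero-mean Gaussian noise; all parameters below are assumptions.
rng = np.random.default_rng(0)
sigma_in = 30.2e-12            # assumed input RMS jitter [s]
n_edges = 1_000_000

jitter = rng.normal(0.0, sigma_in, n_edges)
shift = 1                      # delay (in cycles) used by the first stage
for stage in range(1, 5):      # cascade four blending stages
    # Phase blending: each output edge is the midpoint of the current
    # edge and its nT-delayed copy.  Doubling the delay per stage keeps
    # the two blended samples uncorrelated in this simplified model.
    jitter = 0.5 * (jitter + np.roll(jitter, shift))
    shift *= 2
    print(f"stage {stage}: RMS jitter = {jitter.std() * 1e12:5.2f} ps")

# Each stage cuts RMS jitter by ~sqrt(2), i.e. ~7.6 ps after four stages
# under these idealized assumptions (no blending offset, exact nT delay).
```

Under these idealized assumptions the four-stage result is about 7.6 ps, slightly below the measured 8.8 ps; the gap is consistent with the non-idealities the paper targets (blending offset and nT-delay error).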