IEICE Electronics Express
Online ISSN : 1349-2543
ISSN-L : 1349-2543
LETTER
Design and theoretical analysis of a clock jitter reduction circuit using gated phase blending between self-delayed clock edges
Kiichi Niitsu, Osamu Kobayashi, Takahiro J. Yamaguchi, Haruo Kobayashi

2019 Volume 16 Issue 13 Pages 20190218

Abstract

This study presents the design and theoretical analysis of a clock jitter reduction circuit that exploits phase blending between uncorrelated clock edges obtained by self-delaying the clock by a multiple of the clock cycle, nT. By blending uncorrelated clock edges, the output clock edges approach the ideal timing, and the timing jitter is therefore reduced by a factor of √2 per stage. Realizing this involves three technical challenges: 1) generating uncorrelated clock edges, 2) phase averaging with a small time offset from the ideal center position, and 3) minimizing the deviation of the nT-delay from the ideal nT. The proposed circuit addresses these challenges by exploiting an nT-delay, gated phase blending, and self-calibrated nT-delay elements, respectively. Measurement results from a 180-nm CMOS prototype chip demonstrated an approximately four-fold reduction in timing jitter, from 30.2 ps to 8.8 ps, for a 500-MHz clock by cascading four stages of the proposed circuit. A theoretical analysis evaluating the limit of jitter reduction is also presented.
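
The √2-per-stage scaling can be checked with a minimal numerical sketch (not the authors' circuit model). It assumes white, zero-mean Gaussian jitter on each clock edge, ideal blending with no center offset, and an illustrative choice of stage delays (1T, 2T, 4T, 8T) so that all edges entering each blender remain mutually uncorrelated.

    # Minimal sketch: blending each edge with a self-delayed copy of the clock
    # averages two uncorrelated timing errors, halving the jitter variance
    # (a sqrt(2) reduction in rms jitter) per stage.
    import numpy as np

    rng = np.random.default_rng(0)
    sigma_in = 30.2e-12        # input rms jitter (value quoted in the abstract)
    num_edges = 1_000_000

    # Edge timing errors relative to the ideal clock grid (white phase jitter).
    jitter = rng.normal(0.0, sigma_in, num_edges)

    def blend_with_delayed(j, n):
        """Average each edge's timing error with that of the edge n cycles earlier."""
        return 0.5 * (j[n:] + j[:-n])

    stage_delays = [1, 2, 4, 8]   # stage delays in clock cycles (illustrative choice)
    out = jitter
    for n in stage_delays:
        out = blend_with_delayed(out, n)

    print(f"input jitter : {jitter.std()*1e12:6.2f} ps rms")
    print(f"output jitter: {out.std()*1e12:6.2f} ps rms")   # ~ sigma_in / 4

With four cascaded stages the simulated reduction approaches (√2)^4 = 4, consistent in order of magnitude with the measured improvement from 30.2 ps to 8.8 ps; the gap between the ideal factor and the measurement is the subject of the paper's analysis of the jitter reduction limit.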

© 2019 by The Institute of Electronics, Information and Communication Engineers