IEICE Electronics Express
Online ISSN : 1349-2543
ISSN-L : 1349-2543


Design and Theoretical Analysis of a Clock Jitter Reduction Circuit Using Gated Phase Blending Between Self-Delayed Clock Edges
Kiichi Niitsu, Osamu Kobayashi, Takahiro J. Yamaguchi, Haruo Kobayashi
Advance online publication

Article ID: 16.20190218

Abstract

This study presents the design and theoretical analysis of a clock jitter reduction circuit that exploits phase blending between uncorrelated clock edges, obtained by self-delaying the clock by multiples of the clock period, nT. Because the blended edges are uncorrelated, the output edge approaches the ideal timing, and the timing jitter is reduced by a factor of √2 per stage. Three technical challenges must be overcome to realize this: 1) generating uncorrelated clock edges, 2) phase averaging with a small time offset from the ideal center position, and 3) minimizing the deviation of the nT-delay from the ideal nT. The proposed circuit addresses these challenges by exploiting an nT-delay, gated phase blending, and self-calibrated nT-delay elements, respectively. Measurements of a 180-nm CMOS prototype chip demonstrate an approximately four-fold reduction in timing jitter, from 30.2 ps to 8.8 ps, for a 500-MHz clock obtained by cascading four stages of the proposed circuit. A theoretical analysis of the limit of jitter reduction is also presented.
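To make the √2-per-stage claim concrete, the following minimal Monte Carlo sketch models each clock edge as carrying independent Gaussian timing jitter and idealizes gated phase blending as taking the midpoint of an edge and its self-delayed copy. This is an illustrative idealization, not the authors' circuit model: the white-jitter assumption, the sample count, and the choice of doubling the delay at each stage (so that blended edges stay mutually uncorrelated) are assumptions of the sketch, while the 30.2-ps input jitter and the four-stage cascade are taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

sigma_in_ps = 30.2      # input RMS jitter reported in the paper
n_edges = 1_000_000     # hypothetical number of simulated clock edges

# White-jitter model: every edge carries an independent Gaussian timing error.
jitter = rng.normal(0.0, sigma_in_ps, n_edges)

for stage in range(4):  # the prototype cascades four stages
    # Delay (in clock cycles) doubles per stage so every blended pair of
    # edges remains uncorrelated -- an assumption of this sketch, not the
    # paper's exact nT-delay choice.
    shift = 2 ** stage
    # Gated phase blending idealized as the midpoint of an edge and its
    # delayed copy: averaging two uncorrelated samples divides RMS by sqrt(2).
    jitter = 0.5 * (jitter + np.roll(jitter, shift))
    print(f"stage {stage + 1}: RMS jitter = {jitter.std():.2f} ps")

# Ideal four-stage output: 30.2 / (sqrt(2) ** 4) ~= 7.55 ps. The measured
# 8.8 ps lies slightly above this ideal, consistent with the nonidealities
# (blending time offset and nT-delay error) analyzed in the paper.
```

Running the sketch prints RMS values near 21.4, 15.1, 10.7, and 7.55 ps, tracing the ideal √2 reduction per stage that the measured four-fold improvement approaches.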

© 2019 by The Institute of Electronics, Information and Communication Engineers