Is a resolution of 0.001 UI good enough to measure clock jitter?

0.001 UI is -60 dB of peak-to-peak jitter relative to the fundamental clock frequency. UI stands for Unit Interval, the data clock period in telecom terms. For a 1/2 LSB error at 16 bits we need half of 1/65536 UI, i.e. about 7.6 x 10^-6 UI: that is the jitter level at which the timing error just reaches half a least significant bit. And we need that across the full audio band, so a jitter spectrum is more informative than a single cumulative figure.
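To make the arithmetic concrete, here is a minimal Python sketch (variable names are mine) that reproduces the two figures above: the -60 dB resolution of the instrument, the roughly 7.6 x 10^-6 UI needed for a 1/2 LSB error at 16 bits, and how far apart the two are in dB.

```python
import math

# Instrument resolution: 0.001 UI peak-to-peak.
resolution_ui = 0.001
print(f"instrument resolution: {20 * math.log10(resolution_ui):.0f} dB re 1 UI")
# -> -60 dB re 1 UI

# Target: jitter small enough to keep the timing error below 1/2 LSB
# at 16 bits, i.e. half of 1/65536 UI.
bits = 16
target_ui = 0.5 / 2**bits
print(f"1/2 LSB target: {target_ui:.2e} UI")
# -> ~7.63e-06 UI

# Shortfall of the 0.001 UI resolution against that target, in dB.
shortfall_db = 20 * math.log10(resolution_ui / target_ui)
print(f"resolution falls {shortfall_db:.0f} dB short of the target")
# -> ~42 dB
```

So under these assumptions an instrument resolving only 0.001 UI is about 42 dB too coarse to verify jitter at the half-LSB level, which is why a spectral view of the jitter is more useful than one cumulative number.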
 