Understood. That was part of my point. The other part I was talking about earlier is that it might depend a lot on the particular clock. Poor clocks are likely to jitter around and change more over time. Really, really good clocks are unlikely to change much no matter how long the delay, because their jitter is always very low.
In addition, I didn't mention some other factors that might come into the equation. For example: (1) DACs add their own jitter to any clock signal sent to them, and (2) many ADCs use internal PLLs for oversampling, which may add their own jitter to any crystal reference clock fed into the ADC.
Moreover, part of the idea with oversampling DACs is that a few bits, maybe 5 bits, oversampled at a high enough frequency will average out to the equivalent of maybe 24 bits or so at a lower clock frequency. In that situation, the conversion to 24 bits takes place over multiple clock periods, so the effective clock period may correspond to a lower MCLK frequency. With every halving of MCLK frequency (doubling of period), in theory the jitter (phase noise, actually) is reduced by 6 dB. Thus the effective jitter for an analog signal coming out of the DAC may be considerably less than the actual crystal MCLK jitter. However, for a one-bit DAC such as Marcel's RTZ DAC, the doubled BCLK jitter acts without the averaging effect of an oversampling multi-bit DAC (although it is a FIRDAC, which can help some). So, not only may the effect of a delay depend on the clock quality, it may also depend on the specific DAC architecture. Similar considerations apply to an ADC.
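To put the 6 dB-per-octave point in rough numbers, here is a minimal sketch (my own illustration, not from the discussion above), assuming a fixed 1 ps rms timing jitter and a 24.576 MHz MCLK successively divided by two. It also prints the standard jitter-limited SNR for a 10 kHz full-scale sine, which depends on the signal frequency rather than the clock frequency; all the specific numbers are just illustrative.

```python
import math

# Illustration of the "6 dB per octave" claim: for a fixed rms timing jitter,
# the equivalent phase jitter in radians scales with the clock frequency, so
# each halving of the clock halves the phase jitter, i.e. about 6 dB less.

T_JITTER_RMS = 1e-12  # assumed 1 ps rms timing jitter (illustrative value)

def phase_jitter_db(f_clk_hz: float, t_jitter_s: float) -> float:
    """Integrated phase jitter expressed in dB relative to 1 rad rms."""
    phi_rms = 2 * math.pi * f_clk_hz * t_jitter_s  # radians rms
    return 20 * math.log10(phi_rms)

for f_clk in (24.576e6, 12.288e6, 6.144e6, 3.072e6):  # MCLK and divided clocks
    print(f"{f_clk/1e6:7.3f} MHz clock -> {phase_jitter_db(f_clk, T_JITTER_RMS):6.1f} dB re 1 rad")

# Related but distinct: jitter-limited SNR for a full-scale sine at the
# converter output, SNR = -20*log10(2*pi*f_in*sigma_t). Note it depends on
# the signal frequency f_in, not the clock frequency.
f_in = 10e3  # 10 kHz test tone
snr_db = -20 * math.log10(2 * math.pi * f_in * T_JITTER_RMS)
print(f"Jitter-limited SNR for a {f_in/1e3:.0f} kHz full-scale sine: {snr_db:.1f} dB")
```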
That's why I said the two error jitter model I described was simplistic; it was only to help develop some intuition as to causality.
Seems to me the obvious answer is probably to use two independent clocks on different low-noise power supplies (low noise so that things like AC line harmonics don't cause unwanted correlation between the clocks).
Mark
Jitter (phase noise) as a broad topic is not something I intend to explore here now.
Only a tiny bit of it:
The possible altering of the jitter cancellation when loopback-testing a soundcard by adding a time delay on the analog DAC-out/ADC-in signal.
On a specific implementation frame:
Just one sound card with its DAC and ADC (Kosta's V2).
A few RG58 cables of various lengths connecting the analog DAC-out/ADC-in (a rough delay estimate is sketched below).
REW's J-test at 44k1/16bit
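As a rough aside on those cable lengths (my numbers, not George's): the propagation delay of realistic RG58 runs is tiny compared with a 44.1 kHz sample period, but reaches a few MCLK periods at around 100 m, which is the scale that might start to matter for decorrelating the DAC and ADC clock jitter. A minimal sketch, assuming a ~0.66 velocity factor for RG58 and a 256*fs MCLK (both assumptions, not stated in the post):

```python
# Back-of-the-envelope check: one-way delay of various RG58 lengths compared
# with the sample period and an assumed 256*fs MCLK period at 44.1 kHz.
C = 299_792_458.0          # speed of light, m/s
VELOCITY_FACTOR = 0.66     # typical figure for solid-PE RG58 coax (assumed)
FS = 44_100.0              # J-test sample rate, Hz
MCLK = 256 * FS            # assumed 256*fs master clock (11.2896 MHz)

for length_m in (1, 10, 50, 100):               # illustrative cable lengths
    delay_s = length_m / (C * VELOCITY_FACTOR)  # one-way propagation delay
    print(f"{length_m:4d} m RG58: {delay_s*1e9:7.1f} ns "
          f"= {delay_s*FS:9.6f} sample periods "
          f"= {delay_s*MCLK:7.3f} MCLK periods")
```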
If something comes up from those few tests, I'll report here.
George