Well... don't you think they have a point (even if they don't say exactly how they achieve it)? Is jitter really not relevant in the digital chain?
Jitter is very relevant, but any competent gear will already have taken steps to reduce it. Their aim seems to be to take what may be a small problem, and make it look like a big problem by pretending that everyone else is ignoring it.
For example, they claim that PLL in receivers does not reduce jitter and may increase it. PLL reduces the most likely jitter (high freq, caused by reflections etc.) and leaves untouched the low frequency jitter - for that you have to rely on the clock in your source. There may be a small region between these where jitter is increased by a small amount - this can be a problem with any servo loop. They make this sound like a crisis. So, like any good salesman or politician, nothing they say is actually a lie but it leaves the misleading impression they want to make.
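To make that concrete, here is a minimal numerical sketch (my own illustration, with an invented loop bandwidth and damping factor, not any particular receiver chip) of a second-order PLL's jitter transfer function: high-frequency input jitter is attenuated, low-frequency jitter passes straight through, and there is a small peaking region in between where jitter is amplified slightly.

```python
# A minimal sketch of a second-order PLL jitter transfer function,
# H(s) = (2*zeta*wn*s + wn^2) / (s^2 + 2*zeta*wn*s + wn^2).
# Loop bandwidth and damping are invented for illustration.
import numpy as np

fn = 1e3                      # assumed loop natural frequency, Hz
zeta = 0.5                    # assumed damping factor
wn = 2 * np.pi * fn

f = np.logspace(0, 6, 601)    # jitter frequencies from 1 Hz to 1 MHz
s = 1j * 2 * np.pi * f
H = (2 * zeta * wn * s + wn**2) / (s**2 + 2 * zeta * wn * s + wn**2)
gain_db = 20 * np.log10(np.abs(H))

i10 = np.argmin(np.abs(f - 10))
i100k = np.argmin(np.abs(f - 1e5))
print(f"10 Hz jitter gain:   {gain_db[i10]:6.2f} dB  (LF passes through untouched)")
print(f"100 kHz jitter gain: {gain_db[i100k]:6.2f} dB  (HF strongly attenuated)")
print(f"peak gain: {gain_db.max():.2f} dB at {f[gain_db.argmax()]:.0f} Hz "
      "(the small region of amplification)")
```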
You're taking the concept to an illogical extreme. That's not what I meant by "slaving the media source to the DAC." For one thing, the CD is dead as a media source in this day and age, and thus random access media is increasingly common. All of my listening involves a computer between the media source and DAC, and thus the computer is in full control of the media transport. This is true even when I put an audio CD in the tray.

That is an urban legend. If you try to rigidly "lock" the optical media player to the DAC crystal, how can you adjust for the laser pickup irregularities? The spindle motor will not spin perfectly all the time, tracking will not be perfect all the time, and all of these induce small variations in instantaneous rotational speed and therefore jitter in the data stream. Unless the transport has some kind of buffer memory and a mechanism to use it (like an IDE optical drive that might have a 2MB cache), you cannot directly sync the pickup mechanism with a fixed external frequency. Not without a PLL loop. All the chipsets used to drive optical transports have that PLL loop integrated. So just slaving the transport to the external DAC alone will NOT give better results than an internal crystal - unless the optical drive itself has cache memory (I have only one DVD player that uses an IDE drive - the DSL 710A).
So the main issue is the buffer memory and how big it needs to be. Slaving alone does not solve the jitter problem.
CD was designed as a 1x rate media source, but I did not intend to imply that it should be literally slaved to the DAC clock. That would be a severely limited design. A better approach would be to run the CD transport at a minimum of 2x, with a buffer in between so that the transport can retry on any read errors for better quality. A computer can run the CD at extreme speed multiples and retry until error free reads result.
But you are also mistaken about buffer memory being a cure-all. Buffer memory only works under all conditions when the DAC is master, and with a communication channel back to the source for flow control. If you try to push data into a buffer memory with the media as the master clock, then you will still run into cases where samples are lost or repeated. Sure, you can buffer the entire CD to avoid this, but that involves much more latency than is necessary.
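As a toy illustration of that failure mode (my own sketch, with an invented buffer size and clock offset), here is what happens when the media side pushes samples into a fixed FIFO with no flow control and its clock runs 100 ppm fast relative to the DAC: once the buffer fills, samples have to be dropped, and no finite buffer size fixes it, it only delays the moment.

```python
# A toy model: media transport is the clock master, pushing into a fixed
# FIFO with no flow control back to the source. Buffer size and the
# 100 ppm clock offset are invented for illustration.
from collections import deque

fs = 44100                   # nominal sample rate
ppm = 100                    # source clock runs 100 ppm fast vs the DAC
fifo = deque(maxlen=256)     # small hardware-style FIFO
dropped = repeated = 0
acc = 0.0

for n in range(fs * 120):    # two minutes of playback, one tick per DAC sample
    acc += 1 + ppm * 1e-6    # fractional samples produced by the fast source
    while acc >= 1.0:
        if len(fifo) == fifo.maxlen:
            dropped += 1     # overflow: appending discards the oldest sample
        fifo.append(n)
        acc -= 1.0
    if fifo:
        fifo.popleft()       # DAC consumes exactly one sample per tick
    else:
        repeated += 1        # underflow: the DAC would repeat the last sample

print(f"dropped={dropped} repeated={repeated}")
# A slow source gives repeats instead; only flow control (DAC as master,
# asking for data as needed) keeps both counts at zero indefinitely.
```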
Agreed. Quantization noise is not directly tied to signal-to-noise ratio in all situations. Also, correlated quantization noise is far more audible than uncorrelated quantization noise. Sorry for my slackness in terminology.

It's also misleading to characterise a digital audio signal chain in terms of 'resolution' (though it is indeed very popular to do so).
Interesting point. The distortions would certainly be multiples of the new clock frequency, wouldn't they? ... even if they are not multiples of the original signal frequencies. But any frequency modulation results in sum and difference frequencies which could be harmonic or inharmonic, although jitter would imply that the modulation frequency is not steady, and thus the modulation is not constant.

That they were not present in the original bits is undisputed. However, to continue using the term 'harmonics' when by no stretch of the imagination could they be at integer multiples of the original frequency is just propagating misleading terminology.
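A quick numerical check supports that point (an illustrative sketch with invented parameters): sampling a 10 kHz tone with sinusoidal 1 kHz clock jitter puts spurs at 9 kHz and 11 kHz, the sum and difference frequencies, while nothing appears at the 20 kHz harmonic location.

```python
# Illustrative only: a 10 kHz tone sampled with 2 ns of sinusoidal 1 kHz
# clock jitter. The spurs land at 9 and 11 kHz (sum/difference), not at
# the 20 kHz harmonic location. All parameters invented.
import numpy as np

fs, N = 44100, 44100                   # one second -> 1 Hz per FFT bin
f0, fj, tj = 10000.0, 1000.0, 2e-9
n = np.arange(N)
t = n / fs + tj * np.sin(2 * np.pi * fj * n / fs)   # jittered sample instants
x = np.sin(2 * np.pi * f0 * t)

spec = 20 * np.log10(np.abs(np.fft.rfft(x * np.hanning(N))) + 1e-30)
spec -= spec.max()                     # dB relative to the tone itself
for f in (9000, 10000, 11000, 20000):  # 20 kHz is where a harmonic would sit
    print(f"{f:5d} Hz: {spec[f]:7.1f} dBc")
```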
How about 'unwanted additional frequencies'?

Works for me. I would also accept 'noise' as an accurate descriptive term.
Haven't you read the white paper showing the mathematical proof that 1-bit sampling cannot be properly dithered? There must be more than two states, or at least one of the states must be zero (not +1 and -1 as in single-bit), or else the dithering adds massive noise because there is no such thing as a steady state (DC) in a system with only +1 and -1.

So then, according to you, an SAA7350/TDA1547 DAC has only one bit of resolution?
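For reference, here is a small simulation in the spirit of the Lipshitz/Vanderkooy argument (my own sketch, all parameters invented): ±1 LSB TPDF dither whitens the error of a multibit mid-tread quantizer, which has a zero state, whereas a +1/-1 two-level converter has no headroom for such dither because its step is the entire output span.

```python
# Sketch only: +/-1 LSB TPDF dither decorrelates the error of a 16-bit
# mid-tread quantizer. For a two-level (+1/-1) converter the "LSB" is the
# whole output range, so dither of the required size swamps the signal.
import numpy as np

rng = np.random.default_rng(0)
N = 65536
lsb = 2.0 / 2**16                         # 16-bit step across a +/-1 range
k = 1366                                  # integer cycle count -> no FFT leakage
x = 1.5 * lsb * np.sin(2 * np.pi * k * np.arange(N) / N)   # tiny test tone

def quantize(sig):                        # mid-tread uniform quantizer (has a zero state)
    return np.round(sig / lsb) * lsb

tpdf = (rng.random(N) - rng.random(N)) * lsb   # triangular PDF, +/-1 LSB
for name, y in (("undithered", quantize(x)), ("TPDF dithered", quantize(x + tpdf))):
    err_spec = np.abs(np.fft.rfft(y - x))
    # correlated error -> discrete distortion lines; dithered error -> flat noise
    print(f"{name:14s} peak/mean of error spectrum: {err_spec.max() / err_spec.mean():8.1f}")
```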
As far as I can see, the only difference is the dither - with bitstream the dither is out of band whereas with multibit, it's in-band.

Multibit converters are equally capable of using out of band dither. It's done all the time. Single-bit delta-sigma just produces an incredibly greater amount of noise.
Hello
When using an SPDIF receiver and an R2R DAC chip, is it worth it to reclock the WS, BCK, and DATA lines after the digital filter?
Thanks
Bye
Gaetan
Is jitter really not relevant in the digital chain?
It is indeed not relevant in the digital chain. It only becomes relevant at the digital-analog gateway. Elsewhere the timing of the data is unimportant so long as it gets transferred correctly. Admittedly, really gross jitter may affect synchronisation in transmission and hence corrupt data, but that's the only way jitter becomes relevant before the D/A.
If you only mean to burst a tiny fraction of a second of audio, then that's exactly how computer audio works everywhere (as opposed to CD player and SPDIF standards).
The majority of USB audio is implemented in isochronous mode. A small buffer is used in isochronous mode, but it is a FIFO. The information is not delivered in bursts, although the rate at which it is delivered may fluctuate. The clock in the DAC must sync with the transmit clock which is controlling the overall bitrate. In this sense, adaptive mode USB is indistinguishable from SPDIF.
I'm talking about a true bursty system where the buffer filling is controlled by the receiver and the download bitrate greatly exceeds the playback bitrate, which I understand is the case with asynch USB, although I haven't dug into it that deeply. If it doesn't break the link between tx and rx clocks though, what's the point?
Yes there is latency involved, but there is latency with adaptive mode USB, and as data rates increase, larger buffers become acceptable.
Jitter is just as important as the difference between 16-bit and 24-bit. Of course, if your DAC is 16-bit to start with, then jitter could reduce its effective quantization performance far below 16-bit accuracy. To say that jitter doesn't really make any difference is almost exactly the same as saying it doesn't matter whether you use a 12-bit DAC or 24-bit DAC.
There is no doubt that jitter can make a difference. The question is: how much jitter is audible?
You have to consider whether the people recommending vanishingly small levels of jitter have a vested interest, and the fact that most existing systems already perform better than the 10-20 ns jitter which has been shown to be audible. I'm not saying smaller levels aren't audible, I'm just saying that smaller levels haven't been demonstrated to be audible.
There's engineering and there's over-engineering. Ask any aircraft manufacturer. To say something is over-engineered is not a compliment in my book.
w
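For scale, the standard worst-case arithmetic behind the jitter-vs-bits comparison above takes only a few lines (textbook formula, numbers mine): for a full-scale sine at frequency f the slew rate is at most 2πfA, so a timing error t_j moves a sample by up to 2πfA·t_j, and keeping that under one LSB (2A/2^bits) requires t_j < 1/(π·f·2^bits).

```python
# Worst-case jitter budget to preserve one-LSB accuracy on a full-scale
# sine at the top of the audio band. Textbook bound, illustrative numbers.
import math

f = 20e3                                  # worst case: highest audio frequency
for bits in (12, 16, 24):
    tj = 1.0 / (math.pi * f * 2**bits)
    print(f"{bits}-bit accuracy at 20 kHz needs jitter < {tj * 1e12:8.1f} ps")
# ~3.9 ns, ~240 ps and ~1 ps respectively; by this yardstick the 10-20 ns
# audibility figure quoted above corresponds to only ~10-11 bits at 20 kHz.
```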
Haven't you read the white paper showing the mathematical proof that 1-bit sampling cannot be properly dithered?
Yes, I did read it at the time it came out - Lipshitz and Vanderkooy. I'm guessing Sony (perhaps Peter Eastty), who were the inventors of DSD, have a rebuttal depending on the precise meaning of 'properly' 😛
Yes - a quick check on Wikipedia shows that James Angus has written in reply (he was a co-creator with Peter Eastty). As far as I'm aware, Sony don't try to process the DSD signal in one-bit form; that's only used for the distribution format.
When using an SPDIF receiver and an R2R DAC chip, is it worth it to reclock the WS, BCK, and DATA lines after the digital filter?
Taking the PCM1702 as an example (because I'm currently dreaming up a design with this chip - it's nice and cheap), the only signal worth reclocking is the BCK. The update of data inside the chip is synchronised with this signal. Whether any signal is worth reclocking depends on the jitter from your SPDIF receiver. Plenty of receivers I've worked with introduce a lot more jitter than a digital filter chip, so I think there's no point unless a secondary crystal-based PLL is used beyond the PLL in the receiver chip itself.
Multibit converters are equally capable of using out of band dither. It's done all the time.
I wasn't thinking of the converters, just whether the dither was in-band encoded in the datastream. Not much room for out of band dither in a 44k1 PCM datastream, for example 😛
Single-bit delta-sigma just produces an incredibly greater amount of noise.
Tell me about it 😱
It is indeed not relevant in the digital chain. It only becomes relevant at the digital-analog gateway.

I guess I need to specify that in my previous statement the digital CHAIN includes the DAC? Of course you have the effects translated into the analog domain at the end of the chain, but the whole chain contributes to jitter creation.
The majority of USB audio is implemented in isochronous mode. A small buffer is used in isochronous mode, but it is a FIFO. The information is not delivered in bursts.
Waki, I suggest you update your understanding of USB with this posting: Computer Audio Asylum - USB audio spec and jitter - John Swenson, November 11, 2005 at 14:51:46
There's a 1kHz frame rate, so yes, the data is delivered in bursts. And it's generally considerably worse than SPDIF as regards jitter, as the PLL needs to multiply up by a larger factor.
Ah, well, that really depends on your understanding of bursty download. I am in fact aware that the data is in 1kHz frames and I considered that before I wrote the above.
It is only in synchronous mode, however, that the clock is derived from the 1kHz frame rate. I am talking about adaptive mode, which is generally recognised to be superior to synchronous mode, and which derives its clock from the bitrate. I quote from the article you referenced:
'A control circuit (either hardware or firmware running on an embedded processor) measures the average rate of the DATA coming over the bus and adjusts the clock to match that.'
USB can in fact exceed the performance of SPDIF, see this article on 6 moons: 6moons audio reviews: Wavelength Audio Brick USB DAC
Thanks for the pointer to the Swenson article though, it's always useful to have a second reference and his text is refreshingly clear.
w
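For what it's worth, the control circuit described in that quote can be sketched in a few lines (a toy model with invented loop constants, not real USB silicon): estimate the average number of samples arriving per 1 ms frame, then gently trim the local DAC clock toward that measured rate. It also shows why adaptive mode never fully breaks the tx/rx clock link.

```python
# Toy model of an adaptive-mode clock recovery loop. Loop constants and
# the starting clock offset are invented for illustration.
import random

random.seed(1)
rate_est = 44100.0          # running estimate of the host's data rate, Hz
dac_fs = 44090.0            # local oscillator, deliberately started 10 Hz off
alpha, gain = 0.01, 2.0     # smoothing and clock-trim constants (invented)

for frame in range(5000):   # 5 seconds of 1 kHz USB frames
    # 44100 Hz = 44.1 samples/frame, so the host sends 44 or (sometimes) 45
    samples = 45 if random.random() < 0.1 else 44
    rate_est += alpha * (samples * 1000.0 - rate_est)
    dac_fs += gain * (rate_est - dac_fs) / 1000.0   # trim toward measured rate

print(f"measured rate ~{rate_est:.0f} Hz, trimmed DAC clock ~{dac_fs:.1f} Hz")
# The local clock ends up tracking the host's average rate, so the tx and
# rx clocks remain linked, just through a slow control loop.
```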
USB can in fact exceed the performance of SPDIF, see this article on 6 moons: 6moons audio reviews: Wavelength Audio Brick USB DAC
I'm surprised that you would read that article and quote a 'conclusion' from it with a straight face 😉 Gordon Rankin is selling USB DACs. I haven't found 6moons to be a reliable source of technical information, though it is sometimes interesting for reviews.
Just as one example from that article - it claims:
...it is possible for the order of packets to be disrupted by other USB packets.
Putting another packet between audio packets does nothing whatsoever to disrupt the order of audio packets. Someone's spreading FUD 😀
...it is possible for the order of packets to be disrupted by other USB packets.
Putting another packet between audio packets does nothing whatsoever to disrupt the order of audio packets. Someone's spreading FUD 😀
How about if one of the packets is corrupted? Then it will be retransmitted at a later date, no?
Not sure that USB audio has the facility to do packet checking and re-transmission. Certainly that's possible for bulk transfers, but with audio the time delays involved are such that getting a re-transmission in time to avoid drop-outs is highly unlikely. Anyone else able to confirm my suspicions?