CD players that have synchronized left and right channels

I was wondering how many CD players actually have synchronized left and right channels from the point the data is read through to D/A conversion. I suspect that most players interleave the left and right channels during processing, causing a slight timing difference between the channels, but I'm not really sure.
 
That was true in 1985, but I suspect not these days, at least with CD players having any pretensions to quality. I haven't surveyed the market, but the three CD players I have in-house ($200-700) all have separate DACs for left and right. One of them goes even further and has two separate DACs for each channel.
 
It would be very unlikely that the channels would be synchronised in this manner, even in a modern DAC.

In the very early CD players (think CDP-101) the DAC was a pretty expensive component, and it was not uncommon to use a single DAC and a switched-capacitor sample-and-hold to multiplex the DAC output between the two channels. This led to some hand-wringing about the 11 µs delay between channels. The question is whether this time offset still exists.

The answer will depend on the design. Pretty much any design using discrete DACs for each channel will retain the delay - simply because the word clock is derived from the L/R bit of the input data stream - and all other internal timing will be dependent upon that. A stereo DAC will depend upon the internal architecture. However it seems very unlikely that any designer would bother to add the needed buffer. Simply because it does not matter. At all.

I am reminded of a design I saw a year ago from a very sincere enthusiast that attempted to place a suitable shift register in a dual-DAC design (needless to say a NOS design). Some quite reasonable digital design, and no attempt at all to manage the clocks. There were more opportunities for introducing noise on the clock than you could imagine - and yet the designer, and another enthusiast who had built the design, stood by its superiority. Since it sounded different to other DACs, it clearly must have been better. There was a fundamental misunderstanding of the issues in the design - with a very vocal argument that the delay between the channels was equivalent to a 44.1 kHz modulation of the position of one speaker. No amount of reasoning could convince them that a simple inter-channel delay is the equivalent of a 3.8mm static shift in the position of one speaker.
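
For anyone who wants to check the arithmetic, the half-sample delay and its acoustic equivalent work out roughly like this (a quick sketch; 343 m/s for the speed of sound is just the usual room-temperature figure):

Code:
fs = 44_100                           # CD sample rate, Hz
delay = 1 / (2 * fs)                  # half a sample period between L and R, seconds
speed_of_sound = 343                  # m/s, room temperature
print(delay * 1e6)                    # ~11.3 microseconds
print(delay * speed_of_sound * 1000)  # ~3.9 mm - the static speaker shift mentioned above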
 
Francis_Vaughan said:
Pretty much any design using discrete DACs for each channel will retain the delay - simply because the word clock is derived from the L/R bit of the input data stream - and all other internal timing will be dependent upon that.
Well, I've built a few and I own a few DACs and none of them have the delay. Come to think of it, NOS aside, I doubt there has been a CDP/DAC built with the delay since the late '80s. A DAC for each channel pretty much guarantees there will be no delay.
 
rfbrw said:
Well, I've built a few and I own a few DACs and none of them have the delay. Come to think of it, NOS aside, I doubt there has been a CDP/DAC built with the delay since the late '80s. A DAC for each channel pretty much guarantees there will be no delay.
I've been playing around with some sound cards, and it seems that if I do a loopback recording, some have a delay that matches the sample-rate timing, which is why the question was raised. While I'm sure this small difference may not be audible in most systems, as speakers become better I would expect better image focus.

If a DAC per channel guarantees no delay, then with proper sample-and-hold of the data the timing would be perfect. Hmm.
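
One way to check a capture like this is to cross-correlate the two channels and see where the peak lands - a rough sketch below, with a synthetic one-sample shift standing in for a real loopback file:

Code:
import numpy as np

fs = 44_100
rng = np.random.default_rng(0)
left = rng.standard_normal(fs // 10)     # 0.1 s of noise standing in for the capture
right = np.roll(left, 1)                 # right lags left by exactly one sample

corr = np.correlate(right, left, mode="full")
lag = np.argmax(corr) - (len(left) - 1)  # positive lag: right lags left
print(f"measured offset: {lag} samples = {lag / fs * 1e6:.1f} microseconds")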
 
I think my point is a bit different - let me explain (I hope.)

Looking at modern DACs, one has a significant amount of logic. Have a look at a high-end DAC system - the PCM1704 and DF1706 - and one sees a three-chip solution. The DF1706 has separate left and right outputs, and the timing diagram shows them as simultaneously clocked out. A pair of PCM1704s suck on their individual streams. Seems proven - both channels in sync. But look back inside the DF1706 - in the gap between the I2S (or other bit-serial) input and the outputs. The outputs are 8 times over-sampled. What is there to say that there is still not a time offset between them?

The input interleaves left and right, and a separate L/R signal provides the word clock. What will a designer do? Faced with the choice of either sending the first valid word to one interpolation filter, then clocking in the next word and sending it to the other, letting the two filters cheerfully clock away - or carefully adding an extra latch, clocking both words in, and only then dispatching them to the two filters - what did he choose? It won't matter: post-reset the filter pipelines will be zeroed, and a few more zeroes fed down one side won't make any operational difference.
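
To make the two choices concrete, a toy sketch (my own model, not the DF1706's internals) - time counted in half-sample ticks, so the L word of a frame lands on an even tick and its R partner on the next odd tick:

Code:
interleaved = ["L0", "R0", "L1", "R1", "L2", "R2"]

# Choice 1: latch the L word, wait for R, then dispatch both on the same edge.
latched = [(interleaved[i], interleaved[i + 1])
           for i in range(0, len(interleaved), 2)]

# Choice 2: forward each word to its filter on the tick it arrives, so every R
# word enters its filter half a base sample period after the matching L word.
immediate = [(tick, word) for tick, word in enumerate(interleaved)]

print("latched  :", latched)    # L and R of each frame share a clock edge
print("immediate:", immediate)  # R0 enters at tick 1, half a sample after L0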

Well who knows? And more to the point, no one should really care. It might simply depend upon what mood the designer was in when he got up that morning. It has no audible effect.

A telling thing about the DF1706 anyway - nowhere does the specification state whether it is the left or right channel that is received first. There are a few possible implementations left open here - involving either synchronised or offset outputs. But the chip specification is silent on which one it is.

It is quite possible that the chip does indeed synchronise the channels - but it does not specify that it does, and it should not matter one way or the other.
 
Strewth, as you say down under. Only an academic would overthink something so simple. It matters not a jot what the filter does; all that matters is the DAC. In the case of two mono DACs, the serial input register is loaded by the bit clock, which can be a burst or continuous, and on the falling edge of the word clock, or latch enable, or whatever you choose to call it - a clock signal common to both DACs - serial-to-parallel conversion takes place, initiating conversion.
The only difference with a 2-channel DAC is that the input register now has to accommodate two channels; after enough clock cycles to load both channels there is again a transition of the word clock, and conversion takes place.
I would have thought it was all obvious from the timing diagrams.
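
In rough code terms, the load-then-latch step looks something like this toy model (mine, not from any datasheet; 4-bit words just for readability):

Code:
def shift_in(register, bits):
    # Shift serial data into the input register, one bit per bit-clock cycle.
    for bit in bits:
        register = register[1:] + [bit]
    return register

def latch(register, width=4):
    # Word-clock edge: both channel words leave the register on the same edge,
    # so both channels start converting together - no inter-channel delay.
    return register[:width], register[width:]

reg = shift_in([0] * 8, [1, 0, 1, 1,   0, 1, 1, 0])   # L word then R word
print(latch(reg))   # -> ([1, 0, 1, 1], [0, 1, 1, 0]) latched on one edge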
 
It is obvious from the timing diagrams - obvious that they make no claim one way or the other about the existence of an offset between channels due to the interleaving of the channels in the I2S stream.

The PCM1704 DACs run at 8 times the sample rate, and indeed load on the edge of the oversampled word clock. This says nothing about whether the oversampled stream has an 8-sample offset delay between channels due to the implementation of the oversampling filter.

That was my point.
 
Agreed, but that word clock edge is not the same word clock that is used to differentiate the L/R streams in the I2S feed - with a digital filter (the DF1706 for example) in the way, the word clock on the DACs is running 8 times faster. There is nothing in the specification of this particular digital filter that says that the two streams of over-sampled data are derived in the way you suggest. But nothing to say they aren't either.

If the designer of the filter decided to load both filters from a two-word-wide latch, it would synchronise them; if he decided to load each filter with a new value on the edge transition that denoted that particular channel's word clock, the two streams would have a half-base-sample-period delay between them (four samples at the new over-sampled rate, not eight as I wrote above). The oversampled output data samples would remain individually in sync, but the four-sample offset between channels would remain. Which is the same problem as was originally mentioned.
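
A quick check of that figure, assuming the usual CD rates:

Code:
fs = 44_100                  # base sample rate, Hz
half_sample = 1 / (2 * fs)   # delay if R is taken half a frame after L
fs_os = 8 * fs               # 8x oversampled output rate of the filter
print(half_sample * fs_os)   # -> 4.0 samples of offset at the output rate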

The point I'm trying to make is that there is no reason to worry about this. Sure, I would not be in the least bit surprised to find that the vast majority, if not all, digital filters carefully clock in both words before simultaneously sending them to the filter pipelines. But it isn't part of the device specification. Because it really doesn't matter.
 
I imagine it isn't part of the spec because it was not foreseen that someone would speculate that things were done in such a cackhanded way especially when it takes as much effort to do it the wrong way as it does to do it the right way.
 
Francis_Vaughan said:
No amount of reasoning could convince them that a simple inter-channel delay is the equivalent of a 3.8mm static shift in the position of one speaker.
LOL!!!

I was going to read this whole thread to see what was actually going on until I read this sentence. I cannot believe I didn't think of that before I got this far into the thread. Thanks for the obvious!

David
 