SPDIF OUTPUT

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
There are only two places in a CD system where jitter actually matters. The first is the audio sampling at the recording end. We can't do anything about that, except hope that they did a good job. The second is the output transitions from the DAC. Much of what lies between simply has to get the bits across without errors, which it generally manages to do.

The DAC transition jitter depends partly on the DAC, partly on the SPDIF receiver, and partly on the CD player. The SPDIF cable will introduce some jitter, but the receiver PLL is specifically designed to deal with this. Cable jitter is mainly high frequency, and the PLL filters this out. Any low frequency jitter comes from the CD clock. Note that specifying jitter simply in terms of time doesn't tell you the whole story; the characteristics of the jitter are important too. This is just like noise, where you need to know the spectrum as well as the amplitude. Also, like noise modulation, some jitter can be signal-related. So 10 ps jitter is not necessarily better than 25 ps jitter; it all depends on what sort of jitter.

Some people work themselves (and others) up into a lather about RCA vs. BNC plugs etc. but in reality they introduce very little disturbance into a cable which probably isn't exactly 75 ohms anyway and is usually quite short. Far more important is the receiver PLL, and the CD clock. Properly synchronised reclocking can avoid this, but very few people seem to do this. For some reason which totally escapes me, asynchronous reclocking is popular even though this guarantees to introduce severe jitter!
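To make the point about the receiver PLL concrete, here is a minimal sketch assuming a first-order low-pass jitter-transfer model and an illustrative 1 kHz loop corner (real receiver PLLs differ in order and bandwidth): high-frequency cable jitter is strongly attenuated, while low-frequency clock wander passes straight through.

```python
import math

def pll_jitter_attenuation_db(f_jitter_hz, f_corner_hz):
    """Attenuation of incoming jitter by a PLL, modeled as a simple
    first-order low-pass jitter-transfer function (an assumption,
    not any specific receiver chip)."""
    ratio = f_jitter_hz / f_corner_hz
    gain = 1.0 / math.sqrt(1.0 + ratio * ratio)
    return 20.0 * math.log10(gain)

# High-frequency cable jitter at 100 kHz offset vs. a 1 kHz loop corner:
print(pll_jitter_attenuation_db(100e3, 1e3))  # ~ -40 dB: strongly suppressed
# Low-frequency wander at 10 Hz passes almost unattenuated:
print(pll_jitter_attenuation_db(10, 1e3))     # ~ 0 dB
```

This is why the cable matters far less than the source clock: whatever jitter the cable adds sits mostly above the loop corner, where the PLL rejects it.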
 
If you look at the plot in my last message, you can see that the major contributor to jitter is below 1 kHz offset.

At those frequencies, all the receivers are transparent...

I have measured my SPDIF output + the cable + my receiver, and the cable is not the cause of the rise in phase noise. In the end, the RMS jitter from 10 Hz to 4 MHz is about 2.2 ps, and the major contributor to jitter is the low-frequency region below 1 kHz.

In French:
TVC Audio - Audio Numeric Performance

This is very good performance - better than the original Wadia X32 and X64.

Rémi
 
Is that true? I would think that you'd hear the music at normal speed, but there would be long and frequent dropouts as the data buffers empty and refill.

SY, that would be true if you kept the sample rate at 44.1 kHz, but if you change the _rate_ at which you play back your samples, you'll change the pitch exactly as you would on a turntable. Assuming ALL the clocks changed on the CD player, it would do what SoNic_real_one claims. Where he is mistaken is about the 'superclock'. He is correct that the data has time base errors - jitter - LOTS of it coming off the disc. When it's read out of the de-shuffling RAM, the only jitter remaining is whatever the clock has - which even for a mediocre player will be very low. This TBC concept was worked out 50 years ago for analog video - first digital TBCs on analog machines, then fully digital machines.
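The time-base-correction idea above can be sketched in a few lines. This is a toy model, not any real player's firmware: samples arrive with jittered timing, sit in a FIFO, and leave on a steady clock, so the output intervals carry only the read clock's timing.

```python
import random
from collections import deque

def reclock(jittered_arrivals, read_period):
    """Toy time-base corrector: data goes into a FIFO with jittered
    arrival times and is read out on a perfectly periodic clock.
    Returns the output sample times, which depend only on the read clock."""
    fifo = deque(jittered_arrivals)   # buffered data, order preserved
    return [k * read_period for k in range(len(fifo))]

period = 1 / 44100
# Arrivals wander by up to +/-30% of a sample period (heavy disc jitter):
arrivals = [k * period + random.uniform(-0.3, 0.3) * period for k in range(10)]
out = reclock(arrivals, period)
intervals = [out[k + 1] - out[k] for k in range(len(out) - 1)]
# Every output interval equals the read-clock period, regardless of input jitter:
print(all(abs(d - period) < 1e-12 for d in intervals))  # True
```

The real constraint, as discussed below, is that the FIFO must be deep enough to absorb the worst-case wander without running empty or full.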

 
This is the second edition of the RTO-1, which I have just completed.
 

Attachments

  • DSC07088.JPG (147.8 KB)
  • DSC07068.JPG (135.5 KB)
  • DSC07072.JPG (120.8 KB)
...SoNic_real_one claims. Where he is mistaken is about the 'superclock'. He is correct that the data has time base errors - jitter - LOTS of it coming off the disc. When it's read out of the de-shuffling RAM, the only jitter remaining is whatever the clock has - which even for a mediocre player will be very low.

They don't have enough RAM to compensate for all the jitter - not for the low-frequency part, which requires a larger memory. Sure, it maybe cleans the part that is above 1 kHz. That is the part that the SPDIF receivers can "clean" too.
That is just enough (an acceptable quality) for most people.
To go beyond that you need some serious cache, and the superclock won't do anything to relieve that jitter (because it is not created by the clock, but by the optical reading/tracking mechanism).
 
Ultra-low jitter and its effect on musical experience

Firstly, it's very easy to measure random jitter using phase noise spectra, as other contributors have mentioned. Take a look at the attached plot, which shows the phase noise from 0.1 Hz to 100 kHz for the clock of the Audiophilleo1 USB-S/PDIF transport-processor. Phase noise is, as many are aware, a frequency-domain analysis, which is a generalized version of the time-domain measurements that may be more familiar.

Jitter comes in many forms, all of which, unfortunately, add up to degrade the sonic qualities of a system. There's random jitter from a variety of sources that may be hard to control; there's deterministic jitter from power supplies, circuit components, etc., that, when highlighted by the phase noise plot, can often be suppressed.

Looking at the FFT spectrum (another frequency domain measurement paradigm), if one sees some spurs (artifacts) that are 120 Hz above and below a test stimulus, a power supply or mains noise may be the culprit.

The audibility of jitter is certainly easy to demonstrate, even in ordinary consumer sound systems. There are easily a dozen USB-S/PDIF interfaces on the market today, and their (time domain) jitter specs range from 2.5 picoseconds RMS from 10 Hz to 100 kHz to over one nanosecond. Guess which ones sound better?
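A standard back-of-envelope check on that spread of specs (the 10 kHz test tone is an illustrative choice): the jitter-limited SNR of a sampled full-scale sine is -20·log10(2π·f·t_j), a common data-converter rule of thumb.

```python
import math

def jitter_limited_snr_db(signal_freq_hz, rms_jitter_s):
    """Best-case SNR of a full-scale sine sampled with random clock jitter:
    SNR = -20*log10(2*pi*f*t_j). A textbook data-converter approximation."""
    return -20.0 * math.log10(2.0 * math.pi * signal_freq_hz * rms_jitter_s)

# The same 10 kHz tone through the two ends of the spec range quoted above:
print(round(jitter_limited_snr_db(10e3, 2.5e-12)))  # 136 (dB)
print(round(jitter_limited_snr_db(10e3, 1e-9)))     # 84 (dB)
```

So at the nanosecond end the jitter floor intrudes on the 16-bit dynamic range, while at the picosecond end it sits far below it.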

With a familiar track, and a box full of these devices (have your friends come over with their favorite USB transport), what you'll hear is that with lower jitter, the realism and coherence increases. Harshness, hardness, dullness, lack of air all gradually decrease.

With the lower jitter/phase noise products, the results are breath-taking. Recording engineers often point to the very low phase noise regime, from, say, 0.1 Hz to 100 Hz, as being critical for realism, and this certainly is the case in my experience.

Even inexpensive phase noise analyzers can measure RMS random jitter down to the femtosecond range (thousandths of picoseconds). Reasonably-priced studio house clocks such as the Grimm CC-1 ($2495) will have jitter (over a specified range of offset from the nominal frequency) of say 500 femtoseconds. Using them with a high-quality DAC that has an external word clock input will give a major improvement, depending upon the PLL design (some PLLs are more equal than others :).
 

Attachments

  • Audiophilleo1 2.8 10m.gif (52.8 KB)
Is the data clocking out from the RAM controlled by a quartz clock? Is this clock adjusted to keep it in step with the transport, or is the transport adjusted to keep it roughly in step with the clock? I assume the latter - do you (SoNic) assume the former? If the RAM is not big enough then the player will keep missing or duplicating samples. I doubt if this happens very often. The low frequency jitter comes from the clock itself, either the circuit or the crystal. That is why a better clock can improve jitter. Whether this is worth doing depends on how good/bad the original clock is.
 
Let's say that you have a clock with ZERO jitter. Do you truly believe that an electro-mechanical spindle can rotate perfectly in sync with that clock - while continuously focusing and tracking the spiral on a not-ideal optical disc?
No: the electronics that control the mechanism have a PLL loop that takes the incoming clock and adjusts the motor speed to keep it "in the loop".
The end result (the data) will carry the mechanical and electrical imperfections of the pick-up mechanism... as jitter. The "ultrastable" clock will be modulated by that mechanical feedback loop.
For example: look inside the SAA7210. At the input you will see the VCO and phase detector associated with that loop. The crystal is just the reference (setpoint) of the control loop. But that is NOT the output signal - PID controllers 101.
Sure, it has the external RAM to compensate for SOME of the variations, but the size is not adequate to eliminate the low-frequency deviations completely. It can only address 4x16 kbytes of data = 64 frames. Keeping the pointer in the middle, that is less than 0.7 msec, roughly equivalent to 1380 Hz.
What I am saying is that using (dual-port) RAM is the only way to eliminate the jitter without creating false samples. But you need some serious RAM for that.
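Whatever the exact figures for a given chip, the relation between buffer size and timing slack is simple arithmetic. A generic sketch (the 64 KiB figure is an illustrative assumption, not a claim about the SAA7210's memory map):

```python
def buffer_slack_seconds(buffer_bytes, sample_rate_hz=44100,
                         channels=2, bytes_per_sample=2):
    """Timing slack (seconds, either direction) that a FIFO of the given
    size provides when the read pointer is kept in the middle.
    Generic arithmetic at CD-audio rates, not tied to any particular chip."""
    bytes_per_second = sample_rate_hz * channels * bytes_per_sample
    return (buffer_bytes / 2) / bytes_per_second

# 44100 Hz x 2 ch x 2 bytes = 176,400 bytes/s, so a 64 KiB buffer
# gives roughly 186 ms of slack either way:
print(buffer_slack_seconds(64 * 1024))  # ~0.186 s
```

The slack in seconds sets the lowest wander frequency the buffer can absorb, so disagreements about "how low in frequency the RAM helps" reduce to disagreements about the effective buffer depth.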
 
On the playback side, some very high-end DACs simply discard all incoming clock information from S/PDIF, put the digital music data into a big buffer (enough for, say, 500 ms) and clock it out to the DAC using OCXO clocks. In this approach, there's no need for a PLL.

In the case of CDs, the jitter is most likely related to the S/PDIF output buffers, any transformers, and the clock used to generate the S/PDIF timing.
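A quick calculation shows why such a big-buffer design can run without a PLL for practical lengths of time (the 500 ms buffer and 100 ppm total mismatch are illustrative assumptions):

```python
def time_to_drain_seconds(buffer_seconds, clock_mismatch_ppm):
    """With a free-running output clock (no PLL), a fixed-size buffer fills
    or drains at a rate set by the source/DAC clock mismatch. Returns how
    long a half-full buffer lasts before it under- or overruns."""
    drift_per_second = clock_mismatch_ppm * 1e-6
    return (buffer_seconds / 2) / drift_per_second

# 500 ms buffer, 100 ppm mismatch: the buffer lasts ~2500 s (~42 minutes)
# before the source rate must be nudged or a sample slipped.
print(time_to_drain_seconds(0.5, 100))
```

With crystal sources the real mismatch is usually far below 100 ppm, so in practice the buffer rarely needs intervention within a single track.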
 
SoNic_real_one said:
Let's say that you have a clock with ZERO jitter. Do you truly believe that an electro-mechanical spindle can rotate perfectly in sync with that clock - while continuously focusing and tracking the spiral on a not-ideal optical disc?
No, not at any particular time. Yes, absolutely, on average over a sufficiently long time. That is how a PLL works.

Are you really saying that the main clock is adjusted to cope with the transport? I assume not, in fact I doubt if the crystal could be adjusted enough to follow the motor. No, the motor follows the clock. Provided that the RAM is big enough and the PLL works properly, then the transport raw jitter does not appear. If the RAM is not big enough, or the PLL is poor, then the result is not low frequency jitter but dropped or duplicated samples. As I said, low frequency jitter comes from the clock not the transport.
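The "equal on average" point is easy to demonstrate with a toy servo model (hypothetical gains and a made-up 10% motor speed error, not any real player's loop): a PI controller nudges the motor's delivery rate to keep a FIFO half-full while a fixed "crystal" clock drains it, and the motor's long-run average rate ends up equal to the crystal's.

```python
def simulate_servo(steps=20000, read_rate=1.0, kp=0.002, ki=1e-5):
    """Toy spindle servo: a PI controller adjusts the motor's sample
    delivery rate to hold the FIFO at its half-full setpoint, while a
    fixed crystal clock reads samples out at read_rate."""
    level = 500.0        # FIFO fill (samples); setpoint is half-full
    target = 500.0
    integ = 0.0
    delivered = 0.0
    for _ in range(steps):
        err = target - level
        integ += err
        motor_rate = 0.9 + kp * err + ki * integ  # motor nominally 10% slow
        level += motor_rate - read_rate           # filled by motor, drained by crystal
        delivered += motor_rate
    return delivered / steps, level

avg_rate, final_level = simulate_servo()
# The motor's average rate converges to the crystal's, and the buffer
# settles back to half-full - no samples dropped or duplicated.
print(round(avg_rate, 4), round(final_level))
```

The instantaneous motor rate wobbles around the crystal rate, but none of that wobble reaches the data clocked out of the buffer by the crystal, which is the point at issue.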
 
Of course the raw output data rate will follow exactly the rotation of the disc! They TRY to adjust the motor to follow the reference (crystal), but the output will NOT be equal to the crystal.
BTW, they don't adjust the crystal, they adjust the internal VCO! That's how a PLL works... As I said, this is control loops 101. People tend to be confused about the "setpoint/reference" role in all this and about how the controlled element (the spindle motor) works inside the loop.
[Image: PID feedback control-loop block diagram]
 
Here's a quote from the famous Bob Katz about jitter taken from here http://www.digido.com/audio-faq/j/jitter-better-sound.html
Here are some audible symptoms of jitter that allow us to determine that one source sounds "better" than another with a reasonable degree of scientific backing:

It is well known that jitter degrades stereo image, separation, depth, ambience, dynamic range.

Therefore, when during a listening comparison, comparing source A versus source B (and both have already been proved to be identical bitwise):

The source which exhibits greater stereo ambience and depth is the better one.

The source which exhibits more apparent dynamic range is the better one.

The source which is less edgy on the high end (most obvious sonic signature of signal correlated jitter) is the better one.
And a reply was posted:
The better one, and it is better, is also easier to listen to... less fatiguing. I would also add to this that the low end just "feels" bigger and more solid. This is perhaps a psychoacoustic effect more than a measurable one. It may be that the combination of a less edgy high end and greater depth and width makes the bass seem better.

All of this makes sense if thought of in terms of timing (that is what we're talking about isn't it ;-]). With minimal jitter nothing is smeared, a note and all its harmonics line up, the sound is more liquid (a term probably from the "audiophile" crowd but one which accurately describes the sound none the less), and images within the soundstage are clearly defined.
 
SoNic_real_one said:
They TRY to adjust the motor to follow the reference (crystal), but the output will NOT be equal to the crystal.
Of course, but it will be equal on average - that is what phase locking means. I find it hard to believe, as you seem to be saying, that the data is not clocked out of the RAM buffer by the crystal clock but by a VCO which is (poorly) locked to the crystal and also influenced by the motor. That would be an astonishingly poor design, so I don't believe it is what happens. One of us is confused. Maybe someone else could help clear up this matter?

The datasheet for the chip you mentioned seems to show the motor speed being adjusted from the buffer control logic, presumably to keep the buffer about half-full on average. This is perfectly sensible. There is probably also a VCO which locks to the raw datastream from the read head. These put the data in the buffer. The crystal clock takes data out of the buffer.

Diagrams of PID controllers add nothing to this discussion. I was working on PID software for power station control over 20 years ago, although most were just PI.
 