Jitter? Non-issue, or have we just given in?

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
It's a USB DAC. I see no particular mechanism in the transport (a PC) likely to introduce low-frequency jitter. If I stick a CD in the PC's CD-ROM drive, am I expecting a problem from low-frequency jitter? I would imagine that the PC requests chunks of data from the CD and streams them out at a crystal-locked rate. Where's the low-frequency jitter coming from?
 
@Sonic

If we're talking about CD players rather than PCs: looking at the data sheet for a Philips CD7 CD processor and servo controller IC (which I really never wanted to do!), it controls the motor speed by basically feeding (a function of) the audio data FIFO level to the CD motor as a voltage. It's as crude and/or elegant as that! So the faster the FIFO is filling, the slower the motor runs, and vice versa.

Independently of that, the data is streamed from the FIFO to the DAC at a crystal-locked rate. There is no low-frequency jitter, in other words. You can forget the influence of the motor, bearing inertia etc. on sample timing, because those things have no link to the rate at which data is clocked out of the FIFO.
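The decoupling described above is easy to demonstrate with a toy model: the motor servo chases the FIFO level while the output side removes exactly one sample per crystal tick, so mechanical wobble lands in the fill level rather than in the sample timing. (All constants here are assumed illustrative values, not Philips' actual servo parameters.)

```python
# Toy model of a CD transport FIFO: motor speed is a function of FIFO
# fill level, while exactly one sample per crystal tick leaves the FIFO.
import math

FS = 44_100          # crystal-locked output rate (Hz)
TARGET_FILL = 512    # servo aims to keep the FIFO about half full
KP = 0.002           # assumed proportional servo gain

def step(fifo_fill, wobble):
    """One output-sample tick of the transport."""
    # Servo: fuller FIFO -> slower motor, emptier FIFO -> faster motor
    motor_rate = 1.0 + KP * (TARGET_FILL - fifo_fill)
    fifo_fill += motor_rate + wobble  # samples arriving from the disc
    fifo_fill -= 1.0                  # one sample out, crystal-timed
    return fifo_fill, motor_rate

fill, rate = float(TARGET_FILL), 1.0
for n in range(20_000):
    wobble = 0.05 * math.sin(2 * math.pi * 7 * n / FS)  # 7 Hz motor wobble
    fill, rate = step(fill, wobble)

# The wobble is absorbed by the buffer: the fill level wanders slightly
# around TARGET_FILL, while the output timing (one sample per tick of
# the crystal) is never touched.
print(round(fill), round(rate, 3))
```

The point of the sketch is only that the wobble term appears on the input side of the FIFO and nowhere in the output timing.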
 
(I was discussing sample rate converters yesterday in another thread, and I was making the point that yes, they change the samples, but in a mathematically almost perfect way.)

Right, it's worth mentioning that 'bit-perfect' is an obsession amongst certain audiophiles, just as 'ultra-low jitter' is. There's nothing particularly special about a sequence of numbers used to represent music - the theory says (I think) that there is an infinite number of sequences that could describe the same music. If the phase of the original sampling clock were slightly changed, all the samples in the recording would be different.
 
Hi,

You may be interested in this link I posted earlier:

The D/A diaries: A personal memoir of engineering heartache and triumph

It describes the development of a PLL-based system which locks very quickly but then switches to a slowly adapting frequency correcting system.

I am familiar with this. It talks about SpAct for the BB/TI USB DAC chips. This actually is better than the common SPDIF PLLs, accounting for the measurable and sonic superiority of these ICs when used for audio connections instead of SPDIF. However, BB failed miserably when they attempted to apply SpAct to SPDIF receivers. Their current versions are back to classic PLLs.

However, no matter whether you make a PLL multi-speed, or even adaptive-speed, or use two PLLs, it is a solution with limited merit.

If instead you (for argument's sake) combine a VCXO and a DAC, we have a very large number of discrete frequency steps available. We can now adjust the clock for the closest match with the actual frequency and leave it there, until our FIFO buffer threatens to over- or under-flow. Then we take one step in the opposite direction and wait until the buffer refills or re-empties.

This is basically the flow control on the DAC side.

Two issues: if the buffer is too large it will cause audible delays, and if we have a very large difference between nominal and actual clock we may take so long to match the clocks that our buffer has already over- or under-flowed.

It clearly means we need very precise and fine control over the clock, so that we can approach the source clock very closely and our buffer status changes very slowly, plus a buffer of reasonable size. We also need a mechanism that can "lock in" the clock very quickly no matter how big the difference between the clocks.
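The flow-control rule described above can be sketched in a few lines. Everything here (step size, thresholds, drift scaling, source offset) is an assumed illustrative value, not any real product's parameters:

```python
# Sketch of FIFO-driven clock stepping: leave the clock alone while the
# buffer sits inside its band; step one tuning LSB only at a threshold.

STEP_PPM = 0.1            # assumed tuning resolution of the stepped clock
LOW, HIGH = 0.25, 0.75    # FIFO fill fractions that trigger a correction

def control(fill, trim_ppm):
    """One flow-control decision per tick."""
    if fill >= HIGH:          # source faster than us -> speed up one step
        trim_ppm += STEP_PPM
    elif fill <= LOW:         # source slower than us -> slow down one step
        trim_ppm -= STEP_PPM
    return trim_ppm

# Toy run: source is +0.35 ppm off nominal; initial acquisition has
# already parked the trim on the closest available step (0.3 ppm here),
# as the text suggests.
SRC_PPM = 0.35
trim, fill = 0.3, 0.5
trims_seen = set()
for _ in range(50_000):
    fill += (SRC_PPM - trim) * 1e-3   # buffer drift, abstract time scale
    trim = control(fill, trim)
    trims_seen.add(round(trim, 1))

# The trim hunts slowly between the two steps bracketing the source
# frequency, so the residual error never exceeds one step.
print(sorted(trims_seen))   # -> [0.3, 0.4]
```

Note that each step flips the sign of the buffer drift, which is why the clock sits still for the long "traverse" time between corrections.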

The general shift towards PC-based playback and the availability of asynchronous USB audio systems makes this whole point moot, of course; SPDIF will soon join the trash heap of history, together with CD.

Ciao T
 
The general shift towards PC-based playback and the availability of asynchronous USB audio systems makes this whole point moot, of course; SPDIF will soon join the trash heap of history, together with CD.

Ah, so there is a USB mode available that allows the DAC to slave the PC to its clock - I had wondered why there wasn't. Reading around, some people say that it isn't implemented properly in the standard Windows driver usbaudio.sys, but then you seem to be able to buy commercial asynchronous USB DACs that don't require any new software installation. Has anyone done a DIY asynchronous USB DAC?
 
Hi,

Ah, so there is a USB mode available that allows the DAC to slave the PC to its clock - I had wondered why there wasn't. Reading around, some people say that it isn't implemented properly in the standard Windows driver usbaudio.sys, but then you seem to be able to buy commercial asynchronous USB DACs that don't require any new software installation. Has anyone done a DIY asynchronous USB DAC?

Async USB exists in two styles: devices that do not use USB Audio Device Class modes (these always need their own driver), and those that use the USB Audio Device Class(es).

USB Audio Class devices CAN in principle be asynchronous (incidentally, Firewire too); however, not many devices implement asynchronous USB, though it is becoming much more common. The implementation is on the hardware side only. The same goes for Firewire, though currently no FW chip implementations I know of or have tested support async mode.

The USB audio driver in Windows only supports USB Audio Class 1 devices, up to 24-bit/96kHz; for USB Audio Class 2 devices a 3rd-party (as in not M$) driver is necessary for Windows (including 7).

Async USB for high end, BTW, was pioneered by Wavelength Audio's Gordon Rankin using the TI TAS1020. Nowadays other options exist.

Ciao T
 
OK, this is good! Look at all these posts! So jitter IS very much a subject to talk about, and audio manufacturers just aren't doing anything about it. Or at least if they are, they aren't advertising it.

Looking at the use of an external DAC with a CD transport: I would assume the accuracy of the data being streamed out of the transport is important. The transport needs to have low jitter, and the DAC needs low jitter as well for its internal clocks.

Now I have to imagine that a PC, or especially a laptop, as a source, connected either through a soundcard that has SPDIF or via USB, just has to have LOADS of jitter? With all the other housekeeping, antivirus and background crap running on such a machine, there just has to be a boatload of jitter, right?

And now that iPod/iPad/Droid/MP3-player-type devices are so popular, many with digital outputs that are being docked to DACs, what about jitter in those devices?

Why hasn't buffering/reclocking become more prevalent? Surely these days, with memory being so cheap, it would be easy to add an incoming data buffer stage and then reclock the data out of the buffer at a more precise rate. I would think that such things would be commonplace today?
 
It clearly means we need very precise and fine control over the clock, so that we can approach the source clock very closely and our buffer status changes very slowly, plus a buffer of reasonable size. We also need a mechanism that can "lock in" the clock very quickly no matter how big the difference between the clocks.

If the DAC clock frequency can differ from the source clock frequency ((D)PLL), the door is set wide open for jitter and interference to pass, because the loop bandwidth is non-zero.
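The non-zero-bandwidth point can be made concrete with the standard first-order jitter-transfer model: reference jitter well below the loop bandwidth passes essentially unattenuated, and only jitter far above it is rejected. A sketch with an assumed, purely illustrative 1 kHz loop bandwidth:

```python
# First-order PLL jitter transfer: H(f) = 1 / (1 + j*f/f_bw).
# Jitter below the loop bandwidth f_bw passes; above it, it is rejected.
import math

def jitter_transfer_db(f_hz, f_bw_hz=1_000.0):
    """|H(f)| in dB for a first-order low-pass jitter transfer."""
    return 20.0 * math.log10(1.0 / math.sqrt(1.0 + (f_hz / f_bw_hz) ** 2))

for f in (10, 100, 1_000, 10_000, 100_000):
    print(f"{f:>7} Hz: {jitter_transfer_db(f):6.1f} dB")
# 10 Hz reference jitter passes at essentially 0 dB; only around
# 100 kHz does the attenuation reach ~40 dB - which is why
# low-frequency source jitter "walks through" a conventional PLL.
```

Narrowing the loop bandwidth improves rejection but slows lock, which is exactly the trade-off the multi-speed and stepped-clock schemes in this thread try to escape.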

Unless a suitable (USB) isolator is used, a ground loop is created that adds even more problems by flooding connected circuits with interference.

Also, don't forget the SPDIF, USB and I2S signals; these directly feed interference into DAC circuits, even if isolator circuits are used. Source interference causes jitter on SPDIF, USB and I2S signals, and this jitter can easily pass isolator circuits.

In the DAC, USB, SPDIF and I2S jitter can easily pollute power supplies and ground references.

This interference then introduces trigger uncertainty because ground reference and supply voltage are modulated with interference.

Trigger uncertainty then introduces extra jitter by manipulating the exact moment the flip flop or gate changes state. This enables source jitter and interference to seep through virtually any jitter attenuator circuit, simply because practical circuits and components are never perfect.
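The trigger-uncertainty mechanism has a simple first-order magnitude: the timing shift is roughly the noise voltage riding on the switching threshold divided by the edge slew rate. A back-of-envelope sketch (the noise amplitude and slew rates are assumed illustrative figures, not measured values):

```python
# Threshold-noise-induced jitter: a noise voltage dV on the logic
# threshold shifts the trigger instant of a gate or flip-flop by
# roughly dt = dV / slew_rate of the incoming edge.

def trigger_jitter_ps(noise_mv, slew_v_per_ns):
    """Timing shift (ps) for a given threshold noise and edge slew rate."""
    return (noise_mv * 1e-3) / (slew_v_per_ns * 1e9) * 1e12

# 10 mV of supply/ground noise on an assumed 1 V/ns CMOS edge:
print(round(trigger_jitter_ps(10, 1.0), 2))   # -> 10.0  (ps)
# The same noise on an assumed faster 4 V/ns ECL-class edge:
print(round(trigger_jitter_ps(10, 4.0), 2))   # -> 2.5   (ps)
```

This is also why faster logic families and quieter supplies both reduce the bleed-through: either a steeper edge or less threshold noise shrinks the same ratio.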

In short, it is virtually impossible to fully block source jitter, regardless of jitter-attenuating strategy, unless perfect components and circuits are used.

Therefore it would be best to use a digital audio source that not only offers the lowest possible jitter, but also the lowest possible interference on power supply and I2S output signals. This basically excludes computer-based digital audio sources and most CD transports (servo interference).

A good concept is to use a single, very low jitter masterclock and drive both the DAC and a low-jitter/low-interference source with it. This would provide a rock-solid time lock between the masterclock and all connected circuits.

I also had to find out the hard way that it's best to use one single clock for all circuit timing; if multiple non-synchronised clocks are used, these clocks and their jitter spectra will intermodulate, causing a really big mess that needs to be cleaned up.

When using this "good" concept, which simply eliminates most of the conventional source-DAC issues, it will still be extremely difficult to maintain low jitter and noise levels for plain 44.1/16 NOS. I spent over one year getting jitter amplitude low enough and the jitter spectrum neutral enough using such a concept. But the results are certainly worth the effort.

Attempting to achieve similarly low jitter amplitude and a neutral jitter spectrum using an external DAC and a USB/SPDIF interface is pretty hopeless. I tried for a few years, then had to give up on this.


Then there are practical hardware limitations that spoil the fun. Even DAC chips need logic building blocks like gates and flip-flops.

Simple D flip-flops used for on-chip latching, (synchronous) reclocking or (synchronous) dividers add extra jitter. Here are some examples that illustrate the problem:

CD4013 (CMOS), 325ps
74HCT74, 75ps
74LS74, 65ps
74F74, 26ps
NC7SZ175P6, 13ps
SN74AUC1G80, 9ps
NC7SV74K8X, 5ps
MC100EP52DTG, 1.6ps

1.6ps is about as low as it gets using (P)ECL. But most DAC chips are based on CMOS logic, resulting in estimated jitter levels between approx. 5ps and 75ps.
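Since a clock path normally contains several such stages in series, and uncorrelated jitter contributions add root-sum-square, the per-stage figures above can be combined like this (the three-stage line-up is an assumed example, not a specific chip's internals):

```python
# RSS combination of independent per-stage jitter contributions:
# total = sqrt(j1^2 + j2^2 + ... + jn^2), valid for uncorrelated sources.
import math

def total_jitter_ps(*stage_ps):
    """Root-sum-square of independent per-stage jitter figures (ps)."""
    return math.sqrt(sum(j * j for j in stage_ps))

# e.g. an assumed chain of three 74HCT74-class stages (75 ps each):
print(round(total_jitter_ps(75, 75, 75), 1))    # -> 129.9  (ps)
# the same three-stage chain built from ECL (1.6 ps each):
print(round(total_jitter_ps(1.6, 1.6, 1.6), 2)) # -> 2.77  (ps)
```

Note the RSS sum grows only with the square root of the stage count, so the worst single stage tends to dominate the total.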
 
A computer should not have loads of jitter at all. As far as I understand it, the point where the computer turns into an audio streamer is within the sound card itself, or as part of the PCI bus. Audio data is clocked and streamed out of a buffer at a fixed rate, as dictated by the soundcard's clocks. It is then the PC's job to keep this buffer filled with enough data so that you don't get dropouts where the buffer runs empty.
 
Hi John,

If the DAC clock frequency can differ from the source clock frequency ((D)PLL), the door is set wide open for jitter and interference to pass, because the loop bandwidth is non-zero.

The door is not wide open.

In the system I outlined, the short-term variations (jitter) are suppressed by a FIFO. The clock is adjusted in very fine steps. If the clock is on the step closest to the actual average of the source clock (in my own implementation we are talking fractions of 1ppm), which is how it should be, the FIFO fill factor will vary very slowly.

Using FIFO flow control, the clock will vary by exactly that tiny step after a period as long as it takes to "traverse" the buffer. This essentially introduces a cyclic "jitter" at a frequency of maybe 0.0005Hz (in my case), with a peak level of 0.005ppm at a 192kHz sample rate.
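The arithmetic behind these figures is simple: the FIFO fills or drains at the sample rate times the residual frequency error, and the cycle period follows from how much drift the buffer allows before a step. With the residual and sample rate quoted above, and an assumed headroom of two samples, the correction cycle indeed lands well below a millihertz:

```python
# Drift rate and correction-cycle frequency of the stepped-clock FIFO.
# Residual error and sample rate are the figures quoted above; the
# buffer headroom is an assumed illustrative value.

FS = 192_000            # sample rate (samples/s)
RESIDUAL_PPM = 0.005    # clock parked within 0.005 ppm of the source
HEADROOM = 2            # assumed samples of drift allowed before a step

drift = FS * RESIDUAL_PPM * 1e-6        # samples/s the FIFO gains/loses
half_period = HEADROOM / drift          # seconds to traverse the headroom
cycle_hz = 1.0 / (2.0 * half_period)    # one full up/down correction cycle

print(f"{drift * 3600:.2f} samples/hour of drift")   # -> 3.46 samples/hour of drift
print(f"{cycle_hz * 1e3:.2f} mHz correction cycle")  # -> 0.24 mHz correction cycle
```

So even a headroom of only a couple of samples already puts the cycle in the same sub-millihertz region as the 0.0005Hz figure quoted.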

Unless a suitable (USB) isolator is used, a ground loop is created that adds even more problems by flooding connected circuits with interference.

It is not necessary to use a USB isolator, but we need to isolate somewhere.

Also, don't forget the SPDIF, USB and I2S signals; these directly feed interference into DAC circuits, even if isolator circuits are used. Source interference causes jitter on SPDIF, USB and I2S signals, and this jitter can easily pass isolator circuits.

Sure. Again, the first objective is to block source jitter AFTER the electrical isolation, the second to avoid adding local jitter back after we have removed source jitter.

In the AMR CD-77 I used ECL circuitry with the TDA1541 to minimise these issues.

Trigger uncertainty then introduces extra jitter by manipulating the exact moment the flip flop or gate changes state. This enables source jitter and interference to seep through virtually any jitter attenuator circuit, simply because practical circuits and components are never perfect.

Any competent design will reduce this bleed-through to tiny levels. Separate power supplies (all the way to electrostatic screens in the power transformer) with competent ground routing can reduce these issues massively.

In short, it is virtually impossible to fully block source jitter, regardless of jitter-attenuating strategy, unless perfect components and circuits are used.

In short, it is very possible to reduce the source jitter feed-through to a much lower level than locally generated jitter; in fact to very low levels, low enough to call this "completely blocking". The system I describe can completely block several UI worth of jitter, sufficiently to escape measurement with an AP2.

Therefore it would be best to use a digital audio source that not only offers the lowest possible jitter, but also the lowest possible interference on power supply and I2S output signals. This basically excludes computer-based digital audio sources and most CD transports (servo interference).

It only excludes incompetently designed computer systems and incompetently designed CD transports. It also excludes incompetently designed solid-state transports, as these too are subject to PSU interference etc.

A good concept is to use a single, very low jitter masterclock and drive both the DAC and a low-jitter/low-interference source with it. This would provide a rock-solid time lock between the masterclock and all connected circuits.

Yes; the alternative is to use a low-jitter adjustable clock, driven by MCU programming (i.e. no direct PLL/DLL), which sets the clock correctly on initial lock and only takes minimal action when there is a risk of buffer under- or overflow. This system has one marked advantage, namely that it is applicable to ANY source and will lock out source jitter from any source, no matter how competent or not the design.

Issues like isolation of grounds, noise feed-through etc. still need addressing, but that is not all that difficult; come on, even I can get it pretty much right...

Your concept stated above drove AMR's development early on, nearly a decade ago. My counter-suggested concept drives our development now, in order to remove the "single, integrated source" limitation.

You could say that, from the output of the memory buffer onwards, the system you describe above (with an SD card player) and mine are fundamentally identical technically, except for the adjustable clock. Where you take your data from the SD card, I have it dumped into my FIFO by whatever source happens to be connected.

When using this "good" concept, which simply eliminates most of the conventional source-DAC issues, it will still be extremely difficult to maintain low jitter and noise levels for plain 44.1/16 NOS. I spent over one year getting jitter amplitude low enough and the jitter spectrum neutral enough using such a concept. But the results are certainly worth the effort.

Attempting to achieve similarly low jitter amplitude and a neutral jitter spectrum using an external DAC and a USB/SPDIF interface is pretty hopeless. I tried for a few years, then had to give up on this.

John, with due respect, I did read your TDA1541 Metathread (in part to see what you had come up with); you have not once applied an approach to dealing with SPDIF (and other source) jitter that stood the slightest chance of success. When we started with AMR I found myself in the same situation: I just could NOT fix SPDIF jitter, so AMR's source was a CD drive set up like you describe, together with the best USB input we could make (which in practice was very good, but a little worse than the internal drive).

It took quite some time to see other options. A very dumbed-down version (meant for ordinary audiophiles, and probably still way too technical for them) of what we actually do is here:

http://www.amr-audio.co.uk/large_image/Tech%20Paper%201%20-%20Jitter.pdf


Then there are practical hardware limitations that spoil the fun. Even DAC chips need logic building blocks like gates and flip-flops.

Simple D flip-flops used for on-chip latching, (synchronous) reclocking or (synchronous) dividers add extra jitter. Here are some examples that illustrate the problem:

CD4013 (CMOS), 325ps
74HCT74, 75ps
74LS74, 65ps
74F74, 26ps
NC7SZ175P6, 13ps
SN74AUC1G80, 9ps
NC7SV74K8X, 5ps
MC100EP52DTG, 1.6ps

1.6ps is about as low as it gets using (P)ECL.

Yes, I normally throw around a "ballpark" local jitter of 50-100ps as the limit current (CMOS-based) systems can achieve. The very ground bounce in the ICs' leadframes and related problems cause this.

But most DAC chips are based on CMOS logic, resulting in estimated jitter levels between approx. 5ps and 75ps.

Given the complexity of even the most primitive DAC using CMOS logic, I feel 5ps is exceedingly optimistic; only ECL logic and a TDA1541 (which is basically non-standard ECL) can come close to that.

However, there is another issue. Just as "THD&N" is not useful for determining the audibility of distortion, a single RMS or peak jitter number is not useful for determining jitter audibility, or indeed even its subjective impact.

So we may debate femtoseconds of jitter or ppm levels of THD&N, but these merely relate to measurable phenomena. While there is not enough research to support it, I suspect jitter audibility has the same issues as that of HD/IMD. Certain, for want of a better word, patterns of jitter may be quite inaudible despite large amounts of jitter, yet others may be very objectionable despite being relatively low.

Ciao T
 
Hi,

A computer should not have loads of jitter at all.

Why would you possibly say something like this? Clip a scope on the PSU lines in your PC...

As far as I understand it, the point where the computer turns into an audio streamer is within the sound card itself, or as part of the PCI bus. Audio data is clocked and streamed out of a buffer at a fixed rate, as dictated by the soundcard's clocks.

And the PSU of the soundcard comes from where?

It is then the PC's job to keep this buffer filled with enough data so that you don't get dropouts where the buffer runs empty.

This holds true only for systems that are external to the PC and do not draw their power from the PC (and are not subject to ground loops etc. when connected).

Ciao T
 
I wasn't discussing the effect of the other hardware systems within the PC, merely how the data is routed around. Within those confines, there is no reason why a PC will 'add' jitter due to the way that it operates. Of course you will introduce pollution from subsystems; I figured that would be obvious, but I was addressing this specific point.

Now I have to imagine that a PC has to have LOADS of jitter? With all the other housekeeping, antivirus and background crap running on such a machine, there just has to be a boatload of jitter, right?

This makes it sound like the other software systems running on a PC will somehow introduce jitter into the data stream simply by their presence. From a data-streaming point of view, they will not. Now of course their presence might affect the power supply, which in turn affects clock jitter, but that is a different thing entirely.
 
Hi,

This makes it sound like the other software systems running on a PC will somehow introduce jitter into the data stream simply by their presence. From a data-streaming point of view, they will not. Now of course their presence might affect the power supply, which in turn affects clock jitter, but that is a different thing entirely.

Well, as far as cause and effect are concerned, the logic holds. There is no attempt to describe the mechanism, but varying CPU load in a PC varies PSU load, PSU voltage and in the end clock frequency, thus altering and possibly increasing jitter.

So yes, software running on a PC can introduce jitter into a datastream by its presence on the PC, via a mechanism that involves the PSU; the datastream by itself naturally implies the presence of a clock - no clock, no stream.

Ciao T
 
So yes, software running on a PC can introduce jitter into a datastream by its presence on the PC, via a mechanism that involves the PSU; the datastream by itself naturally implies the presence of a clock - no clock, no stream.

Ciao T

As I said

Now of course their presence might affect the power supply, which in turn affects clock jitter, but that is a different thing entirely.
 
Hi,

As I said

How is it that you claim that "if extra programs run, extra jitter is created" is different from its more qualified version that explains the mechanism?

The first version is an observation that it is actually possible to verify using measurements. Therefore it is true.

That the mechanism is not explained or evaluated does not make the observation invalid.

If we first had to fully explain each observation as to its mechanism before publishing it, science would progress very little, if at all.

Dr. Roentgen published his results even though they could not really be fully explained. Should he have waited until he not only had today's explanation (which is sadly incomplete) but also the appearance of a Grand Unified Theory before doing so? He would never have published anything, nor would anyone else have.

Ciao T
 
Hi,



How is it that you claim that "if extra programs run, extra jitter is created" is different from its more qualified version that explains the mechanism?

The first version is an observation that it is actually possible to verify using measurements. Therefore it is true.

That the mechanism is not explained or evaluated does not make the observation invalid.

If we first had to fully explain each observation as to its mechanism before publishing it, science would progress very little, if at all.

Dr. Roentgen published his results even though they could not really be fully explained. Should he have waited until he not only had today's explanation (which is sadly incomplete) but also the appearance of a Grand Unified Theory before doing so? He would never have published anything, nor would anyone else have.

Ciao T

That isn't my point. I stated that the running of said programs wouldn't create jitter simply by the software process. I then said this:

Now of course their presence might affect the power supply, which in turn affects clock jitter, but that is a different thing entirely.

Which in turn said that the running of said programs could influence the hardware/PSU and thus alter the jitter performance, except you seemed to ignore that and thought it was necessary to say this:

So yes, software running on a PC can introduce jitter into a datastream by its presence on the PC, via a mechanism that involves the PSU; the datastream by itself naturally implies the presence of a clock - no clock, no stream.

Which basically repeated what I had already said about the software affecting the hardware.

How is it that you claim that "if extra programs run, extra jitter is created" is different from its more qualified version that explains the mechanism?

The first version is an observation that it is actually possible to verify using measurements. Therefore it is true.

I wasn't trying to say it is untrue, merely that the way the original poster stated it,

with all the other housekeeping, antivirus and background crap running on such a machine there just has to be a boatload of jitter, right?

made it sound (at least to me) like they thought the software, irrespective of whether it affects the hardware, would in some way introduce jitter - like the CPU crunching numbers for a different program would somehow affect the jitter of your audio stream. Akin to trying to play a computer game with the system priorities set to prioritise the video encoding that you're also doing: it will lower your fps and alter the video stream that you perceive, and by running the encoding it could also upset the audio data stream in some way and increase jitter. This, as far as I am aware, it would not do, unless it introduced some sort of secondary mechanism, such as affecting the power supply lines.

On another note, how do you go about measuring jitter accurately? Is ARTA's 'jitter' measurement good enough? Or are there more things that you have to take into account?
 