RF & Audio

I must admit I found all these jitter numbers rather too small to believe. Then I did a rough back-of-envelope calculation and found that for 16 bits a wild estimate of allowable jitter is 350ps - much smaller than I expected! I now find the 120ps quoted above to be believable. However, this does not necessarily mean that much smaller numbers are needed for more bits, as finer resolution itself might not be audible. It may be that masking rescues the situation, so somewhat higher jitters are normally OK. I have learnt something today!
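The arithmetic behind that estimate, for anyone who wants to play with it. This is a minimal sketch; the 20 kHz test frequency and the half-LSB error criterion are assumptions on my part, and different choices of criterion explain the spread between the ~120 ps figure and my 350 ps:

```python
import math

def max_jitter_ps(bits: int, f_max_hz: float = 20_000.0, err_lsb: float = 0.5) -> float:
    """Worst-case slew of a full-scale sine is 2*pi*f*A, and one LSB is
    2A/2**bits, so limiting the amplitude error to err_lsb LSBs gives
    dt = err_lsb * 2 / (2*pi*f*2**bits)."""
    return 1e12 * err_lsb * 2.0 / (2.0 * math.pi * f_max_hz * 2 ** bits)

for bits in (16, 20, 24):
    print(f"{bits} bits: ~{max_jitter_ps(bits):.1f} ps (half-LSB criterion)")
# 16 bits gives ~121 ps, matching the 120 ps quoted above; relaxing the
# criterion to a full LSB, or assuming a lower top frequency, lands
# nearer my 350 ps wild estimate.
```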
 
I was thinking of jitter on the DAC output transitions, without being concerned about where it arises. Assuming a normal DAC, jitter gives rise to an extra error signal consisting of pulses equal in amplitude to the differential of the audio (or its inverse), with pulse width either random or signal-related. Note that even random-width pulses still have their amplitude related to the signal, so random jitter becomes signal-related at the output of the DAC.
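To illustrate, here is a toy numerical sketch of my own (the tone, sample rate and jitter level are all assumed, not taken from any real DAC): sampling a sine at jittered instants produces an error whose size tracks the signal's derivative, even when the jitter itself is purely random.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f0 = 48_000.0, 1_000.0      # sample rate and test tone (assumptions)
sigma_t = 1e-9                  # 1 ns RMS random jitter (assumption)
t_ideal = np.arange(4096) / fs
t_jit = t_ideal + rng.normal(0.0, sigma_t, t_ideal.size)

err = np.sin(2 * np.pi * f0 * t_jit) - np.sin(2 * np.pi * f0 * t_ideal)

# First-order model: err ~ (dx/dt) * dt, so the error amplitude is
# proportional to the signal derivative even for purely random jitter.
pred_rms = 2 * np.pi * f0 * sigma_t / np.sqrt(2)
print(f"measured error RMS:  {err.std():.3e}")
print(f"predicted error RMS: {pred_rms:.3e}")
```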

I assume that digital receiver chip designers generally know what they are doing, so jitter on the SPDIF link is much less important (yet this is what people often seem to get excited about, perhaps because it is easier to see?). Link-induced jitter tends to be high frequency, which the receiver PLL can filter out. Transmitter jitter of course depends on the quality of a crystal oscillator.
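To put a rough number on what the receiver PLL buys you, here is a toy first-order jitter-transfer model. The 1 kHz loop bandwidth is purely my assumption; real receiver chips vary widely:

```python
import math

def pll_jitter_attenuation_db(f_jitter_hz: float, loop_bw_hz: float) -> float:
    """First-order low-pass model of a clock-recovery PLL's jitter transfer:
    jitter below the loop bandwidth passes through; jitter above it is
    rolled off at roughly 20 dB/decade."""
    h = 1.0 / math.sqrt(1.0 + (f_jitter_hz / loop_bw_hz) ** 2)
    return 20.0 * math.log10(h)

loop_bw = 1_000.0   # assumed 1 kHz loop bandwidth
for fj in (100.0, 1_000.0, 10_000.0, 100_000.0):
    print(f"jitter at {fj:>9.0f} Hz: {pll_jitter_attenuation_db(fj, loop_bw):6.1f} dB")
```

So high-frequency link-induced jitter is strongly attenuated, while low-frequency jitter (including the transmitter's oscillator drift) passes straight through.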
 
DF, be careful not to conflate (as many of the posters do) jitter at the DAC clock pin with jitter in other places in the circuit (like the data input), which is what the multi-nanosecond figures refer to. The jitter requirements at the DAC are quite stringent, as you have calculated!
The multi-nanosecond figures that you refer to from that study are quite meaningless, as you would realise if you read it with any critical faculties whatsoever!
 
I must admit I found all these jitter numbers rather too small to believe. Then I did a rough back-of-envelope calculation and found that for 16 bits a wild estimate of allowable jitter is 350ps - much smaller than I expected!

Why would the jitter numbers be a function of the bit-width of the system? Certainly Hawksford makes a stab based on the quantisation error of the converters, but to me this is a red herring. Are you doing the same? To assume (for example) that the timing error should be lower than 1 LSB sampling error at 20kHz is to also swallow that there's no resolution of the system below the LSB - that any signal whose amplitude is below the LSB is by definition inaudible. I myself find this assumption highly questionable when the system is competently dithered.
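Since that is the crux, here is a small sketch (mine, all parameters assumed) of what competent dither does: a tone at a quarter of an LSB rounds away entirely without dither, but survives TPDF-dithered quantisation.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f0, bits = 48_000.0, 1_000.0, 16
lsb = 2.0 / 2 ** bits                 # full scale assumed to be +/-1
n = np.arange(1 << 18)
ref = np.sin(2 * np.pi * f0 * n / fs)
x = 0.25 * lsb * ref                  # tone 12 dB below one LSB

def quantize(sig, dither):
    # TPDF dither: difference of two uniform randoms, triangular PDF,
    # +/-1 LSB peak.
    d = (rng.random(sig.size) - rng.random(sig.size)) * lsb if dither else 0.0
    return np.round((sig + d) / lsb) * lsb

for dither in (False, True):
    y = quantize(x, dither)
    amp = (y @ ref) / (ref @ ref)     # coherent amplitude estimate
    print(f"dither={dither}: recovered tone = {amp / lsb:.3f} LSB")
# Undithered, the 0.25 LSB tone rounds to silence; dithered, it survives
# quantisation, buried in (but recoverable from) the noise floor.
```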
 
I assume that digital receiver chip designers generally know what they are doing, so jitter on the SPDIF link is much less important (yet this is what people often seem to get excited about, perhaps because it is easier to see?).

This could well be. It's also difficult for a consumer to determine whether his DAC's limitations lie with the clock signal, and even harder to go in and do something about it. But transmitted jitter, well, that's just (literally) a black box that anyone can buy, irrespective of whether it has any effect on the recovered analog signal. Thus, perfect for audiophile attention (analogous to "designer" power cords).
 
It was 1ns (1000ps). LOL

Yeah? Where does it say that?

I'll tell you where - NOWHERE.

Why don't you people READ the study? You're going to use what Telstar says to run me down, but you haven't even bothered to check its accuracy. Where he got the information from, I have no idea, but it wasn't from the study document, and I, for one, am not interested in hearsay.

All the rest of the stuff you've got is guesswork. These people conducted tests. These tests showed a detection threshold of ~500 ns. None of the participants could detect jitter of 250 ns.

"Oooh, I wouldn't trust that study." No, YOU wouldn't, because you're not interested in evidence, you're interested only in riding your hobbyhorse.

You're all running me down for accepting the evidence. What? I should instead take the word of somebody whose designs extend only to DC, and in that case only as far as wiring in a battery?

This study was undertaken by 7 men from respected Japanese institutions, ranging from universities to broadcasting corporations; I think it unlikely that between them they would fail to discover a major flaw in their test regime. This study is based on evidence of what people can actually hear.

Now I will say this, not for the first time: I could be wrong about this, but not on the basis of any evidence you have provided so far. It's all just a load of audiophoolery attempting to exploit a grey area which arises from the impossibility of demonstrating that something doesn't exist.

This is anyway nothing to do with the topic of the thread, which is RF & Audio, not Jitter. If you've got anything new to say, then out with it, otherwise pipe down. I don't appreciate having to re-read that document numerous times to discover that somebody is trying to discredit it with a lie.

w
 
Waki, you cited the paper & now you claim it may be a load of audiophoolery. Really? Please tell me: what was the base level of jitter inherent in the system? 1ps? 1ns? 100ms? Any idea?

What equipment (DAC/amp/speakers) did they use for the listening tests - a PC soundcard & internal speakers, a boombox, dCs? Any idea?
 
base level of jitter inherent in the system
I'm not sure this is a meaningful concept. The only jitter which really matters is the jitter on the data transitions at the DAC output. This in turn is likely to depend in a non-trivial way on jitter in the data feed to the DAC or other earlier stages. For example, high frequency jitter earlier on can be reduced by a good PLL, so is not a problem unless really excessive.

I agree that it may be naive to equate jitter error with quantisation error. It could be better or worse. That is why my estimate of 350ps for 16 bits was hedged about with words like "rough" and "back-of-envelope"; these were intended to sound a warning.
 
I'm not sure this is a meaningful concept. The only jitter which really matters is the jitter on the data transitions at the DAC output. This in turn is likely to depend in a non-trivial way on jitter in the data feed to the DAC or other earlier stages. For example, high frequency jitter earlier on can be reduced by a good PLL, so is not a problem unless really excessive.
"A good PLL" - what PLL is used in the test? What are it's transfer characteristics? Does the DAC add it's own level of jitter to the output? What is the ancilliary equipment used to evaluate the listening test? This paper is so flawed it's laughable that Waki would even cite it - he being a high speed digital designer & everything :)
 
You are making my point for me. There is no inherent base level of jitter. Each point in the system has its own level of jitter. Provided data integrity is maintained, the only jitter which actually matters is the jitter at the DAC output.

I was not citing any paper. There seems to be more heat than light in this thread at present. Can we all calm down, and stop hurling insults and quoting gurus?
 
Just a thought, as I see digital aspects of audio following the same path as analogue, with pet theories that don't follow good engineering practice, so...
There is a wealth of information out there on signal integrity, RF shielding etc. Instead of trying to cure the symptoms (i.e. using mismatched cables to transfer digital signals), why not use good engineering practice to avoid the problems in the first place? If I were going to mod my SPDIF interface I would replace the RCA connectors with good quality 75 ohm BNCs or similar at either end and use a good quality 75 ohm cable. That will solve one of the biggest problems, impedance mismatches (some rough numbers in the sketch below), and improve the quality of the signal going down the wire!
Even better, I would put my DAC next to the source and do away with the cable altogether: better signal integrity (less jitter, less ISI and any other SI problem you can think of) and less RF pickup.
And slower rise times, but of course that's only suggested by people like Howard Johnson and Eric Bogatin.
Again, a lot of the papers cited are from 17+ years ago, when a 20-30MHz clock was considered fast; now we run DDR memory with 300+MHz clocks. Digital design has moved forward, so I would have thought there were more modern papers on the digital aspects of audio?
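A quick illustration of the mismatch point, using the standard voltage reflection coefficient gamma = (ZL - Z0)/(ZL + Z0). The ~35 ohm figure for a typical RCA connector is only a rough assumption for the sake of the example:

```python
import math

def reflection(z_load: float, z0: float = 75.0):
    """Voltage reflection coefficient and return loss at an impedance step."""
    gamma = (z_load - z0) / (z_load + z0)
    rl_db = float("inf") if gamma == 0 else -20.0 * math.log10(abs(gamma))
    return gamma, rl_db

for name, z in (("75 ohm BNC into 75 ohm cable", 75.0),
                ("typical RCA (~35 ohm, assumed)", 35.0)):
    g, rl = reflection(z)
    print(f"{name}: gamma = {g:+.3f}, return loss = {rl:.1f} dB")
```

A mismatched RCA bounces a substantial fraction of each edge back down the cable; a properly terminated 75 ohm BNC reflects essentially nothing.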
 
Just a thought, as I see digital aspects of audio following the same path as analogue, with pet theories that don't follow good engineering practice, so...
Sorry? Where do you get this from - please explain?
What JosephK is investigating is the inter-symbol interference that gives rise to jitter on slower-speed SPDIF links, i.e. the SPDIF recovered clock is modulated & this modulation is correlated to the data signal. Which bit of this is not good engineering practice?

As I said, maybe this should be hived off to a separate thread as it is beginning to stray from the topic.
 
ISI is covered under signal integrity, and if you are getting ISI on the relatively slow SPDIF interface you are doing something very, very wrong with the interconnections.
I think maybe you mean deterministic jitter of a data-dependent nature.
If you have ISI then you need to sort out the interface to remove it. It is quite often caused by bandwidth limiting, and can be seen quite clearly as the back end of the pulse extending, but I have only seen it on quite high-speed signals (a rough sketch of the mechanism follows below).
Again I would suggest that getting the cable and connector impedances right would get rid of a lot of the problems, and making the circuitry immune to RF interference would pay dividends. At the end of the day, one of the main points of a digital interface for transferring data is its immunity to noise (unlike analogue), so as long as the bit pattern can be re-created exactly from one device to another you have no problems. Bits is bits, I'm afraid to tell you: some waveforms look like dogs when they get to the receiver, but as long as the timing budget is met and the signals are monotonic in the rise or fall region, you're pretty certain to be able to recreate the data exactly (the fact that I am typing this on a PC proves the point). That's where signal integrity comes in, and if we can get a DDR or gigabit Ethernet interface to work, then getting the low speeds of SPDIF to work perfectly should be no problem. Minimise all the interconnect effects, then you can concentrate on reducing any other detrimental effects, such as variations in the crystal's frequency, other interference sources, power integrity etc.
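Here is that sketch: a toy simulation (all parameters are my assumptions, and a one-pole low-pass stands in for a band-limited link) showing band-limiting converting an NRZ bit pattern into data-dependent timing error at the receiver's slicer:

```python
import numpy as np

rng = np.random.default_rng(1)
bit_rate = 2.8224e6       # roughly an S/PDIF biphase rate at 44.1 kHz (assumed)
osr = 64                  # simulation samples per bit
fs = bit_rate * osr
f3db = 2.0e6              # deliberately low cutoff to provoke visible ISI

data = rng.integers(0, 2, 2000)
x = np.repeat(2.0 * data - 1.0, osr)      # +/-1 NRZ waveform

# One-pole IIR low-pass standing in for a band-limited cable/receiver.
a = np.exp(-2 * np.pi * f3db / fs)
y = np.empty_like(x)
acc = 0.0
for i, v in enumerate(x):
    acc = a * acc + (1.0 - a) * v
    y[i] = acc

# Zero crossings (the slicer's decision instants), linearly interpolated,
# compared against the ideal bit-edge grid.
idx = np.where(np.signbit(y[:-1]) != np.signbit(y[1:]))[0]
idx = idx[idx > 8 * osr]                   # skip the start-up transient
t_cross = (idx + y[idx] / (y[idx] - y[idx + 1])) / fs
dev = t_cross - np.round(t_cross * bit_rate) / bit_rate
dev -= dev.mean()                          # remove the constant group delay
print(f"data-dependent jitter, pk-pk: {np.ptp(dev) * 1e9:.2f} ns")
```

The crossing times shift depending on how long the line sat at the previous level, which is exactly the data-correlated clock modulation being discussed.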
Get the basics right and the rest will follow, instead of trying to band-aid the basics when they are wrong.
As for explanations of some weird myths: I have seen somewhere the comment that making a cable longer will mitigate reflection effects! I will try and find the link at some point.
So back to RF: the best way is to keep it out of a system, and after that to control it if it does get in, by good PCB layout (also the best way of stopping it being created in the first place).
Have Fun
 