John Curl's Blowtorch preamplifier part II

I would take care in 'believing' in the magic of dither. I will not say that it is not a help in making digital useful and 'tolerable', but it appears to fool the test equipment, not the human ear. At least that is my take on it.

Dither's purpose is definitely *not* to fool test equipment: it actually removes *signal-correlated* quantization noise from digitized signals, at the expense of an increase in the uncorrelated (i.e. random, i.e. white) noise floor, which you can easily get rid of by means of averaging (in test equipment, at least). Take a look at these pictures (taken from Microwave&RF's site):

[Externally hosted image no longer available.]


Which one would be - by eye :) - more 'tolerable' to the human ear?
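
If anyone wants to reproduce that comparison numerically rather than rely on the dead image link, here is a minimal numpy sketch; the sample rate, bit depth, tone level and frequency are my own illustrative choices, not taken from the Microwave&RF article:

```python
import numpy as np

fs, n, lsb = 48000, 65536, 2.0 / 2**16            # sample rate, FFT length, 16-bit step
t = np.arange(n) / fs
x = 0.25 * np.sin(2 * np.pi * 997.0 * t)          # -12 dBFS test tone

def quantize(sig, dither=False):
    # +/-1 LSB TPDF dither (difference of two uniform variables), added before rounding
    d = (np.random.rand(n) - np.random.rand(n)) * lsb if dither else 0.0
    return lsb * np.round((sig + d) / lsb)

for label in ("undithered", "dithered"):
    err = quantize(x, dither=(label == "dithered")) - x
    spec = np.abs(np.fft.rfft(err * np.hanning(n)))
    # correlated (harmonic) error shows up as a large peak-to-mean ratio;
    # dithered error is essentially flat
    print(label, round(20 * np.log10(spec.max() / spec.mean()), 1), "dB peak/mean")
```

The undithered error spectrum shows discrete spikes (harmonics of the tone); the dithered one is a flat, slightly higher floor - exactly the trade described above.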

L.
 
Coluke, good explanation.:)

I was under the impression that the dither proposal was made by SY for the measuring process only (i.e. what it does, not how it sounds).

The key issue of “properly bandlimited, properly dithered A/D/A conversion” http://www.diyaudio.com/forums/analog-line-level/146693-john-curls-blowtorch-preamplifier-part-ii-1412.html#post2643045 is more of a concern in measuring processes IMHO, as measurements tend to extend well beyond the audio range and noise shaping (pushing the noise to higher frequencies, outside the desired spectrum) becomes a difficult task.
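
For what it's worth, the core of noise shaping is just error feedback around the quantizer; a minimal first-order sketch (the function name and usage are mine, purely illustrative):

```python
import numpy as np

def quantize_first_order_shaped(x, lsb):
    """Quantize with first-order error feedback: each sample's quantization
    error is added back before quantizing the next sample, which pushes the
    error energy away from DC and toward Nyquist."""
    y = np.empty_like(x)
    err = 0.0
    for i, s in enumerate(x):
        v = s + err
        y[i] = lsb * np.round(v / lsb)
        err = v - y[i]
    return y
```

Compared with plain rounding, the low-frequency part of the error drops at the cost of more error energy near Nyquist - which is why it stops helping once the measurement band itself extends close to Nyquist, as noted above.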

I haven’t yet looked at what happens within the pass band when out-of-band signals are folded back due to brickwall LP filter inefficiencies (or is it something else?). I am talking here about measurements using typical amateur gear (sound-card A/D conversion).

Which user-accessible conversion parameters affect this folding back?
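
On the fold-back itself: whatever leaks past the anti-alias filtering lands at its distance from the nearest multiple of the sample rate, so the main user-accessible knob is the sample rate. A small illustrative helper (not tied to any particular sound card):

```python
def alias_frequency(f_in, fs):
    """Apparent frequency of a tone at f_in after sampling at fs."""
    f = f_in % fs
    return min(f, fs - f)

# A 30 kHz residual sampled at 48 kHz folds back to 18 kHz, inside the audio band;
# at 96 kHz sampling the same residual stays at 30 kHz, well outside it.
print(alias_frequency(30_000, 48_000))   # 18000
print(alias_frequency(30_000, 96_000))   # 30000
```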


Chris, http://www.diyaudio.com/forums/analog-line-level/146693-john-curls-blowtorch-preamplifier-part-ii-1412.html#post2643259
This is quite correct. Any noise added outside the quantisation process does not add to measurement sensitivity, resolution, DR, or SNR. (*PS)

George

(*PS) Such noise, though, might do a similar trick in the FFT-type processing of the sound-perception stage within our ears or brain, but that's another topic.
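
George's point about where the noise is added can be shown numerically. A minimal sketch, with my own illustrative tone level, dither type and FFT length: noise added before quantization preserves a sub-LSB tone, while the same noise added after quantization does not.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1 << 15
t = np.arange(n)
x = 0.3 * np.sin(2 * np.pi * 517 * t / n)            # tone of 0.3 LSB amplitude
tpdf = lambda: rng.random(n) - rng.random(n)          # +/-1 LSB TPDF noise

before = np.round(x + tpdf())        # noise added BEFORE the quantizer (dither)
after = np.round(x) + tpdf()         # same noise added AFTER (round(x) is all zeros)

for name, y in (("dither before", before), ("noise after", after)):
    spec = np.abs(np.fft.rfft(y))
    print(name, round(20 * np.log10(spec[517] / np.median(spec)), 1),
          "dB tone above median noise")
```

Only the first case carries the tone; adding noise downstream of the quantizer just raises the floor.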
 
I would take care in 'believing' in the magic of dither. I will not say that it is not a help in making digital useful and 'tolerable', but it appears to fool the test equipment, not the human ear. At least that is my take on it.

Just curious: did you ever listen to Werner Ogiers's excellent dithering demonstration? I know it involves looking through the telescope, but trust me, lightning will not strike you. There's no magic involved, just basic math and physics, and the results speak for themselves, IF you bother to listen.
 
The ground disconnect switch is for "opening" the potentially shared connection when a scope is connected to the monitor output and there are potential ground loops.
I know.
If the signal never leaves the box except for a cable from the output to the input I don't see how a transformer would make a difference.
Examine the circuit in more detail.

The output jack has a signal terminal and a ground terminal. When a 1 volt signal is generated, the signal terminal will have 1 volt with respect to the internal ground. The differential receiver input will see that 1 volt with respect to its own internal ground reference, so it will draw current from that signal line, and that current will go to the chassis internal ground.

On the other hand, the shield of the cable under test will see zero volts with respect to the differential receiver's ground reference. As such, the differential input it is connected to will also see zero volts, and no current will be drawn through the shield in direct opposition to the current being drawn through the core wire.

While the cable dielectric will indeed see the voltage differential, and its capacitance will charge and discharge, the current through the core wire resulting from any loading at the receiver end will not be equal and opposite to the shield current. Indeed, the instrument is not even configured to see or measure that charging current other than as an IR drop in the drive voltage at the output attenuator. Range changes will also affect that aspect.

In other words, the IC is being tested as a simple loop of wire with NO effective shield other than a Faraday extension.

This is why the test setup is extremely sensitive to external fields as well as to wire placement: the test instrument has no ability to cancel any loop-pickup voltage generated by the test cable in its environment. The setup does not use the coax as a coax.
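
To put a rough number on that sensitivity: an open loop picks up stray fields per Faraday's law, EMF ≈ 2·pi·f·B·A for a sinusoidal field. A back-of-envelope sketch with assumed (purely illustrative) field strength and loop area, not measurements of the actual setup:

```python
import math

f = 60.0     # Hz mains field (assumed)
B = 1e-7     # tesla of stray field near the bench (assumed)
A = 0.05     # m^2 effective loop area between core and chassis return (assumed)

emf = 2 * math.pi * f * B * A
print(f"induced EMF ~ {emf * 1e6:.1f} uV")   # ~1.9 uV, i.e. about -114 dB re 1 V
```

Numbers of that order sit right where very low-level artifacts are being chased, and they move with cable placement, which is why placement matters so much in this kind of test.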

Please explain the uncontrolled current path from BNC out to differential input. I may be as dense as depleted uranium, so I need help here. All I see is the source and the differential inputs. The shield on the coax is tied to the shield of the BNC and to the minus terminal of the differential input, the hot to the plus terminal. What is the source of the uncontrolled current?

As explained above.

A transformer will isolate at DC and have capacitive coupling at AC, so it's no panacea. And transformers capable of -120 dB harmonics are hard to find.
Once the ground loop is broken by a transformer, the current in the core of the coax will go through its resistive load at the receiver (+) input to the chassis ground connection, then through the (-) input's resistor and back through the shield to the output transformer. This is instead of the current topology where the shield carries NO current at all and the return current flows through the internal chassis ground.

As for capacitive coupling, there are methods to lower that. Apparently AP has done so.

I have not attempted to duplicate John's cable distortion tests. I have too many other puzzles to pursue.
Duplication of a non-rigorous and flawed test methodology to obtain non-repeatable results is indeed not worth the time or the effort.

Cheers, John
 
Dither, on the other hand, is added during the quantization process because it allows signals smaller than the smallest quantization step to be "kept" as information. Without dither, smaller signals would be lost. With dither, the smaller signals are kept, although they may be below the new noise level. Averaging over longer periods of time favors (repetitive) signal over (random, non-repetitive) noise.

To sum up, dither *makes possible* low-level signal measurements, but sometimes at the expense of having to look for a longer time.
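
A quick numerical illustration of that trade (the tone level, bit depth and number of averages are my own arbitrary choices): a tone a quarter of an LSB in amplitude rounds to exactly zero without dither, but survives dithered quantization, and averaging spectra over time makes it stand cleanly above the raised floor.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n = 48000, 8192
t = np.arange(n) / fs
x = 0.25 * np.sin(2 * np.pi * 997 * t)               # tone of 1/4 LSB amplitude (LSB = 1)

print("undithered capture is all zeros:", not np.round(x).any())

avg = np.zeros(n // 2 + 1)
for _ in range(200):                                  # average 200 dithered captures
    y = np.round(x + (rng.random(n) - rng.random(n))) # +/-1 LSB TPDF dither
    avg += np.abs(np.fft.rfft(y * np.hanning(n))) ** 2
tone_bin = round(997 * n / fs)
print("tone vs median noise after averaging:",
      round(10 * np.log10(avg[tone_bin] / np.median(avg)), 1), "dB")
```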
I recall the use of a triangular dither signal with a peak-to-peak value of 1 LSB added to the signal of interest. When this is done, the summed signal will always cross one of the transition levels. This results in a duty-cycle modulation of the comparator output. A 50% duty cycle means the DC signal is exactly on one of the thresholds; 1% duty means it is almost exactly between transitions.
edit: had to fix the explanation, I messed it up a tad..:(
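
A small sketch of that duty-cycle behaviour, assuming an ideal triangle wave of exactly 1 LSB peak-to-peak and a single comparator threshold at 0, everything expressed in LSBs (illustrative only):

```python
import numpy as np

def duty_cycle(dc_level, samples=100_000):
    """Fraction of a dither period for which (dc_level + triangle) exceeds
    the comparator threshold at 0 LSB."""
    phase = np.linspace(0, 1, samples, endpoint=False)
    tri = 2 * np.abs(phase - 0.5) - 0.5      # triangle wave spanning -0.5..+0.5 LSB
    return float(np.mean(dc_level + tri > 0))

for dc in (0.0, 0.1, 0.25, 0.49):
    print(f"{dc:+.2f} LSB -> {100 * duty_cycle(dc):.0f} % duty")
# 0 LSB (on the threshold) gives 50 %; the duty cycle then moves linearly,
# reaching ~100 % (or ~0 % for negative offsets) half an LSB away, i.e.
# midway between transitions.
```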

Nordmark actually measured the impact time dither had on lateralization resolution in human subjects. Attached is one of his graphs. Note that the dither is actually in microseconds.

Cheers, John
 

Attachment: lateralization graph.jpg
I love it! Make a test, find harmonic distortion where it should not be, and the messenger is decried as a fool. Sound familiar, anyone?
About 1000 years ago, a supernova appeared in the heavens. However, NOBODY reported it, at least nobody in western civilization reported it. The Chinese noted it, but then they had a different set of rules to go by. What I see is what I report, nothing more, nothing less.
When it comes to dither, I think that it is a great help, BUT not the total answer to high fidelity reproduction. It is a 'tool', like Aphex is a 'tool', to attempt to optimize less than optimum audio reproduction. I think that dither 'hides' less than optimum signal capture, wonderfully, but is that all there is to create super fi? Apparently so, to some ears.
 
ITD ???

not too relevant as far as I can see - the ITD tests define "jitter" as the advance/delay applied to an entire bandpass filtered pulse, changing values only for the next pulse at up to high single digit hundreds of pulses per second

additive amplitude dither changes value randomly every sample; for the ms-time-scale pulses of the ITD tests there is no net shift from TPDF amplitude dither
noise-shaped additive amplitude dither has even less spectral energy sub/low kHz

in fact a later ITD paper adds ~ -60 dB amplitude noise to mask the possible time delay filtering amplitude/offset error correlations




as far as low level distortion - scientific instrument designers, grad students spending years on experimental apparatus for their research, detector teams at CERN, etc. have little agenda re what works, in physics many won't even have a very complete EE education so may not be "contaminated" by "conventional engineering" prejudices and will dig into corners of their electronic instrumentation's noise, distortion limits with the only goal being better data for their research
they publish their results, eventually the knowledge reaches even the EE textbooks, certainly the specialist works such as Ott, Brown, Morrison, journals such as IEEE Review of Scientific Instruments, Journal of Experimental Physics…
 
To quote you, "Duh!" There is no one "total answer." A microphone does a poor job of amplifying. Class A operation doesn't apply to speakers. But dither extends the resolution of digitized signals, and is no more of a "patch" than bias in analog tape. It has nothing whatever to do with Aphex. Your "observations" are stuck in 1978.

And as usual when trying to describe science history, you are incorrect. SN1006 was observed and noted in Egypt (then the center of Western science) and Europe. Not that any of this is relevant, but you do seem to like telling stories, and it's not nice to teach people incorrect things.
 
I love it! Make a test, find harmonic distortion where it should not be, and the messenger is decried as a fool. Sound familiar, anyone?

Stop playing the martyr, John.

While it can be construed as "foolish" to embark on a long trek down a road based on the results of an ill-conceived and poorly designed test setup, "foolish" to explain the resultant output with a ridiculous and contrived explanation that defies all scientific knowledge, and "foolish" to ignore others who have repeatedly detailed the errors in the test setup you created... I have not called you a fool.

In point of fact, I have detailed at length what your setup is doing, why it is doing it, how to get around it, and....why your test and results are actually of importance.

You have accidentally backed into a ground-loop-based issue, which is exactly what single-ended input systems do as well.

So while there are no microdiodes, and you are not actually measuring some nefarious result of poor contacts in the milliohm range nor any other metal nonlinearity, the work is not wasted once a real understanding is there.

Cheers, John
 
not too relevant as far as I can see - the ITD tests define "jitter" as the advance/delay applied to an entire bandpass filtered pulse, changing values only for the next pulse at up to high single digit hundreds of pulses per second

Yo, dude!! He wrote about jitter; honestly, I don't know how he obtained it.

I figured it was consistent with the discussion, so don't shoot me... :D

edit..oh sorry, I forgot the context.. I've read some research which seemed to indicate humans used zero crossing for timing information..adding a dither signal would cause jitter in the zero crossing..so I therefore considered Nordmark as consistent with the discussion..
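
For scale, a back-of-envelope estimate of how much zero-crossing 'jitter' ±1 LSB TPDF amplitude dither adds to a full-scale 16-bit 1 kHz tone (my own illustrative numbers, not taken from Nordmark's paper):

```python
import math

f, full_scale_lsb = 1000.0, 2 ** 15            # 1 kHz tone, 16-bit full scale
slope = 2 * math.pi * f * full_scale_lsb       # sine slope at the zero crossing, LSB/s
dither_rms = math.sqrt(1 / 6)                  # RMS of +/-1 LSB TPDF dither, in LSB
print(f"zero-crossing jitter ~ {dither_rms / slope * 1e9:.1f} ns RMS")   # ~2 ns
```

That is roughly three orders of magnitude below the microsecond-scale lateralization thresholds in Nordmark's graph, which is why the dither-induced jitter seems unlikely to matter for ITD.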

as far as low level distortion - scientific instrument designers, grad students spending years on experimental apparatus for their research, detector teams at CERN, etc. have little agenda re what works, in physics many won't even have a very complete EE education so may not be "contaminated" by "conventional engineering" prejudices and will dig into corners of their electronic instrumentation's noise, distortion limits with the only goal being better data for their research
they publish their results, eventually the knowledge reaches even the EE textbooks, certainly the specialist works such as Ott, Brown, Morrison, journals such as IEEE Review of Scientific Instruments, Journal of Experimental Physics…
Um, I'm not sure how to respond??

My point was, JC's "distortion" is an artifact of a bad test setup. And newer equipment goes lower yet sees nothing.

edit: ah, now having read Scott, I understand your gist..sorry 'bout that. I concur. Even AP has gotten significantly better in the 30 years or so??

Cheers, John
 
Where oh where does the HARMONIC DISTORTION come from? Not EMI, not hum or coupling with other equipment, just plain old HARMONIC DISTORTION. IF the distortion is created by some mysterious path in the test equipment, then why does an apparently similar cable, you know: same length, same characteristic impedance, same geometry, etc., measure differently from another cable, even when put in the same layout and spacing in the test? Of course, sometimes this sort of test can be contaminated by airborne EMI, but so what? THAT can be ignored when evaluating the test, because it is almost impossible for that EMI to have the SAME frequencies as the HARMONICS. If they did, then a slight change in the fundamental would bring this out.
JN, as you once 'trashed' Dr. Hawksford's work, so do you now go after mine. However, I am not certain if you have EVER had any audio test equipment and are actually qualified to judge the work of others from your desk.
 
as far as low level distortion - scientific instrument designers, grad students spending years on experimental apparatus for their research, detector teams at CERN, etc. have little agenda re what works, in physics many won't even have a very complete EE education so may not be "contaminated" by "conventional engineering" prejudices and will dig into corners of their electronic instrumentation's noise, distortion limits with the only goal being better data for their research
they publish their results, eventually the knowledge reaches even the EE textbooks, certainly the specialist works such as Ott, Brown, Morrison, journals such as IEEE Review of Scientific Instruments, Journal of Experimental Physics…

This message never seems to get any response. I have worked with many customers in many fields from geoscience to medical and have never run across a single instance of any of these audio "mysteries".
 
Where oh where does the HARMONIC DISTORTION come from? Not EMI, not hum or coupling with other equipment, just plain old HARMONIC DISTORTION. IF the distortion is created by some mysterious path in the test equipment, then why does an apparently similar cable, you know: same length, same characteristic impedance, same geometry, etc., measure differently from another cable, even when put in the same layout and spacing in the test? Of course, sometimes this sort of test can be contaminated by airborne EMI, but so what? THAT can be ignored when evaluating the test, because it is almost impossible for that EMI to have the SAME frequencies as the HARMONICS. If they did, then a slight change in the fundamental would bring this out.

You obtain results which are not repeatable using equipment which is far more sensitive.

Your equipment's design flaws and test limitations have been pointed out for years now.

Your setup does not test a coaxial cable as a coaxial cable; the return current flows through your equipment chassis.

You need to look within the equipment to find where the harmonics are coming from, as that is where they are being generated. Not the wire.
JN, as you once 'trashed' Dr. Hawksford's work, so do you now go after mine.
I detailed at length the HUGE flaws in his paid-for but not peer-reviewed article. I specifically explained the theoretical flaws in what he wrote. And I was nice about it. Some of his peers really trashed him, no holds barred..very ugly.

However, I am not certain if you have EVER had any audio test equipment and are actually qualified to judge the work of others from your desk.
You're not certain about much, are you.

Stay on topic, discuss the technical flaws that your test setup has.

Cheers, John
 
Thank you, SY, I did not know that the Egyptians noted it somewhere. How about everyone else; why was it not in the 'chronicles' or even the oral history of all of Western society?

It was.

Remember what was going on in Europe at that time. There's a reason it was called the Dark Ages. Western Civ was centered in the Middle East and North Africa. So there were no "astronomers" in Northern Europe; nonetheless, there were some recorded observations originating in what was later to become the Holy Roman Empire.
 
JCX, I think you have found an important 'clue'. You have to TRY things, have a goal to measure something, in this case. You have to try to find any flaws in your test set-up. However, you should NOT 'throw the baby out with the bath water'. You want to find the SOURCE of what you find, and not bury it by accusing yourself of: 'delusion, hallucination, group hallucination, mass hallucination, mere coincidence, sheer coincidence or sloppy research'. '-)
 