John Curl's Blowtorch preamplifier part II

Status
Not open for further replies.
Well, there is a certain curiosity. To see if there's any "there" there.

Here's an example for which I can muster up precisely zero, nada curiosity. Can you? (It's from Stereophile's report on CES, incidentally, and it concerns a USB cable, i.e. digital.)

A key feature of the Diamond USB, which is held in the photo by Audioquest’s Andrew Kissinger, is the Audioquest DBS (dielectric bias system). Invented and patented by Richard Vandersteen, with the cable version co-patented by Vandersteen and Audioquest’s Bill Low, the DBS creates an electrostatic field that saturates and polarizes the molecules of the insulation to minimize energy storage in the dielectric. The result is claimed to be much greater dynamic range, lower background noise, and reduced phase distortion.

Steve Silberman, VP of Marketing, explained that all insulators have capacitance. Energy from the conductor enters the insulation and needs to discharge. The DBS’ electrostatic field lowers the discharge, which in turn lowers the amount of phase distortion and makes for a cleaner signal.

In a very short demo, Silberman compared music through a stock USB cable that came with his printer to music through the Diamond. Using the new Arcam R asynchronous USB DAC, Arcam AVR 600 receiver, AQ Niagra interconnects ($1600/1m pair), AQ Redwood speaker cables ($2300/3ft pair), and Vandersteen 2Ce 30th anniversary edition speakers, the difference in transparency and color was striking.
:devily:
 
It seems to me that the first thing that happens in a digital circuit is that the signal is reshaped by a Schmitt trigger and reclocked in a buffer register. That's one of the beauties of digital circuits: they either work perfectly or they don't work at all. Unlike analog circuits, there are no in-betweens. The high-end aftermarket consumer audio cottage industry makes a lot of money from people who don't know that.
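The reshaping step can be sketched with a toy hysteresis comparator (a minimal sketch; the 1.0 V / 2.0 V thresholds and the sample values are illustrative assumptions, not any real part's specs):

```python
def schmitt(samples, v_low=1.0, v_high=2.0):
    """Regenerate a noisy logic signal via hysteresis: the output only
    flips when the input crosses the upper or lower threshold, so noise
    riding between the thresholds is ignored."""
    out, state = [], 0
    for v in samples:
        if state == 0 and v >= v_high:
            state = 1
        elif state == 1 and v <= v_low:
            state = 0
        out.append(state)
    return out

# A degraded logic edge with ripple around 1.5 V never toggles the output:
noisy = [0.2, 0.4, 1.4, 1.6, 1.3, 1.7, 2.8, 3.1, 3.0, 1.8, 1.2, 0.3]
print(schmitt(noisy))  # -> [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 0]
```

As long as the noise stays inside the hysteresis band, the regenerated levels are exactly right or the circuit simply fails; there is no gradual degradation.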
 
To me, this ('cable harmonic distortion') is completely the wrong track. Why don't we speak instead about EMI/RFI pickup into the various cables, and loop-induced interference voltage, which depend on the audio instruments and the audio system configuration? These issues are easy to measure, and every system gives different results. Yet this is not discussed; instead, strange 'distortions', non-existent or buried somewhere at -300 dB, are discussed.


You are right on track with this, cable function in environment(s) versus simply measuring the cable.

I would think measuring an audio cable in isolation is of limited value. Why don't we measure test leads, connectors, and AC line "dirt" to determine their contribution to "cable harmonic distortion"?
 
You're not really up to speed with contemporary digital interfaces then. The clock must be extracted from the data with both USB and SPDIF - this makes the interface immune to clock skew, a particularly insidious problem with parallel-clocked interfaces. So no reclocking 'in a buffer register' before the clock itself has been regenerated with a PLL.
 
You're not really up to speed with contemporary digital interfaces then. The clock must be extracted from the data with both USB and SPDIF - this makes the interface immune to clock skew, a particularly insidious problem with parallel-clocked interfaces. So no reclocking 'in a buffer register' before the clock itself has been regenerated with a PLL.

Bob Gendron drew spokes on a CD and they showed up in the clock spectrum of a cheap CD player (using a very expensive Agilent jitter analyser). I think it ended up in an AES presentation. Not to judge whether it matters or not, but I saw it and the experiment was valid.
 
As jcx points out, the time domain and freq domain are duals, provided we accept that the FFT's output is complex. However the FFT is normally presented as magnitude only, so some information is missing. The windowing function employed trades off between dynamic range and frequency resolution, so there's also the potential to miss something there.

The trend towards greater and greater frequency resolution in FFTs (and more and more time averaging) means time resolution suffers. So noise could be coming and going within the time window of interest (audible noise modulation) and this would be totally missed.
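The tradeoff is easy to put numbers on (a sketch; the 48 kHz sample rate and FFT sizes are arbitrary example values):

```python
fs = 48_000  # sample rate in Hz (arbitrary example value)
for n in (1 << 10, 1 << 16, 1 << 20):
    df = fs / n      # frequency resolution: one bin's width in Hz
    t = n / fs       # time resolution: the span of one FFT record in seconds
    print(f"N = {n:>7}: {df:9.4f} Hz/bin over a {t:8.3f} s record")
```

A 2^20-point FFT at 48 kHz resolves about 0.046 Hz, but each record spans almost 22 seconds; any noise that comes and goes inside that record is averaged into a single floor and its modulation is invisible.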

I did not mean this as a trick question. The foundation of Fourier's theorem is that any periodic waveform can be represented as a series of sine waves.

There are two types of problems with this approach. First, waveforms such as noise are not periodic. Secondly, there are periodic waveforms that are not properly sampled in normal practice.

When I look at FFT-displayed data I often see the "DC" or lowest-frequency "bucket" change on repeated runs of the same experiment. This can be noise, an actual DC change, or partial sampling of a very low-frequency waveform.

So the question is: are there other cases where the theorem is not properly applied?
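The DC-bucket behavior is easy to reproduce (a NumPy sketch; the 0.3 Hz tone and 1 s record are arbitrary choices): a component slower than one bin width dumps a phase-dependent amount of energy into bin 0, so repeated runs that catch the waveform at different phases give different "DC" readings.

```python
import numpy as np

fs, n = 1000, 1000                 # 1 s record -> bin width = 1 Hz
t = np.arange(n) / fs
for phase in (0.0, np.pi / 2):
    x = np.sin(2 * np.pi * 0.3 * t + phase)   # 0.3 Hz: slower than one bin
    dc = abs(np.fft.rfft(x)[0]) / n           # the "DC" bucket
    print(f"start phase {phase:.2f} rad -> DC bin reads {dc:.3f}")
```

The two runs see the same physical signal, yet the lowest bin reads noticeably differently; nothing distinguishes this from noise or a genuine DC shift without looking at longer records.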
 
You're not really up to speed with contemporary digital interfaces then. The clock must be extracted from the data with both USB and SPDIF - this makes the interface immune to clock skew, a particularly insidious problem with parallel-clocked interfaces. So no reclocking 'in a buffer register' before the clock itself has been regenerated with a PLL.

Oh no, not again. I hear the digital jitter boogeyman knocking on the door already. The perfect fix for all that ails mp3.
 
There are two types of problems with this approach. Waveforms such as noise are not periodic. Secondly there are periodic waveforms that are not properly sampled in normal practice.

Any waveform can be formally made periodic: even if it's random or quasi-random, mathematically it can be treated as periodic with a period equal to the sampling time.

And as long as one respects the Nyquist limit (i.e., bandlimit to fs/2), the second point is moot.

As for phase, we're talking about electronic devices- the phase is always the derivative of the magnitude unless you deliberately build in frequency-selective delay lines.
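The periodicity assumption is easy to see in practice (a NumPy sketch; the 1024-sample record and test frequencies are arbitrary): a tone that completes a whole number of cycles in the record transforms to a single bin, while one that doesn't is implicitly wrapped into a discontinuous "periodic" signal and leaks across many bins.

```python
import numpy as np

n = 1024
t = np.arange(n) / n               # one-second record, so bins are 1 Hz wide
for f in (100.0, 100.5):           # 100 Hz is periodic in the record; 100.5 Hz is not
    mag = np.abs(np.fft.rfft(np.sin(2 * np.pi * f * t))) / (n / 2)
    busy = int((mag > 0.01).sum()) # bins carrying more than -40 dB of the tone
    print(f"{f:6.1f} Hz: peak {mag.max():.3f}, bins above -40 dB: {busy}")
```

This is exactly why windowing exists: it forces the record's ends toward zero so that the implicit wrap-around is smooth, at the cost of the resolution tradeoff mentioned above.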
 
It isn't clear to me what these graphs are supposed to show. If this is meant to be the cable's response, including spuriously generated noise, it is obviously flawed data. I can't speak for the other cables, but as far as the RS cable is concerned, the apparent falloff of output starting around 19 kHz and falling at around 10 dB per decade can't be right. The proof was the insertion of the cable in the video signal path between a VCR output and a high-quality NTSC TV input. As I stated, I compared the TV set's own tuner on the same channel as the VCR's tuner using a cable feed. This test was similar to a shunt test. Not only would the RS cable have seriously degraded the video signal quality, the color burst signal wouldn't even have shown up; it would have been too degraded and weak to work at all. You can easily verify this yourself by repeating my test.

Also, the 15 dB spike at 16 kHz and the noise between 1 and 2 kHz would be clearly audible if the cable were used between a turntable output and a high-gain magnetic phono input while a record wasn't being played. In addition to the general signal boost of the magnetic phono preamp to bring it up to line level, there is a 15 to 20 dB additional boost at 16 kHz as the result of RIAA equalization.

The similarity of the three test results (noise between 1 and 2 kHz, the 16 kHz spike, and the high-end falloff) strongly suggests that the equipment or the test design is defective.

The data is shown through a 1,000 Hz notch filter. The level is reduced by at least 100 dB before it goes to the FFT analyzer. The roll-off is an artifact of the test equipment.

The item of interest is that the harmonics of the test signal are different for different cables. The 15,750 Hz spike is another test-equipment artifact.

That has all been mentioned before.

The valid question raised is: are we seeing repeatable results, and if we are, is it noise, connector issues, or something else? Is an audio interconnect treated as a system or as a piece of wire? Some treat it as just wire, which I think leaves out other causes of what may be a problem.

For example, dirty connectors are a well-known and often-encountered problem.

The micro diode issue I think has been retired.
 
In addition to the general signal boost of the magnetic phono preamp to bring it up to line level, there is a 15 to 20 dB additional boost at 16 kHz as the result of RIAA equalization.

I believe you have that exactly backwards. In playback there should be about 18 dB of attenuation at 16 kHz, so that spike would be pushed way down relative to 1 kHz.

Just for the record. (excuse the pun).
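The figure is easy to check from the standard RIAA playback time constants (3180 µs and 75 µs poles, 318 µs zero); a quick sketch:

```python
import math

def riaa_playback_db(f):
    """Ideal RIAA playback (de-emphasis) magnitude in dB at frequency f,
    from the standard time constants."""
    w = 2 * math.pi * f
    num = math.hypot(1, w * 318e-6)                              # zero at 318 us
    den = math.hypot(1, w * 3180e-6) * math.hypot(1, w * 75e-6)  # poles
    return 20 * math.log10(num / den)

rel = riaa_playback_db(16_000) - riaa_playback_db(1_000)
print(f"16 kHz relative to 1 kHz: {rel:+.1f} dB")  # about -17.7 dB
```

So the playback curve cuts 16 kHz by roughly 18 dB relative to 1 kHz; the 15 to 20 dB boost applies on the recording side, not in the phono preamp.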
 
Here's an example for which I can muster up precisely zero, nada curiosity. Can you? (It's from Stereophile's report on CES, incidentally, and it concerns a USB cable, i.e. digital.)
"Invented and patented by Richard Vandersteen, with the cable version co-patented by Vandersteen and Audioquest’s Bill Low, the DBS creates an electrostatic field that saturates and polarizes the molecules of the insulation to minimize energy storage in the dielectric. The result is claimed to be much greater dynamic range, lower background noise, and reduced phase distortion."

:devily:



I find it fascinating that a patent I was issued in 1994, 5,307,416, describes this technique. The merits of it, and its audibility, are a separate issue. However, believing that a digital link won't be influenced by various external effects is naive. If it is self-clocking it can be very affected by cable issues. The second question is whether the jitter and noise will be audible. Again, the issue has not been fully resolved, but it's easy to see that a digital link is a clear path for high-frequency noise to enter the analog stages, with possible effects.

I have looked for distortion in cables with a -170 dB floor and found nothing. However this may be the wrong way to look.
 
Any waveform can be formally made periodic: even if it's random or quasi-random, mathematically it can be treated as periodic with a period equal to the sampling time.

And as long as one respects the Nyquist limit (i.e., bandlimit to fs/2), the second point is moot.

As for phase, we're talking about electronic devices- the phase is always the derivative of the magnitude unless you deliberately build in frequency-selective delay lines.

SY

The Nyquist limit is certainly valid, but it must be combined with the number of samples. I can do a 256-point transform at 1 GHz and not see any audio. I had mentioned AC power-line variations caused by motor loading. These would all show up as DC noise unless you went looking for them.
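The 256-point-at-1-GHz case is a quick sanity check (a sketch):

```python
fs, n = 1.0e9, 256            # 1 GHz sample rate, 256-point transform
bin_width = fs / n            # Hz per FFT bin
record = n / fs               # seconds captured per transform
print(f"bin width = {bin_width / 1e6:.2f} MHz, record = {record * 1e9:.0f} ns")
# The entire 20 Hz - 20 kHz audio band (and any power-line variation)
# lands inside bin 0 of such a transform:
print(20_000 < bin_width)     # True
```

With roughly 3.9 MHz per bin and a 256 ns record, everything at audio rates is indistinguishable from DC.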

Averaging also reduces noise, but the noise may actually be the data!

As to frequency-selective delay lines, these are actually starting to show up in new amplifier compensation schemes derived from simulated designs. Reality will only intrude when actual construction is tried, or a different simulation engine is used.
 
Averaging also reduces noise, but the noise may actually be the data!

That's a reason I presented some of the Bybee data with no signal averaging. Not that it made any difference... What needs to be distinguished is source noise versus detector noise.

it must be combined with the number of samples.

Thus the first sentence of my post that you quoted. :D
 