I don't believe cables make a difference, any input?

Status
Not open for further replies.
Funny, I'll bet you won't find even a footnote about this in any nuclear or medical instrumentation journal, even though the signals are often much lower than MC phono. Some 3D geo-science is at 20+ bits and a massive number of channels. Bad image artifacts can cost millions.

Yes, there are a whole host of situations where it couldn't help but be noticed.

Just consider Audio Precision. The analog section of their System One and System Two uses output transformers with hundreds of feet of plain ol' ETP copper wire. Judging by John's measurements, it would have been simply impossible to create such a low-distortion measurement system.

se
 
Measuring the same cables at normal test levels of 1-3 V tended to remove these differences. This was a surprise, and it implied that it MIGHT be micro-diodes, formed by dissimilar metals and impurities in the wire, that created this extra distortion. In any case, the distortion appeared, or at least the equivalent of distortion in my test equipment appeared, with regularity.

John,

Although there are those who, based on their "Faith", believe conductors are perfect, I have also measured deviation from perfect. I used an under-20 mV twin tone as my excitation signal and measured distortion at -140 dB or lower. (AX 11/09) I think with lower-quality cables the results could be a bit worse.
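To make the twin-tone idea concrete, here is a minimal pure-Python sketch of how a third-order intermodulation product shows up in a two-tone test. The cubic coefficient `eps` and the tone levels are invented for illustration only; this is not John's setup or his measured value, just the generic mechanism:

```python
import cmath
import math

def bin_amplitude(x, k):
    """Amplitude of the sinusoid at DFT bin k of sequence x (single-bin DFT)."""
    n = len(x)
    s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
    return 2 * abs(s) / n

n = 8192
f1, f2 = 100, 110        # tone frequencies, in integer DFT bins (so no leakage)
a = 0.01                 # per-tone amplitude, arbitrary units
eps = 1e-4               # hypothetical cubic-nonlinearity coefficient (invented)

twin = [a * (math.sin(2 * math.pi * f1 * t / n) +
             math.sin(2 * math.pi * f2 * t / n)) for t in range(n)]
out = [v + eps * v ** 3 for v in twin]   # a weakly nonlinear "cable"

fund = bin_amplitude(out, f1)            # close to a
imd3 = bin_amplitude(out, 2 * f1 - f2)   # third-order product at 2*f1 - f2
print(20 * math.log10(imd3 / fund))      # IM3 level in dB below the fundamental
```

For a cubic term, the product at 2*f1 - f2 comes out at (3/4)·eps·a² relative to the fundamental, so even a tiny nonlinearity is cleanly visible at a bin where neither tone sits. That is why a twin tone is a more sensitive probe than a single tone plus THD.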

I also heated a piece of copper wire until oxide formed and noted no change before or after. So I do not think micro-diodes are a contributor to this distortion.

Many of my early tests were done on a soldered cable under test. No connectors were used.

When I used RCA connectors (Neutrik), I found it mandatory to use De-oxit; otherwise the connector's distortions were dominant.

I (OPINION) think there are several contributors to cable distortion at this low level.

The question at hand is whether these distortions are large enough to be heard on a high-quality sound system.

Running a system at a low average voltage level, such as 100 mV, with a high-impedance load (100 kΩ) will lower your signal-to-noise and distortion ratios and make the cable's contribution more significant.

A 16-bit CD is limited to under 100 dB S/N. 100 mV is -20 dB. For the sake of argument, not-very-good cables may be 130 dB clean. So in that case the cable distortion would be 10 dB below the CD's range. Can that be heard? Are there sources better than a CD?
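The arithmetic in that paragraph can be checked in a few lines. All figures here are the post's illustrative assumptions (a 1 V reference, 100 dB CD S/N, a -130 dB cable distortion floor), not measurements:

```python
import math

def db(amplitude_ratio):
    """Amplitude ratio expressed in decibels."""
    return 20 * math.log10(amplitude_ratio)

signal_db = db(0.100 / 1.0)      # 100 mV against an assumed 1 V reference -> -20 dB
cd_snr_db = 100                  # generous bound for 16-bit CD
cable_clean_db = -130            # "130 dB clean" cable distortion floor

cd_floor_db = signal_db - cd_snr_db          # bottom of the CD's range: -120 dB
margin_db = cd_floor_db - cable_clean_db     # cable distortion sits 10 dB below it
print(signal_db, cd_floor_db, margin_db)
```

On these assumed numbers the cable's nonlinearity lands 10 dB under the CD's own floor, which is the basis for the "probably inaudible from a CD source" conclusion below.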

Note this is just a discussion of the effects of nonlinear resistance in audio interconnect cables causing distortion. There are other factors besides RLC that may also affect the signals.

So it seems clear from the measured cable distortion that, in a system running at reasonable voltages and impedances from a source such as CD, it probably is not possible to detect any difference between reasonable-quality cables. However, even in such a system, connector oxidation may cause problems.

The question is left open whether, with better sources, audio interconnect cable distortion is a factor.
 
Higher-order odd distortion is very easy to hear, even at very low levels. It doesn't matter whether it is the connectors (clean and tight), magnetic hysteresis, dielectric problems, solder or press-fit connections, etc. Distortion is distortion. Still, it is just a distortion that I measured. It could CERTAINLY affect other measurements, crossover distortion in power amps for example, IF I just used any old connecting cable, many loved by the hear-no-difference types here.
 
It doesn't matter whether it is the connectors (clean and tight), magnetic hysteresis, dielectric problems, solder or press-fit connections, etc. Distortion is distortion.

It does matter if one expects to actually LEARN something.

Making measurements and then mindlessly running around telling people that there are micro diodes in their wires does ABSOLUTELY NOTHING to enlighten anyone.

You do an awful lot of choppin' with that little axe of yours, but I rarely ever see any wood chips flyin'.

se
 
atta-diodes, maybe??

Funny, I'll bet you won't find even a footnote about this in any nuclear or medical instrumentation journal, even though the signals are often much lower than MC phono. Some 3D geo-science is at 20+ bits and a massive number of channels. Bad image artifacts can cost millions.

nor in radio astronomy detectors, either...

one sign of advancing Alzheimer's is the repetition of stories about events that transpired in the past, with no recollection of having mentioned such stories previously...
 
I use both myself. It wouldn't surprise me if Mogami sourced from Belden.
Don't expect a (rational or other) explanation from JC as to why Mogami would be only "a start".

Without any technical reason to prefer one over the other, I'd just as soon avoid giving business to the company that makes specious claims.

The fact that a lot of their sections on wire are labeled "mystery of..." makes it obvious that even if their cables are better, they have no clue why. They don't even say how theirs are different, much less better.

Maybe Miss Cleo knows...
 
Hi,

So it seems clear that from the measured cable distortion in a system running at reasonable voltages and impedances from a source such as CDs it probably is not possible to detect any difference between reasonable quality cables. However even in such a system connector oxidation may cause problems.

You'd be surprised to learn that at one point there was at least one example of a well-known supplier of high-end RCA connectors learning from the guys in the field (you and me, that is) that the steel locking barrels they used were causing audible distortion.

A good contact (not easy with your typical RCA connector) alone isn't enough; those plugs aren't always as innocent as they look.

Anyone remember the marketing buzz-words: OFC, LC-OFC?
When that settled down, wires had to be as pure as possible, and when that wasn't good enough anymore we looked for nine-nines silver wire, pushed through the die slowly...
Some really dug for gold after that too.
Cryogenics entered the game and god only knows what else.

Twenty years later you'll still find amplifiers that on the surface seem to measure the same but still sound different from each other and so on.

I'd like to see the next big Hi-End Show using lamp cord exclusively. Yes, please.😀

Cheers, 😉
 
What is amazing to me is that the 'experts' here don't even know much about Mogami. I started with Mogami about 25 years ago, for Vendetta Research products. They were pretty darn good: pretty, well made, flexible, pure, etc. I graduated to Vandenhul mono-crystal, and finally to BEAR pure silver wire, properly broken in and directionalized. So there, novices in wire design. ;-)
 
Belden and Alpha are great products for industrial controls, etc. I have used them for about 40 years, seriously, and I usually make my multi-wire umbilical cords that go between the preamp and the power supply with Belden. DC-DC, no AC, except sometimes for a power cord.
 
You'd be surprised to learn that at one point there was at least one example of a well-known supplier of high-end RCA connectors learning from the guys in the field (you and me, that is) that the steel locking barrels they used were causing audible distortion.

And this was determined how exactly? The same way in which teleportation tweaks and the like are discovered to be "audible"?

se
 