The Black Hole......

The thing is that when a (subjective) difference in sound is perceived, it triggers a search for a cause.
When something is found, it will most likely be declared the cause of the sound difference, even though the correlation usually seems very far-fetched.

Over the years I've seen so many completely different and mostly conflicting explanations for why cable A sounds better than cable B that I've lost track and don't even bother reading the papers.

Hans

Agreed with Hans. At the risk of sounding like I am pulling rank on anyone, working with very wideband distribution systems exposed me to many evils in signal transmission.

**Sorry, more RHC ahead**

Back in the days when we had wooden wheels on wagons, before any of you whipper-snappers were born, I was the director of engineering at (the US's largest) cassette replication plant. Before you snort with derision and stop reading, you should know the recording was done on up to 80 remote 'slaves', each the size of a small washing machine, with a single master feeding them. Why is this relevant to the discussion? Because the bandwidth transmitted was 10 Hz to 2 MHz, and as we know, total system noise power is proportional to bandwidth, so if there is a problem with signal transmission, a wide-bandwidth system will certainly expose it.

With this in mind, imagine if you will a world where peace and good will are... (sorry, wrong channel). Imagine a 4-channel wideband (DC to 2 MHz) line amplifier/UNBALANCED coax distribution system with many hundreds of feet of quad-shielded RG-6, BNC 'T's, and potential points of ground-system contamination. We also distributed the 8 MHz bias to all slaves in the same way. Now imagine the requirement was a final S/N at the recording slave of better than 90 dB re +4 dBm. If any of you have experience sending an unbalanced signal over hill and dale, you will understand the challenge. Even the phone company ended up running balanced for all the same reasons we faced.
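To put rough numbers on that, here is a back-of-envelope sketch with assumed values (room temperature, an ideal 600-ohm circuit impedance), not measurements from the plant:

```python
import math

k, T = 1.38e-23, 300.0   # Boltzmann constant (J/K) and an assumed room temperature (K)
R = 600.0                # assumed 600-ohm circuit impedance

def johnson_noise_uV(bw_hz):
    """RMS thermal noise of resistance R over bandwidth bw_hz, in microvolts."""
    return math.sqrt(4 * k * T * R * bw_hz) * 1e6

print(f"20 kHz audio band : {johnson_noise_uV(20e3):.2f} uVrms")  # ~0.45 uV
print(f"2 MHz system band : {johnson_noise_uV(2e6):.2f} uVrms")   # ~4.5 uV (100x the bandwidth -> 10x the voltage)

# The spec: 90 dB below +4 dBm, with dBm referenced to 600 ohms (0 dBm = 0.775 Vrms)
v_ref = math.sqrt(1e-3 * R) * 10 ** (4 / 20)   # +4 dBm ~ 1.23 Vrms
v_floor = v_ref / 10 ** (90 / 20)              # total allowed noise ~ 39 uVrms
print(f"+4 dBm = {v_ref:.2f} Vrms, 90 dB down = {v_floor * 1e6:.0f} uVrms")
```

Even an ideal 600-ohm source eats roughly a tenth of that ~39 uV budget in thermal noise alone over 2 MHz, before a single amplifier stage or any ground-loop pickup is added.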

OK, so how does this relate to the discussion? As you can imagine, we had many cases where audible effects or noises were detected at various points in the system: odd image shifts, crackling at peaks, random dropouts, and more subtle effects as well. Subtle noise-floor artifacts were pinned to an inferior grade of name-brand RG-6, after which I settled on Alpha quad-shield RG-6, which never gave a problem. I have described the evils of BNC 'T's here before, so I won't repeat...

My diagnostic was sending baseband audio and listening at various points in the system. Once cabling issues were settled, I found the more subtle effects were almost always due to line-amplifier issues: resistors in the negative feedback divider, bad semis, noisy caps, build-out resistors (75-ohm system) gone haywire, etc. This was easy to troubleshoot as the amps were in pluggable modules in the distribution-amp rack. But if an amplifier was solid, I never heard an odd artifact.

However, there are a LOT of crappy cables on the market! It is important for the cable manufacturer to use virgin pellets in their extruders; when they include regrind, any manner of contaminant can end up in the dielectric, including particles of shield conductors, jacket material, etc. If they purchase pellets from a second party (not the OEM), they never know what they are getting, and as someone who has purchased and molded millions of pounds of resin, I can tell you the inexpensive stuff can be real trash, but the savings can be hard to ignore for a low-margin product. Ask me how I know this...

Between this experience and many years of studio work I am still doing, I have concluded that when a difference is heard between cables, either the cable is crap (poor shielding or noisy dielectric), the connector is crap or there are dissimilar-metal issues, or the driving amplifier is crap. With a stable, low-impedance line amp, I have never detected a difference in audio quality between otherwise well-constructed cables. If someone is trying to pin an audible difference to a cable, it must be tested with the connectors cut off and the cable soldered into the test circuit; many 'cable' issues are connector or dissimilar-metal problems.

If after reading this you decide I have always had bad hearing, you would be wrong, but I have no way to prove it from my retirement home...Nursey, can I have another sponge bath?...gotta go...

Cheers!
Howie
 
If someone is trying to pin an audible difference to a cable, it must be tested with the connectors cut off and the cable soldered into the test circuit; many 'cable' issues are connector or dissimilar-metal problems.

I thought the same, so I cut off the ends and used the exact same gold-pin Neutrik XLR connectors on each cable. Each cable was also the same length. In that particular experiment, the output amp impedance was 600 ohms per XLR leg due to passive filtering after the I/V opamps in a dac. Load Z was 50k. Every different brand and model of cable sounded different, including Belden and two different generations of Mogami Gold Star Quad.

EDIT: Some of the cables weren't too bad, the latest greatest Mogami coming in 2nd to Jam's custom cables. Some of the least good cables lost small details of sound, which was a problem for me since I wanted to do critical listening of some dac experiments.
 
I thought the same, so I cut off the ends and used the exact same gold-pin Neutrik XLR connectors on each cable. Each cable was also the same length. In that particular experiment, the output amp impedance was 600 ohms per XLR leg due to passive filtering after the I/V opamps in a dac. Load Z was 50k. Every different brand and model of cable sounded different, including Belden and two different generations of Mogami Gold Star Quad.

Mark,

A 600-ohm output impedance is very high by modern standards. It worked well when all inputs and outputs were transformer isolated and by nature almost perfectly balanced, with galvanic isolation from signal grounds, but it does not hold up with electronic balancing, ground-path contamination, and so much more EMI around that the AM band has been deemed unusable and the FCC is about to allow it to transition to digital.

Try driving them with an amp with <100-ohm output on each leg and see what the difference is. Indeed, if there is a difference between the 600-ohm and 100-ohm drive, the cable is a problem.
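For a feel for where the first-order numbers land, here is a minimal sketch assuming a short run (say 3 ft) at roughly 50 pF/ft, ballpark figures rather than anyone's measured cable:

```python
import math

def rc_corner_hz(r_source, c_total):
    """-3 dB frequency of a source resistance driving the total cable capacitance."""
    return 1.0 / (2 * math.pi * r_source * c_total)

c_total = 50e-12 * 3   # assumed ~50 pF/ft over an assumed 3 ft run = 150 pF

for r in (600.0, 100.0):
    f3 = rc_corner_hz(r, c_total)
    print(f"{r:>5.0f} ohm drive into {c_total * 1e12:.0f} pF -> corner ~ {f3 / 1e6:.1f} MHz")
```

Both corners land far above the audio band for a short interconnect, so a genuine audible change between the two drives would point at the cable or its termination misbehaving rather than simple RC rolloff.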

Cheers!
Howie
 
I thought the same, so I cut off the ends and used the exact same gold-pin Neutrik XLR connectors on each cable. Each cable was also the same length. In that particular experiment, the output amp impedance was 600 ohms per XLR leg due to passive filtering after the I/V opamps in a dac. Load Z was 50k. Every different brand and model of cable sounded different, including Belden and two different generations of Mogami Gold Star Quad.

EDIT: Some of the cables weren't too bad, the latest greatest Mogami coming in 2nd to Jam's custom cables. Some of the least good cables lost small details of sound, which was a problem for me since I wanted to do critical listening of some dac experiments.
Hi Mark.
What circuit (schematic) is used in your XLR preamplifier, and why do you think the output amplifier had an impedance of exactly 600 ohms? Here is an example of a consumer-grade design, not a professional one. Look at how they arranged the outputs, RCA and XLR: they just hung 75-ohm and 600-ohm resistors on the outputs, as if that were very important and necessary...

NAD S100 - Manual - Stereo Preamplifier - HiFi Engine
 
Try driving them with an amp with <100-ohm output on each leg and see what the difference is. Indeed, if there is a difference between the 600-ohm and 100-ohm drive, the cable is a problem.

I would have to integrate a buffer opamp into the dac board, which would then sound like an opamp buffer. Not what I want to do. Otherwise there would have to be a cable between the dac output and the added amp you are suggesting.

What I was trying to do was connect the dac outputs to a HPA using a 3-foot cable. It wasn't a long cable run. Jam's 3-foot cable worked fine. It was designed to sound like a zero-length cable, and it comes quite close to that goal.
 
Hi Mark.
What circuit (schematic) is used in your XLR preamplifier, and why do you think the output amplifier had an impedance of exactly 600 ohms?

The XLR load is a Neurochrome HP-1 headphone amp. It is rated at slightly below 50k input Z. The dac is an AK4499 evaluation board, and the manual for the board includes a schematic that shows the two-stage passive RC filters between the I/V opamp outputs and the XLR connectors. The resistors in each RC filter stage are 300 ohms; two in series make 600 ohms at audio frequencies. The caps are small values to filter out RF.
 
Two in series make 600 ohms at audio frequencies. The caps are small values to filter out RF.
Manufacturers of audio equipment choose impedances of 75 ohms and 600 ohms out of inertia, on the assumption that the audio cables will also have a standard characteristic (wave) impedance of 75 or 600 ohms, but I have personally never come across a manufacturer that specifies the wave impedance of its analog audio cables. Digital audio cables are, of course, another matter.

Are you sure that your DAC will not noticeably distort the signal when working into a 50 kΩ load without an opamp buffer?
 
Are you sure that your DAC will not noticeably distort the signal when working into a 50 kΩ load without an opamp buffer?

The dac will not distort because of a 50k bridging load across the dac's 600-ohm output impedance. That's not an issue, and it would not be an issue with a zero-length cable.
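The arithmetic behind that is just a resistive divider (the 600-ohm and 50k figures are the ones from the posts above):

```python
import math

r_out = 600.0    # dac output impedance (two 300-ohm filter resistors in series)
r_load = 50e3    # roughly the HPA's input impedance

loss = r_load / (r_load + r_out)                      # simple voltage divider
print(f"level loss: {20 * math.log10(loss):.2f} dB")  # about -0.1 dB
```

And assuming the 50k input stays essentially resistive across the audio band, that tenth of a dB is flat with frequency; by itself the bridging load cannot color the sound.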

Now if you want to know what can cause audible distortion, that is a whole 'nother subject. Too much RF coming out of the dac can potentially cause distortion in the HPA. Low-level RF from the dac might cause some effects in the cable, not sure. It wouldn't be the first dac that had some low-level RF leakage, though. Distortion created in other parts of dac processing can make it through the system and be audible.
 
If there is a lot of high-frequency interference at the output of this DAC, additional filtering can be provided by a specially designed audio cable.

Even a simple computer simulation of transistor analog stages shows that there is a certain optimum load at which distortion is minimal. I very much doubt that for this DAC that value is exactly 50 kΩ.
 
Howie,
I suspect I have used more cable than you. (JN probably wins this contest until we limit it to audio/video use.)

I have chatted with the plant engineers on more than one occasion when the cables weren't up to snuff.

My last communication with one of the manufacturers was that they had slipped production, apparently due to one of their machines being down, and sales was winging it as to the cause of the delay and the expected delivery date. My mention of the 3-million-dollar day-late penalty got things moved up!

I have also had cables flunk TDR measurements and had to explain the difference between a frequency-sweep test and a TDR. An in-house frequency sweep is easy to do, as both ends of the cable on the spool are easy to connect. In the field there can be a bit of a reach from one end of a field to the far-side parking lot. I do have a spool that was clearly someone's QC run, as there is a BNC connector still on the spool's inner end!

Now I have still been able to measure differences in simple audio interconnects. Between the best and worst measured, in demonstrations at the time, some folks could hear a difference on the office sound system. Not a big deal, but perceptible to some.

Now I have had TV truck folks tell me my cables were bad, but when I showed up with a triax cable tester, it always turned out that it was their cables that were bad. For those who have never played with analog TV triax cables, the center-connector pin penetration depth depended on the hand-assembly skills of the assembler. Tooling to check it is available and pretty much not used. A minor aside: folks started making triax testers after that. (I used the panel-mount connectors in my tester to remove the assembly variable.)

Using a TDR of course I could not only find defects in the cables, but also show the installing crew where along the cable they had screwed up the installation.

So I think we disagree on small, short interconnect cables making a difference. I will point out that in my large-scale audio projects, with an in-use background noise level never lower than 60 dB A-weighted and a maximum level of 115 dB on the same scale and weighting, even the audio cable from what I suspect was the same arrogant source you might have had a bit of bother with was, and still is, just fine.

I should mention one time the electrical contractor pulling the cables decided to buy the cables themselves. The manufacturer, being the experts of course, sold them the nice hand-feel portable type of cable for a permanent installation. Funny how that nice, soft-feel cable jacket was a real bear to pull through conduit!
 
BTW, I should mention I just got in from eBay a working de Forest Audion triode with a 4-pin base. It goes with my CK722s, a Philmore cat's whisker still in the package with directions for making a crystal radio printed on it, a TI 7400 with a 3-digit date code, and a gold-lead uA709.
 
Howie,
I suspect I have used more cable than you. (JN probably wins this contest until we limit it to audio/video use.)...

I have no doubt!

...Now I have still been able to measure differences in simple audio interconnects. Between the best and worst measured, in demonstrations at the time, some folks could hear a difference on the office sound system. Not a big deal, but perceptible to some.

If you have published the results, the onus is on me to find them, but if not, can you summarize what the measurable differences were? Were they more than triboelectric charge generation, ESF shielding, or bulk parametrics like capacitance and inductance? These factors certainly can make a difference in situations where the driving amp is not low enough in impedance and stable into the applied load, i.e., no ringing at the load end with a 10 kHz square wave. Interested minds want to know!

Cheers!
Howie
 
I should mention one time the electrical contractor pulling the cables decided to buy the cables themselves. The manufacturer, being the experts of course, sold them the nice hand-feel portable type of cable for a permanent installation. Funny how that nice, soft-feel cable jacket was a real bear to pull through conduit!

Oh man, I've been to that dance more than once. Even worse is trying to get it back OUT again after 20 years or more... 😡
 
My fastest scope has 9 ps rise time, and that takes 50 GHz of bandwidth. So I could see it in principle, if I pay the $300 for the 2.4 mm coax connector and keep the input cable short, as in a few cm.

The TDR must make do with 20 ps rise time, but it shows minor impedance variations in precision connectors without mercy.
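For a sense of scale, a quick sketch (assumed velocity factor of 0.66, not the spec of any particular cable):

```python
c = 3.0e8      # speed of light, m/s
vf = 0.66      # assumed velocity factor for a solid-dielectric coax
tr = 20e-12    # TDR system rise time, s

# Round trip: a feature is resolvable at roughly rise_time * velocity / 2
resolution_mm = tr * c * vf / 2 * 1e3
print(f"spatial resolution ~ {resolution_mm:.1f} mm")   # about 2 mm, i.e. connector-scale features
```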

Just hold your hand next to a microstrip and everything changes. Speaker/headphone cable aberrations would not fit on the screen then. That thing is completely disconnected from reality.

🙂 Agree
 
Manufacturers of audio equipment choose impedances of 75 ohms and 600 ohms out of inertia, on the assumption that the audio cables will also have a standard characteristic (wave) impedance of 75 or 600 ohms, but I have personally never come across a manufacturer that specifies the wave impedance of its analog audio cables. Digital audio cables are, of course, another matter.
Are you sure that your DAC will not noticeably distort the signal when working into a 50 kΩ load without an opamp buffer?
While an impedance-matched 600-ohm output stage driving a 600-ohm input stage was the standard broadcast and pro-audio interconnect scheme three-quarters of a century ago, the standard was borrowed from long-distance telephone systems. But for digital S/PDIF, the reason for 75-ohm interconnects is just basic engineering optimization: for low-level radio-frequency signals, 75 ohms works best.
 
While an impedance-matched 600-ohm output stage driving a 600-ohm input stage was the standard broadcast and pro-audio interconnect scheme three-quarters of a century ago, the standard was borrowed from long-distance telephone systems. But for digital S/PDIF, the reason for 75-ohm interconnects is just basic engineering optimization: for low-level radio-frequency signals, 75 ohms works best.
Not quite. There are also cables with characteristic (wave) impedances of 50 ohms, 120 ohms, and 300 ohms, and they all work well at radio frequencies too. One wave impedance or another is chosen for the lowest losses and the best match to the load.
The RCA connector was not originally specified as 75 ohms; it was an analog audio connector. Only recently have some companies, for example the German firm WBT, upgraded their RCA connectors to 75 ohms, and that was driven by the need for a universal connector for carrying digital signals over 75-ohm coaxial cable.
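For readers wondering where that "wave impedance" number actually comes from, here is a minimal sketch using assumed per-metre values in the ballpark of a 75-ohm coax (not any specific product):

```python
import math

L_per_m = 370e-9   # assumed series inductance, H/m
C_per_m = 67e-12   # assumed shunt capacitance, F/m

z0 = math.sqrt(L_per_m / C_per_m)   # lossless (high-frequency) characteristic impedance
print(f"Z0 ~ {z0:.0f} ohms")        # ~74 ohms

# Note: Z0 = sqrt(L/C) only holds where the series reactance swamps the conductor
# resistance, i.e. well above the audio band. At audio frequencies the characteristic
# impedance is larger and frequency-dependent, which is consistent with analog audio
# cables not being specified by wave impedance at all.
```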
 
Why assume S/PDIF is RCA? It started off as BNC, which is the correct connector for the job.

Yawn, back to the same tired cable discussions. We tried taking the "red/white" interconnect that ships with every CD player and four RCA-to-BNC adaptors from Radio Shack and using them to connect an AP to a precision diff-amp demo card. Nada, nothing above the AP's residual; same for a series of parallel and series 0805 SMT resistor experiments.