I don't believe cables make a difference, any input?

How important are the connectors and termination?

The practical difference between a Cat 5 and a Cat 5E network patch cable is mostly in the tighter crosstalk specs and in the connectors and termination.

To be classified as Cat 5, a cable has to carry a 100 MHz signal over a channel of up to 100 metres. That supports a bit rate of 100 megabits per second.

The Cat 5 patch cable has since been superseded by Cat 5E. E is for enhanced, like you didn't know. Cat 5E is still specified to 100 MHz over the same 100 metres (the "350 MHz" figure you see on packaging is vendor marketing), but the tighter specs let it carry gigabit Ethernet over all four pairs. The bit rate has increased ten-fold, from 100 megabit to one gigabit!

The improved termination on a Cat 5E patch cable gives better "near-end crosstalk," or NEXT. So what does this NEXT mean? Not much, really. Apparently the old connector left a greater margin for error when terminated.

With Cat 6 and 7 we get to 10GBASE-T. At those frequencies the balance of the pairs becomes critical.
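
For anyone who wants to see where the ten-fold jump a couple of paragraphs up actually comes from, here's a quick back-of-envelope sketch in Python. The line-coding figures are the standard 100BASE-TX and 1000BASE-T ones; the script is just the arithmetic:

```python
# Where the 100 Mbit -> 1 Gbit jump comes from: more pairs and denser line
# coding, not a higher frequency on the wire (both run at 125 Mbaud per pair).
def payload_mbps(pairs, mbaud_per_pair, data_bits_per_symbol):
    """Payload bit rate in Mbit/s for a given twisted-pair line coding."""
    return pairs * mbaud_per_pair * data_bits_per_symbol

# 100BASE-TX: one pair per direction, 125 Mbaud, 4B5B (4 data bits per 5 code bits)
fast_ethernet = payload_mbps(pairs=1, mbaud_per_pair=125, data_bits_per_symbol=4 / 5)

# 1000BASE-T: all four pairs at once, 125 Mbaud, PAM-5 carrying 2 data bits per symbol
gigabit = payload_mbps(pairs=4, mbaud_per_pair=125, data_bits_per_symbol=2)

print(f"100BASE-TX: {fast_ethernet:.0f} Mbit/s")
print(f"1000BASE-T: {gigabit:.0f} Mbit/s")
```

The symbol rate on each pair stays the same; the extra throughput comes from using all four pairs and squeezing more bits into each symbol, which is exactly why the crosstalk specs had to tighten.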

Edit: Poobah, we posted at the same time. Now, please tell me the Cat X network patch cables are not science. 😀
 
It's not the soldering that's bad. It's the deformation of the transmission line that ruins the VSWR. Although it's overkill for our low-frequency digital signals, careful attention to preserving the transmission line impedance is generally not done with "high end" S/PDIF cables.

BTW, 2.2 GHz is a very low frequency for microwave. You can probably get away with a little bit of sloppiness there. We used to measure transmission lines, connectors, and board traces with TDR and vector network analyzers (HP/Agilent), and you can see where the discontinuities occur... even with SMA connectors.

What I'm talking about here is understanding performance margins, quantifying results, and qualifying what "the best" or "the most accurate" is. If a manufacturer is going to charge six times what a cheapo version costs, the least they could do is incorporate some sort of better practice than just gold-plating a connector, adding Teflon, or putting a nylon braid on the insulation. They should definitely not be worse, which many are. I wish S/PDIF were all 50 ohm. Then we could go to Pasternack or Gore and be done.
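
For anyone who wants to put numbers on the mismatch itself, here's a minimal sketch using the plain transmission-line formulas; the example impedances are just illustrative:

```python
# Reflection coefficient and VSWR seen at a discontinuity on a transmission line:
#   Gamma = (ZL - Z0) / (ZL + Z0),  VSWR = (1 + |Gamma|) / (1 - |Gamma|)
def mismatch(z_load, z_line):
    gamma = (z_load - z_line) / (z_load + z_line)
    vswr = (1 + abs(gamma)) / (1 - abs(gamma))
    return gamma, vswr

# Illustrative cases on a nominally 75 ohm S/PDIF line: a 50 ohm input,
# a sloppy 60 ohm section (deformed cable or a poor connector), and a clean 75 ohm load.
for z_load in (50, 60, 75):
    gamma, vswr = mismatch(z_load, z_line=75)
    print(f"Z0 = 75 ohm, ZL = {z_load} ohm: Gamma = {gamma:+.3f}, VSWR = {vswr:.2f}")
```

A 50 ohm input on a 75 ohm line reflects 20% of the incident voltage; that is the sort of thing a TDR shows you directly.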
 
poobah said:
C'mon rdf... this is a cable thread... no science allowed!

Let's just stick to bickering and senseless thrashing of the believers...

🙂


Sounds like fun, but I'm expected to design a remote broadcast studio today. You keep thrashing the believers; I'll pop by every now and then to hector the pseudo-science klunkers. 😀
 
mrshow4u said:
It's not the soldering that's bad. It's the deformation of the transmission line that ruins the VSWR. Although it's overkill for our low-frequency digital signals, careful attention to preserving the transmission line impedance is generally not done with "high end" S/PDIF cables.

BTW, 2.2 GHz is a very low frequency for microwave. You can probably get away with a little bit of sloppiness there. We used to measure transmission lines, connectors, and board traces with TDR and vector network analyzers (HP/Agilent), and you can see where the discontinuities occur... even with SMA connectors.

What I'm talking about here is understanding performance margins, quantifying results, and qualifying what "the best" or "the most accurate" is. If a manufacturer is going to charge six times what a cheapo version costs, the least they could do is incorporate some sort of better practice than just gold-plating a connector, adding Teflon, or putting a nylon braid on the insulation. They should definitely not be worse, which many are. I wish S/PDIF were all 50 ohm. Then we could go to Pasternack or Gore and be done.

True. What applies in the real world does not necessarily translate to the ultra-low-bandwidth world of audio. Anything not counted in giga-something has to be considered low bandwidth. Coax does get noticeably lossy at high frequencies, for example, but those frequencies are, again, up in the gigahertz.
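
To put a rough scale on that, here's a back-of-envelope sketch. It assumes skin-effect loss dominates (so attenuation grows roughly with the square root of frequency) and uses a made-up reference figure of 2 dB per 100 m at 100 MHz, not any particular cable's datasheet:

```python
# Rough coax attenuation scaling with frequency, assuming skin-effect loss
# dominates so that loss per unit length grows roughly as sqrt(f).
import math

REF_LOSS_DB_PER_100M = 2.0   # hypothetical reference value, not a datasheet number
REF_FREQ_MHZ = 100.0

def loss_db_per_100m(freq_mhz):
    return REF_LOSS_DB_PER_100M * math.sqrt(freq_mhz / REF_FREQ_MHZ)

for f_mhz in (0.02, 1, 10, 100, 1000, 2200):
    print(f"{f_mhz:>7} MHz: ~{loss_db_per_100m(f_mhz):5.2f} dB per 100 m")
```

At the top of the audio band (0.02 MHz) the loss is in the hundredths of a dB per 100 m; it only becomes interesting up in the hundreds of MHz and beyond.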
 
That's a deal rdf,

though I should be cleaning my playpen. I have some fixtures to upgrade and I lost the stupid little $52 tool to remove connector pins from the housings. $52... it's so nice that Tyco bought out all the good companies... and then ruined them.

:devilr:
 
mrshow4u said:
BTW 2.2 GHz is very low frequency for microwave. You can probably get away with a little bit of sloppiness there. We used to measure transmission lines, connectors, and board traces with TDR and vector network analyzers (Hp/Agilent) and you can see where the discontinuities occur. ...even with SMA connectors.


As I understand it the 2 GHz band is a Canadian telephone trunking standard. The highest-frequency hop of any of my systems, or my tenant's systems, is 15 GHz. We've been in the process of decommissioning 450 MHz hops all over the lower BC mainland. The standard band for broadcast-industry microwave is 950 MHz. 2.2 GHz is not a low frequency, and RF connectors are designed with the soldering process factored in: the conductor is inserted into the pin, solder is applied through a hole in the sleeve, the excess is filed away, and the pin with the conductor is inserted into the dielectric base of the shell.


"...careful attention to preserving the transmission line impedance is generally not done with "high end" S/PDIF cables."


You might be right, I really don't know. Starting with an RCA connector makes it a hurdle from the get-go.
 
Fifteen years ago, I changed a short S/PDIF cable for a longer one and it didn't sound as good. Not nearly as good. "Hey, it's only 1s and 0s," I thought, so why? Then I looked at it with broadcast eyes (used to be a video engineer). Measured to see if the source and load impedance matched the cable (they didn't). Checked on whether the DAC clock could reject jitter from the incoming signal (it couldn't). Given that, cables could make a difference. A short cable minimised the effects of mismatching.

PS Yes, of course I modified the sources to be 75 Ohm and the DAC to be 75 Ohm. And yes, it did sound better. Of course, I could have been fooling myself, basking in the knowledge that a square wave originating at the source was reaching the destination with minimum distortion.
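
For the curious, here's a little sketch of why cable length interacts with the mismatch. It assumes a 44.1 kHz S/PDIF stream and a typical coax velocity factor of about 0.66; both are assumptions you'd adjust for your own gear:

```python
# With mismatched source and load, part of each edge reflects off the load,
# bounces off the source, and arrives back at the load one round trip later.
# Where that re-reflection lands relative to the data edges depends on length.
C = 3.0e8                  # speed of light, m/s
VELOCITY_FACTOR = 0.66     # assumed, typical for solid-dielectric coax
SAMPLE_RATE = 44.1e3       # assumed 44.1 kHz S/PDIF stream

bit_rate = SAMPLE_RATE * 64           # 64 bits per frame -> 2.8224 Mbit/s
half_cell_ns = 1e9 / (2 * bit_rate)   # shortest biphase-mark cell, ~177 ns

for length_m in (0.25, 0.5, 1.0, 2.0, 5.0):
    round_trip_ns = 1e9 * 2 * length_m / (C * VELOCITY_FACTOR)
    print(f"{length_m:>4} m cable: reflection returns after ~{round_trip_ns:5.1f} ns "
          f"(shortest data cell ~{half_cell_ns:.0f} ns)")
```

On a very short run the reflection comes back while the edge is still settling and mostly just rounds it off; on a longer run it lands well inside the data cell, where a receiver clock that cannot reject jitter may start to care, which fits the experience above.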
 
Pasternack has 75 ohm cables and BNCs as well. This might be worth checking out for S/PDIF.
Wow, I'm a little shocked at Pasternack's prices. $$$ I guess they seem cheap if work is paying for them. Gore would be way more expensive. There are some pre-fabbed ones through Markertek: http://www.markertek.com/Marketing.asp?target=CANARE. If you get the 75 ohm models, and your source impedance is 75 ohms and the termination impedance is 75 ohms, you should have a good eye pattern. At least if the driver is worth a darn.
 
Yes, eye pattern is the critical thing. Apropos of which, we had a new terrestrial broadcaster start up (somewhere around 1995), but my DAC wouldn't lock up properly to the S/PDIF output I'd added to my TV tuner (even though it was fine on other stations). I had a look at the eye pattern of the S/PDIF signal with my Tek 485 350 MHz oscilloscope and rang up to let them know that they had a problem, but their engineer was extremely sceptical. We spent some time with him swapping bits of kit at his end until he became convinced that I genuinely could tell the difference between the faulty bit of kit and the (standby) kit.
 
Re: I don't believe cables make a difference, any input?

Quote: The Paulinator
I am going to very mild-manneredly make the statement that I have come to my own personal conclusion that speaker wire matters to the sound quality of your speakers about as much as a big pile of baked beans.

Yes, you are dead right. If baked beans make you FEEL GOOD, then they make a lot of difference!
Perceived sound quality depends on how you FEEL.
It is human nature that after we put in a lot of effort, things DO seem better. It is the optimistic enthusiasm remaining in us from childhood, when we CARED about everything!

The ONLY way loudspeaker leads can affect quality is if their series impedance (at some frequency, or for some transient) becomes comparable to (say 1% of) the speaker impedance.

Thus the length of the cables is the most important factor, along with the way they are connected to any terminals or sockets.

The loudspeaker cone (which is NOT rigid enough to move as a whole) is driven by a coil of long, thin (say 20 gauge) wire, which has its own inductance and self-capacitance.
The loudspeaker impedance is, say, 4 ohms plus the component due to the voltage induced in the coil by its motion in its magnetic field.

The amp designer, if he knows the loudspeaker and its cabinet, designs his output stage for OPTIMUM DAMPING (maybe slightly less than critical damping, i.e. no overshoot on a square wave).
Of course this can only be done at one frequency at best, which explains the results where thin or high-resistance leads (or even bad contacts) seem to work best.

What damping factor is best, and how much negative feedback to use (if any), are matters of opinion - so his may differ from yours!

If you think your way sounds better - then be HAPPY!
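
To put rough numbers on the 1% rule above, here's a quick sketch using standard copper resistivity and a few common wire gauges. It only looks at DC loop resistance and ignores inductance and contact resistance:

```python
# DC loop (out-and-back) resistance of copper speaker cable, compared with
# a budget of 1% of the speaker impedance, per the rule of thumb above.
import math

RHO_COPPER = 1.68e-8                     # ohm*m
AWG_DIAMETER_MM = {12: 2.053, 16: 1.291, 18: 1.024, 20: 0.812}

def loop_resistance_ohms(awg, length_m):
    area_m2 = math.pi * (AWG_DIAMETER_MM[awg] * 1e-3 / 2) ** 2
    return 2 * RHO_COPPER * length_m / area_m2   # factor 2: both conductors

SPEAKER_OHMS = 4.0
budget = 0.01 * SPEAKER_OHMS                     # the 1% criterion
for awg in (12, 16, 18, 20):
    for length_m in (2, 5, 10):
        r = loop_resistance_ohms(awg, length_m)
        verdict = "within 1%" if r <= budget else "over 1%"
        print(f"{awg} AWG, {length_m:>2} m: {r * 1000:6.1f} mOhm ({verdict})")
```

By that yardstick even 12 AWG is marginal beyond a few metres into a 4 ohm load, which is consistent with length and terminations mattering far more than exotic construction.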
 
Anyone who believes that a $500 RCA interconnect sounds better than a $25 interconnect is just brainwashed about the importance of fancy, expensive cables! You have all heard the salesman's pitch... about how an interconnect is the weakest link in the system and you need these $500/ft cables to achieve the best sound.
The only way for someone to hear a difference in sound quality between two interconnects is if the cable alters the signal in some way, such as through excessive length, poor shielding, or poor balance.
Don't be fooled by all the hype about expensive cables! The manufacturers of these cables are making a killing!
 
96tahoe said:
Anyone who believes that a $500 RCA interconnect sounds better than a $25 interconnect is just brainwashed about the importance of fancy, expensive cables! You have all heard the salesman's pitch... about how an interconnect is the weakest link in the system and you need these $500/ft cables to achieve the best sound.
The only way for someone to hear a difference in sound quality between two interconnects is if the cable alters the signal in some way, such as through excessive length, poor shielding, or poor balance.
Don't be fooled by all the hype about expensive cables! The manufacturers of these cables are making a killing!
Not really true. I have used a few different interconnects; while some have differences that are hard to detect, there are some that make a quite noticeable improvement. We had a gathering where many people brought in different interconnects, speaker cables, and power cables.

Most people like to use cables to tune their system, in a sense. Most cables sound different most likely because of how they handle the energy reflections caused by impedance mismatch. However, as in any audio design, the more complicated the design, the greater the risk of losing detail and creating other negative effects.

If the audio world had stricter interface requirements, then interconnect influence could be minimized.

I would categorize interconnect influence on sound along with the components used in crossovers (XOs). I do agree that "expensive" is not always better.
 
Cal Weldon said:


Let's hope they offer less influence than that.
Actually, it can be that significant. I have tried this extensively in speaker internal wiring and in BCS components. Right now Cat 5E (taking 2 of the 8 conductors it comes with) is the baseline until I can find a better-performing cable.

Interconnect differences were compared between the old MIT330, the original MIT Shotgun, Randall Research Teflon-insulated solid wire, Monster Cable, Magnan, and others that other people brought along. In this whole process, the MIT330 (3M) was what people thought to be the best; the next was the MIT Shotgun, which had slightly more glare in its sound than the MIT330 in this comparison.
 