I don't believe cables make a difference, any input?

Jakob2 said:
While this seems reasonable, it is apparently quite often wrong. It is quite unlikely that you would measure two interconnects of different construction and not find a difference.
The electrical difference test was first proposed by Baxandall and Hafler; a modern setup for it would be a decent A/D converter (or soundcard) combined with Bill Waslo's DiffMaker software (free to download from the Liberty Instruments website).




I was measuring the difference between two interconnects and its effect on the acoustical output of the speaker (a single driver in this case). While we were all talking about audible differences in cables and the subject of golden ears was brought up, the measuring microphone is far more neutral and sensitive, and it wasn't able to pick up either a difference in distortion or a difference in frequency response.
Measurements were done with SoundEasy and a Behringer microphone; the DUT was on an infinite baffle. :apathic:
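For anyone who wants to repeat the comparison, the analysis side can be scripted; here's a minimal sketch, assuming the two sweep recordings have been exported as WAV files (the file names and the Welch settings are placeholders of mine, not part of the actual measurement):

```python
# Hypothetical comparison of two speaker measurements taken through
# different interconnects ("cable_a.wav" / "cable_b.wav" are assumed names).
import numpy as np
from scipy.io import wavfile
from scipy.signal import welch

def magnitude_response(path, nperseg=8192):
    rate, data = wavfile.read(path)
    if data.ndim > 1:                 # keep one channel only
        data = data[:, 0]
    freqs, psd = welch(data.astype(np.float64), fs=rate, nperseg=nperseg)
    return freqs, 10 * np.log10(psd + 1e-20)   # dB, arbitrary reference

f, resp_a = magnitude_response("cable_a.wav")
_, resp_b = magnitude_response("cable_b.wav")

delta = resp_a - resp_b               # dB difference between the two runs
band = (f >= 20) & (f <= 20000)
print(f"max |difference| in band: {np.max(np.abs(delta[band])):.3f} dB")
```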
 
Alan Hope said:
Presumably, if the two signals - one inverted - don't cancel to a null, then there is a difference. And again this thread is over - the subjectivists have won! Because the difference is demonstrably there, and a good ear just might be able to detect it.

Not a chance. There are plenty of differences that are easily demonstrable with instruments, but you can't hear 'em. For example 0.1% THD, a 0.1 dB SPL difference, or phase differences; the list goes on. Piece of cake for instruments, mission impossible for the poor ol' limited ear.

This thread goes on and on because people flatly refuse to believe proven facts about (1) the ear's limits and (2) the audibility of cables. Why? Because it contradicts their personal (sighted) listening experiences, which they flatly refuse to *dis*believe. The topic has been done and dusted, thoroughly and properly, with valid experimental procedure and analysis, but still the OP wants to validate it personally. Fine, go ahead. If your findings refute previous findings, great, but they will need to be validated by replicating your test procedure with independent populations of testers and subjects. But, oh dear, that's already been done. And the finding was *cables can't be heard*.
:smash:
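For reference, here is what those figures work out to (just back-of-envelope arithmetic on the numbers above, nothing more):

```python
import math

# a 0.1 dB level difference expressed as an amplitude ratio
ratio = 10 ** (0.1 / 20)
print(f"0.1 dB   -> amplitude ratio {ratio:.4f} (~{(ratio - 1) * 100:.1f}% change)")

# 0.1% THD expressed in dB relative to the fundamental
thd_db = 20 * math.log10(0.001)
print(f"0.1% THD -> {thd_db:.0f} dB below the fundamental")
```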
 
@ R-Carpenter,

it was just the argument that didn't sound right.
In fact amplifiers, preamplifiers and most sources do give a duck, but the question is whether speakers and listeners do.... :)

The beauty of the difference test and the DiffMaker software lies in not being restricted to test signals. Just use the music samples you would play during a normal listening session.

So, may I suggest repeating your test with some music? Load the results into DiffMaker and see if there is a difference to listen to.
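If you'd rather script it than use the GUI, the heart of such a null test is roughly the following; a crude sketch only (the file names are placeholders, and DiffMaker's real alignment and drift compensation are considerably more sophisticated):

```python
# Baxandall/Hafler-style null test: time-align, gain-match, subtract,
# then report how far below the music the residual sits.
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate, correlation_lags

def load_mono(path):
    rate, x = wavfile.read(path)
    if x.ndim > 1:
        x = x[:, 0]
    return rate, x.astype(np.float64)

rate, a = load_mono("music_cable_a.wav")   # placeholder file names
_,    b = load_mono("music_cable_b.wav")

# time-align the two captures via cross-correlation
corr = correlate(a, b, mode="full", method="fft")
lag = correlation_lags(len(a), len(b), mode="full")[np.argmax(corr)]
if lag >= 0:
    a = a[lag:]
else:
    b = b[-lag:]
n = min(len(a), len(b))
a, b = a[:n], b[:n]

# least-squares gain match, then subtract
gain = np.dot(a, b) / np.dot(b, b)
residual = a - gain * b

rms = lambda x: np.sqrt(np.mean(x ** 2))
print(f"residual is {20 * np.log10(rms(residual) / rms(a)):.1f} dB below the signal")
```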

Originally posted by R-Carpenter,

....microphone is far more neutral and sensitive, and it wasn't able to pick up either a difference in distortion or a difference in frequency response.

The microphone is just one part in comparison to our hearing ability. It would be the equivalent of our physical ear (just a rough zero-order approximation), but our hearing is a combination of ear and brain.
So the equivalent of our brain in your setup would be the method used to analyse the microphone output.
Looking for differences in frequency response and distortion (to what degree?) might not be enough to mimic our brain.
 
Well Jakob, there's a reason I've only measured one driver on an infinite baffle. The more complicated the system (crossover, additional wiring, additional drivers and baffle effects), the harder it is to isolate the problem, if it is there, of course.
While I agree with you that human hearing consists of physical and psychological elements, I find the latter largely irrelevant to testing in this particular case. The variations in your physical ability to hear on a daily basis are quite dramatic, and the psychology is the icing on the cake.
The microphone's response doesn't change significantly with temperature, humidity and pressure variations. It also doesn't care about sleep deprivation, or the single malt scotch it had last evening.

Let me ask you this: has there been a day when you walked into your listening room, played something, and the speaker system sounded just terrible? I mean, yuck, bright, unpleasant, and you just wanted to shut it off? By the same token, has there been a day when everything sounded just fan-effing-tastic and made you smile?
Our hearing changes so much that it can hardly be trusted for testing.
How would you suggest I repeat my test with music? While I can mic a frequency sweep, recording music would be difficult in the near field.
Am I missing your point here?
 
SY said:


No, that wasn't the finding. The results to date have shown that WHEN differences between cables/interconnects/wires can be heard, the reasons are straightforward and non-mysterious.

This is not a minor quibble.

It is very minor, and rather mischievous. You are stating that differences between cables can be heard, without saying when. Of *course*, when a cable is heroically mis-designed or mis-applied, or broken, such that its impedance is so high that it interferes with the functioning of the devices it connects, or other LCR follies lead to an altered frequency response of the amp or speaker, or it picks up gross interference from RF, and so on, then the problem created might become audible.

Surely the thread is not about such situations. Discussion has consistently been about the use of cables where they meet the basic engineering requirements for the application. Are such cables responsible for "glassy tone", or "complete disappearance of the bass", or subtle shifts in "timing" of the music reproduction? Answer: No. In the interests of accuracy, I will modify my statement to: we now know that *cables can't be heard if they meet the basic engineering requirements for the application*.
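To attach a number to "meets the basic engineering requirements", here's the usual back-of-envelope estimate of the low-pass corner an interconnect forms with the source's output impedance; the values are typical round figures I'm assuming, not measurements of any particular cable:

```python
import math

r_source = 100.0     # ohms, assumed preamp output impedance
c_cable = 100e-12    # farads per metre, assumed interconnect capacitance
length = 1.0         # metres

f_corner = 1.0 / (2 * math.pi * r_source * c_cable * length)
print(f"-3 dB corner: {f_corner / 1e6:.1f} MHz")   # ~16 MHz, far above the audio band
```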
 
R-Carpenter said:
How would you suggest I repeat my test with music? While I can mic a frequency sweep, recording music would be difficult in the near field.
Am I missing your point here?

You are wasting your time trying to measure differences between cables with a frequency sweep; I don't believe anybody will hear any difference that way anyhow. As I've tried to explain, hearing is much more complex than that.
 
Jakob2 said:
The microphone is just one part in comparison to our hearing ability. It would be the equivalent of our physical ear (just a rough zero-order approximation), but our hearing is a combination of ear and brain.
So the equivalent of our brain in your setup would be the method used to analyse the microphone output.
Looking for differences in frequency response and distortion (to what degree?) might not be enough to mimic our brain.

Exactly, and what makes our hearing even more complicated is the fact that we have two ears, with very complex analysis done by the brain. Has anybody tried to take measurements that way, say with two mics at the listening position of a well set up system?

Perhaps things will look much different then.
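If anyone does try it, the simplest thing to pull out of a two-mic capture is the time offset between the channels; a rough sketch, assuming the pair was recorded to a stereo file (the file name is a placeholder, and of course this catches only the crudest binaural cue):

```python
# Estimate the time offset between two microphones at the listening
# position from a stereo capture ("two_mics.wav" is a placeholder name).
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate, correlation_lags

rate, x = wavfile.read("two_mics.wav")
left = x[:, 0].astype(np.float64)
right = x[:, 1].astype(np.float64)

corr = correlate(left, right, mode="full", method="fft")
lag = correlation_lags(len(left), len(right), mode="full")[np.argmax(corr)]
print(f"inter-mic delay: {lag / rate * 1e6:.0f} microseconds")
```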

André
 
@ R-Carpenter,

I just wanted to point out the difficulty of drawing general conclusions merely because a microphone might be more accurate or sensitive in technical terms.

Today our understanding of the way hearing _really_ works isn't that detailed, which is one of the reasons why modern work in this field still leads to surprising results.

So once you have a microphone response you have to analyse that output, and if you are just using test signals it becomes quite complex to draw conclusions about what a listener might or might not hear.

Just as an example, take amplitude variation: what range of differences were you looking for?
Lipshitz has shown that variations of around 0.1 dB can be audible. Phase differences might also have an influence, as jneutron pointed out in an earlier post with regard to ITD, a point somewhat backed up by the findings of Paul Frindle, who found amplitude variations below 0.1 dB to be audible, since localization depends on both phase and amplitude differences.

Testing with music is difficult, I know, but I'd suggest that you first try to find out what range of audible differences (audible to you, or known to be audible in general) you're able to detect with your current measurement setup.
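One way to do exactly that is to measure the same cable twice without touching anything else and see how big a run-to-run difference the rig itself produces; anything smaller than that floor can't be blamed on the cables. A sketch (file names are placeholders):

```python
# Estimate the repeatability floor of the measurement rig: two sweeps of
# the SAME cable ("run1.wav" / "run2.wav" are placeholder names), compared
# the same way the two cables were.
import numpy as np
from scipy.io import wavfile
from scipy.signal import welch

def response_db(path, nperseg=8192):
    rate, x = wavfile.read(path)
    x = x[:, 0] if x.ndim > 1 else x
    f, psd = welch(x.astype(np.float64), fs=rate, nperseg=nperseg)
    return f, 10 * np.log10(psd + 1e-20)

f, r1 = response_db("run1.wav")
_, r2 = response_db("run2.wav")
band = (f >= 20) & (f <= 20000)
floor = np.max(np.abs(r1[band] - r2[band]))
print(f"run-to-run floor: {floor:.2f} dB")
print("cable differences below this figure are within the setup's own noise")
```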

@ tnargs,

is there really a scientific study about audible cable differences out there?
Could you please cite it?
 
tnargs said:

Not a chance. There are plenty of differences that are easily demonstrable with instruments, but you can't hear 'em. For example 0.1% THD, a 0.1 dB SPL difference, or phase differences; the list goes on. Piece of cake for instruments, mission impossible for the poor ol' limited ear.

As I've said earlier, on CERTAIN aspects the ear/brain would be hard to beat. We must search in the right places if we want answers.

Phase differences are extremely important for recreating a realistic soundstage, and the brain is very good at detecting them.
 
What I find is this: every time a DBT or ABX test is suggested in order to test the difference between component A and component B, the difficulties of setting up the test are raised by cable believers and amplifier believers. It gets to the point where the testing environment or test conditions become so complicated that they are impossible to execute, and the subjectivist (call him a believer) turns around and says: “told you so, you can't prove anything, your test is bogus.”

Sometimes simplicity is the answer. In my test I was looking for any obvious difference between cable A and cable B; A and B were drastically different in construction and materials.
I didn't find any. Combined with the fact that I cannot hear differences on musical material, that is enough of an argument for me to state that interconnect cables are a waste of time.

The question at the start of this mile-long thread is: is it audible?
Andre, if you are ever in NY I will invite you over and set up a DBT for you, and if you prove me wrong, I will be the first to publicly state it!
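And for scoring that DBT, the statistics are simple enough to do honestly; a sketch of the usual binomial check (the 16-trial count and the 0.05 criterion are my own choices, not a standard anyone has to accept):

```python
# Score an ABX run: how likely is this many correct answers by pure guessing?
from math import comb

def p_value(correct, trials):
    """Probability of getting at least `correct` right out of `trials`
    by guessing (each trial a 50/50 coin flip)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

trials, correct = 16, 13
p = p_value(correct, trials)
print(f"{correct}/{trials} correct: p = {p:.4f}")
print("difference heard" if p < 0.05 else "result consistent with guessing")
```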
 
Alan Hope said:
We're going round in circles - without the blinded tests there is no point considering possible mechanisms.
1) prove that audible differences exist,
then
2) find the mechanism.

Bingo. You got it. First establish the audibility of the claim. Then proceed.

Alan Hope said:
So, when I listen to a full orchestra, and find that one cable gives one component of the sound (upper register strings) a very slight - but persistent between recordings - "glassy" tone which sounds unnatural. But another cable doesn't.

Wait, I thought we just....

Alan Hope said:
How do I measure that difference using today's technology?

Uh, LCR is today's technology. Yesterday's too.

Jakob2 said:
Lipshitz has shown that variations of around 0.1 dB can be audible. Phase differences might also have an influence, as jneutron pointed out in an earlier post with regard to ITD, a point somewhat backed up by the findings of Paul Frindle, who found amplitude variations below 0.1 dB to be audible, since localization depends on both phase and amplitude differences.

Correct, all of which are accounted for by LCR and all of which will be extremely subtle barring grotesque differences in LCR (hence my DBT threshold comment).
None of this AWOL bass and instruments-localized-in-the-neighbor's-back-yard business. Those descriptions will be made by our heroes even when the cables have identical LCR but different appearance (hence the absolute worthlessness of subjectivist tales).
Also, as I noted to JNeutron, spatial accuracy on recorded material is a hazy affair (Now where are my cables?;) ).
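To put numbers on "accounted for by LCR": model the interconnect as its lumped R, L and C sitting between the source's output impedance and the next stage's input impedance, and look at how much level and phase error that produces across the audio band. The component values below are assumed typical figures, not measurements of any real cable:

```python
# Level and phase error across the audio band for a lumped interconnect
# model: source impedance -> series R, L -> shunt C -> load impedance.
import numpy as np

r_src, r_load = 100.0, 47e3                 # ohms (assumed preamp out / amp in)
r_cab, l_cab, c_cab = 0.1, 1e-6, 150e-12    # ohms, henries, farads (assumed 1 m cable)

f = np.logspace(np.log10(20), np.log10(20e3), 200)
w = 2 * np.pi * f
z_series = r_src + r_cab + 1j * w * l_cab
z_shunt = r_load / (1 + 1j * w * r_load * c_cab)    # load in parallel with cable C
h = z_shunt / (z_series + z_shunt)
h0 = r_load / (r_src + r_cab + r_load)              # response with an ideal (zero L, C) cable

level_err = 20 * np.log10(np.abs(h) / h0)
phase_err = np.degrees(np.angle(h))

print(f"worst level error: {np.max(np.abs(level_err)):.2e} dB")
print(f"worst phase error: {np.max(np.abs(phase_err)):.2e} degrees")
```

With values like these, both errors come out vanishingly small, which is the point: grossly mismatched LCR is measurable and explainable, and ordinary LCR leaves nothing for the golden ear to find.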

cheers,

AJ

p.s. Ears "hear" ghosts, spirits and other sonic apparitions that microphones cannot (with "Today's technology"), so judge their relative sensitivity accordingly.
 