Does the cable impedance matter when making audio level measurements?

A question has come up at work, and I couldn’t find a good answer so I thought I would ask here:

Does the cable type, or cable impedance, matter when taking audio level measurements?

For instance, if I am measuring the audio level coming out of a radio receiver, I will use a transmission test set that terminates the audio line in a 600 ohm impedance. Does it matter if I connect the test set to the receiver with coax, twisted pair, a piece of CAT5 wire, etc.? Does it matter if it is 50 ohm coax or 75 ohm coax?

My opinion and experience is that at audio frequencies the impedance of the cable doesn't matter, because at audio frequencies the test cable isn't acting as an RF transmission line.

Any thoughts would be appreciated, or better still, authoritative statements or references. One of my coworkers is insisting that you must use 75 ohm coax for connecting up the audio test equipment, because that is what the old-timer who taught him radio maintenance said. My experience is that it doesn't matter.

Thanks for the help.

Ben
 
Short answer: No.

The electrical wavelength (not to be confused with the acoustical wavelength) of 20 kHz is 15 km. Unless your patch cables are longer than, say, lambda/10, or 1.5 km, I wouldn't worry about it. If you wanted to match the drive impedance to the characteristic impedance of the cable you'd use 600 ohm cable (or a 50/75 ohm driver and 50/75 ohm cable). Using 75 ohm cable rather than 50 ohm in a 600 ohm system will not make any practical difference; neither impedance is a good match. Besides, any reflections caused by the cable mismatch will die out within a few hundred nanoseconds at most and will not show up in your audio measurements regardless of the cable impedance.
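A quick back-of-the-envelope check of that rule of thumb (a minimal sketch; the 10 m lead length is just an assumed example, and the cable's velocity factor is ignored):

```python
# Rough check of the wavelength argument at the top of the audio band.
c = 3e8           # speed of light, m/s (cable velocity factor ignored)
f = 20e3          # 20 kHz, top of the audio band
wavelength = c / f              # about 15 km
cable_length = 10.0             # assumed test-lead length, m

print(f"Wavelength at 20 kHz: {wavelength:.0f} m")
print(f"A {cable_length:.0f} m lead is {cable_length / wavelength:.1e} wavelengths")
print(f"Rule-of-thumb limit (lambda/10): {wavelength / 10:.0f} m")
```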

What you do have to worry about (to some extent) is the cable capacitance. If you're using long cables you might load the driver with several nF of capacitance. That will impact performance and can cause real trouble with instability. But 50 ohm coax is likely to have a similar capacitance per length as 75 ohm coax. It'll be slightly different because the dielectric is different, but not by enough to matter for practical purposes: RG-58 (50 ohm) is about 100 pF/m; RG-59 (75 ohm) is about 68 pF/m.
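To put rough numbers on that (a sketch only; the per-metre figures are the ones quoted above and the lead lengths are arbitrary assumptions):

```python
# Total cable capacitance of the two coax types for a few assumed lead lengths.
cap_per_m = {"RG-58 (50 ohm)": 100e-12, "RG-59 (75 ohm)": 68e-12}  # F/m

for length_m in (1.0, 10.0, 50.0):
    for name, c_per_m in cap_per_m.items():
        c_total_nF = c_per_m * length_m * 1e9
        print(f"{name}, {length_m:>4.0f} m: {c_total_nF:5.2f} nF")
```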

~Tom
 
In any case, cables (transmission lines) don't have a single characteristic impedance in the audio band. The formula we often see for characteristic impedance, Z0 = sqrt(L/C), is a simplified version of the complete expression Z0 = sqrt((R + jωL)/(G + jωC)). The simplification works well at radio frequencies, where ωL is much larger than R, but not at all at audio frequencies, where R dominates and Z0 becomes frequency dependent and much larger than the nominal 50 or 75 ohm figure.
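As a rough illustration (a sketch only; the per-metre R, L, C, G values below are assumed figures loosely typical of 75 ohm coax, not datasheet numbers, and skin effect is ignored):

```python
# |Z0| = |sqrt((R + jwL) / (G + jwC))| versus frequency for assumed
# per-metre values loosely typical of 75 ohm coax (skin effect ignored).
from math import pi
import cmath

R = 0.05      # ohm/m, series resistance (assumed)
L = 370e-9    # H/m (assumed; gives sqrt(L/C) of roughly 74 ohm)
C = 68e-12    # F/m
G = 1e-11     # S/m, dielectric leakage (assumed, essentially negligible)

for f in (100.0, 1e3, 10e3, 1e6, 100e6):
    w = 2 * pi * f
    z0 = cmath.sqrt((R + 1j * w * L) / (G + 1j * w * C))
    print(f"{f:>11,.0f} Hz: |Z0| = {abs(z0):7.1f} ohm")
```

With those assumed numbers the magnitude comes out around 1 kohm at 100 Hz and only settles near 75 ohm well above the audio band, which is the point being made here.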
 
At audio frequencies the RF characteristic impedance of a cable is virtually meaningless, and irrelevant for all but the very longest cable runs. For any reasonable length it is the cable capacitance that matters; how much it matters depends on the output impedance of whatever is driving the cable.

Note: I really mean output impedance, not the rated load impedance. For example, an audio preamp's spec may say it can drive an amplifier with an input impedance of 10 kΩ or more. That 10 kΩ is not the output impedance of the preamp, which will probably be well below 1 kΩ. Apologies if you already knew this; many seem not to.
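For example (a sketch only; the output impedances and the 10 m of ~100 pF/m cable are assumptions chosen to illustrate the point):

```python
# Low-pass corner formed by the driving stage's output impedance and the
# cable capacitance. All values are illustrative assumptions.
from math import pi

c_cable = 10.0 * 100e-12   # 10 m of ~100 pF/m cable -> 1 nF

cases = (("100 ohm output impedance", 100.0),
         ("1 kohm output impedance", 1e3),
         ("10 kohm (the rated load, not an output impedance)", 10e3))

for label, r in cases:
    f3db = 1 / (2 * pi * r * c_cable)
    print(f"{label:<50s}: -3 dB at {f3db / 1e3:7.1f} kHz")
```

With the assumed 1 nF of cable, even a 1 kohm output impedance keeps the corner at roughly 160 kHz, well above the audio band; only if you wrongly plug in the 10 kohm rated load figure does the roll-off appear to land inside it.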
 