John Curl's Blowtorch preamplifier part III

Status
Not open for further replies.
Digital SPDIF out is about 110 ohms. Video was supposed to be a 75R standard, and baseband goes up to 5 MHz anyhow. So you should be able to see the difference between video cables. I have a nice Monster video cable that I use for the modulation input on a generator. It was one of the better video cables I had used. They made nice audio cables too, but stay away from the turbine cut shield types.

-Chris

I just went through the manual for a DVD player/recorder which has standard line video in/out and YPB component out to see if it listed the specs, and all of those are rated at 75R impedance. The audio input is 47K and the output 1K; no surprise there. I have a couple of yellow RCA plug video cables that are so thin that if they are rated as 75R cable I wouldn't believe it.
 
I understand the need for a proper impedance matching connector/cable assembly when transferring Mb/s signals.

For audio frequencies I'm more concerned with shielding. For that reason I normally use RG-174 (50 Ohm) with RCA connectors.

I do use RG-179 with RCA when making S/PDIF cables. Although truth be told, I have not seen any difference in short S/PDIF runs when I used RG-174 instead of RG-179, as I was out of RG-179 at the time.
 
Administrator
Joined 2004
Paid Member
I normally use RG-58cu for audio and baseband signals not spec'd for 75R. For 75R I would generally buy the cable ready-made, but one that is built with quality.

For audio, you are really concerned mostly about the shield. Low capacitance is nice, within reason. I have used RG-174 for audio. Now I use it inside equipment and for test leads on the bench. BNC-to-BNC cables are always RG-58cu.

-Chris
 
Member
Joined 2004
Paid Member
AES/EBU was 110R, while SPDIF was specified at 75R according to IEC 60958.
The specification wasn't that tight: the cable shall be 75 ±26.x while the line driver output impedance shall be 75 ±15, for characteristic impedance between 0.1 MHz and 128 times the maximum sampling frequency.
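For a feel of where that upper bound lands, here is a quick sketch (the rates below are just the common consumer/pro sampling frequencies, not anything from the standard text quoted above):

```python
# Upper edge of the band over which characteristic impedance is specified:
# 128 x the maximum sampling frequency, per the post above.
for fs in (44_100, 48_000, 96_000):
    upper_mhz = 128 * fs / 1e6
    print(f"fs = {fs / 1000:g} kHz -> spec band up to {upper_mhz:.3f} MHz")
```

So at 48 kHz the impedance spec runs from 0.1 MHz up to 6.144 MHz, comfortably inside what ordinary 75R video cable is characterized for.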

The wonderful history of AES/EBU: it started with a low source Z and a high (10K) load Z. Then they discovered that certain lengths of microphone cables (the reason for the odd spec) simply did not work, as in the reflections essentially canceled out the signal. Quick regroup, and an appropriate spec was hatched. The 75 Ohm SPDIF (and AES3) option was meant to use existing video cabling in studios. Consumer video has used RCAs forever, or since baseband composite video was let out of a box in the 1960s, I think. NTSC had no real issues with RCA mismatch. In a studio setting it was an issue with long cables, daisy chaining, and visibility of the reflections on a decent monitor.

F connectors are cheap to deploy and work reasonably well to 1 GHz. With care they work to over 2 GHz. They also can do a good job of keeping the RF in the cable. Cable operators (Comcast et al.) use quad-shielded RG6 to keep the RF in and the FCC at bay. The European "PAL" connector (or whatever other name it also has) works OK for RF, but no better than F, and will not have as good shielding. The screw-down coupling really does help.

Most consumer cables are major compromises but do work. The latest generation (HDMI 2.1 and USB-C) are really pretty well engineered and deliver astonishing performance for such cheap stuff. Not sure how long one's eyesight will last assembling them, however.
 
20to20 said:
I have a couple of yellow RCA plug video cables that are so thin that if they are rated as 75R cable I wouldn't believe it.
Cable characteristic impedance comes from the ratio of the inner and outer conductor diameters (and the dielectric), not the absolute size, so a very thin 75R cable is quite possible.
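To put numbers on that: for coax with a solid dielectric, the usual approximation for characteristic impedance depends only on the shield-to-center diameter ratio and the dielectric constant, so scaling the whole cable down leaves Z0 unchanged. A minimal sketch, assuming solid polyethylene with εr ≈ 2.25 (the ratios below are illustrative):

```python
import math

def coax_z0(d_ratio, eps_r):
    """Approximate characteristic impedance of coax from the
    shield/center diameter ratio D/d and the relative permittivity
    of the dielectric: Z0 = (138 / sqrt(eps_r)) * log10(D/d)."""
    return (138.0 / math.sqrt(eps_r)) * math.log10(d_ratio)

# Solid PE (eps_r ~ 2.25): D/d ~ 3.5 gives ~50R, D/d ~ 6.5 gives ~75R,
# at any absolute size -- hence very thin 75R cable is entirely plausible.
for ratio in (3.5, 6.5):
    print(f"D/d = {ratio}: Z0 ~ {coax_z0(ratio, 2.25):.0f} ohms")
```

The trade-off is that shrinking everything raises the loss per meter, which is why the thin moulded cables feel cheap even when the impedance is right.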

1audio said:
The European 'PAL" (or whatever other name it also has) works OK for RF but not better than F and will not have as good shielding.
That is possibly what we in the UK call Belling-Lee. Nothing to do with PAL. B-L is used for domestic TV and FM radio antenna connections, and on some ancient RF signal generators. It is also sometimes seen as a high-impedance microphone connector on really old PA systems.
 
The wonderful history of AES/EBU- it started with a low source Z and high (10K) load Z. And then they discovered that certain lengths of microphone cables (the reason for the odd spec) simply did not work, as in the reflections essentially canceled out the signal.

How does that work? If you have a match-terminated drive there is no problem. Do you mean shorted at the source and open at the end, and real engineers passed on this?
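For scale on why the original low-source-Z / 10K-load arrangement was trouble: the voltage reflection coefficient at the load is Γ = (ZL − Z0) / (ZL + Z0), so a 10K bridging load on a 110R line sends nearly the whole incident wave back down the cable. A quick sketch (the 110R line impedance is the AES/EBU figure from earlier in the thread):

```python
def gamma(z_load, z0):
    """Voltage reflection coefficient at a load z_load
    terminating a line of characteristic impedance z0."""
    return (z_load - z0) / (z_load + z0)

# 10K bridging load on a 110R line: near-total reflection
print(f"10K load on 110R line: gamma = {gamma(10_000, 110):.3f}")
# Properly terminated in 110R: no reflection
print(f"110R load on 110R line: gamma = {gamma(110, 110):.3f}")
```

With a low-impedance source on top of that, the reflected wave bounces again at the driver end, so energy keeps ringing back and forth until cable loss kills it.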
 
Member
Joined 2004
Paid Member
Yes, that was my understanding. Of course after 25 years it may all be apocryphal.

The goal of reusing existing cables is understandable. Same for Cat 3 through Cat 7, which started as an effort to reuse existing phone lines for networking. It never really worked, but it still made sell-in easier.
 
Yes, that was my understanding. Of course after 25 years it may all be apocryphal.

The goal of reusing existing cables is understandable. Same for Cat 3 through Cat 7, which started as an effort to reuse existing phone lines for networking. It never really worked, but it still made sell-in easier.

The AES standard started as a BBC project to run digital audio over the existing cable plant. It has been modified as its use expanded. Most importantly, it was designed to work over normal audio twisted pair, which is still the most common use. I have used it in stadium projects to distribute signals in digital form from digital sources, through the console, to the amplifier rooms distributed throughout the venue.
 
Member
Joined 2004
Paid Member
I guess the story isn't completely apocryphal:

"In fact the original 1983 specification allowed up to a 2:1 mis-match of the line characteristics and this gave a certain flexibility to "loop through" receivers, or use multiple links radiating from transmitters. This concept was based on the theory that lossy PVC analogue audio cables would be used and it was predicted that:
· reflections in short cables were unlikely to interfere with the edges of the signal, due to the short delays involved,
· reflections in longer cables were likely to be attenuated so much that they would not significantly interfere with the amplitude and shape of the signal at a receiver.
In practice, however it was soon found that problems occurred with an open ended spur which happened to have an effective length of half a wavelength at the frequency of the "one" symbol. This length is also a quarter wavelength for the frequency of the "zero" symbol. This condition causes the maximum trouble for the signal characteristics on any connection in parallel with the spur."

So even engineers who are supposed to understand this stuff guessed too loosely. I have been burned for taking a few shortcuts, but not this badly.
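Putting numbers on the failure mode in that quote: AES3 at 48 kHz runs at 64 × fs = 3.072 Mbit/s, and with biphase-mark coding a run of "ones" looks like a square wave at the bit rate while a run of "zeros" looks like one at half that. A half wavelength at the "one" frequency is the same physical length as a quarter wavelength at the "zero" frequency, exactly as the quote says. A sketch, assuming fs = 48 kHz and a typical cable velocity factor of 0.66:

```python
C = 299_792_458  # speed of light in vacuum, m/s

def wavelength_m(freq_hz, velocity_factor=0.66):
    """Wavelength in cable for a given frequency, scaled by the
    cable's velocity factor."""
    return velocity_factor * C / freq_hz

fs = 48_000
bit_rate = 64 * fs        # AES3 frame = 64 bits -> 3.072 Mbit/s at 48 kHz
f_one = bit_rate          # biphase-mark: run of ones ~ square wave at bit rate
f_zero = bit_rate / 2     # run of zeros ~ square wave at half the bit rate

half_wave_one = wavelength_m(f_one) / 2
quarter_wave_zero = wavelength_m(f_zero) / 4
print(f"half wave at 'one' frequency:     {half_wave_one:.1f} m")
print(f"quarter wave at 'zero' frequency: {quarter_wave_zero:.1f} m")
```

Both come out around 32 m, which is a perfectly ordinary studio spur length, so hitting the bad condition in practice was not a freak occurrence.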
 
In fact the original 1983 specification allowed up to a 2:1 mis-match of the line characteristics and this gave a certain flexibility to "loop through" receivers, or use multiple links radiating from transmitters.

This whole thought process is amazing; even the worst 50-cent "yellow" video cable included with every VCR could give horrifying images if attention to termination was totally ignored. The losses per se are not particularly relevant at a meter or two.
 