DIY SPDIF cable

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
If you want to make a good "low jitter" SPDIF cable, here are my recommendations:

- keep the impedances matched... USE 75 OHM CABLE! CAT5 is 100 ohms, which means you'll certainly get reflections on the cable.

- Connectors usually don't have impedance specs, unless they're physically large connectors that are designed to move high frequency signals. If you're using largeish connectors (eg the right-angle PCB-mount BNC's from AMP that are about 1.5" long) then make sure you use the 75 ohm ones. If the connectors are physically small and don't have a spec, then don't worry about it.

And F connectors are good! The ones on the back of Chinese TV sets aren't, but on most head-end equipment at cable companies you'll find super-high-quality, sometimes even gold-plated F connectors.

- For long cables especially, make sure the cable you're using has good shielding and little attenuation. You want the signal at the end of the cable to have as high an SNR as possible; after all, introduced noise can push around the detected edge of a clock.
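The mismatch in the first bullet can be put in numbers. A minimal sketch (Python, using the thread's 75 ohm system and 100 ohm CAT5 figures) of the voltage reflection coefficient at an impedance step:

```python
# Reflection coefficient at an impedance discontinuity:
#   gamma = (Z_load - Z0) / (Z_load + Z0)
# Numbers from the thread: a 75 ohm SPDIF system driving 100 ohm CAT5.

def reflection_coefficient(z0, z_load):
    """Fraction of the incident voltage wave reflected at the junction."""
    return (z_load - z0) / (z_load + z0)

gamma = reflection_coefficient(75.0, 100.0)
print(f"{gamma:.3f}")  # about 0.143, i.e. roughly 14% of each edge bounces back
```

So roughly a seventh of every transmitted edge is reflected at a 75-to-100 ohm transition, which is why CAT5 is a poor substitute for real 75 ohm coax here.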
 
- Connectors usually don't have impedance specs, unless they're physically large connectors that are designed to move high frequency signals. If you're using largeish connectors (eg the right-angle PCB-mount BNC's from AMP that are about 1.5" long) then make sure you use the 75 ohm ones. If the connectors are physically small and don't have a spec, then don't worry about it.

Hello............ you need to check out the parts catalog for BNC's
 
jewilson said:


Hello............ you need to check out the parts catalog for BNC's
Find me an RCA jack with an impedance spec ;)

BNC connectors almost always have impedance specs, because they're used well into the GHz range where a mismatched connector impedance can become measurable.

What I meant to imply was that at ~6-7MHz SPDIF frequencies, the connector impedance doesn't mean a whole lot unless the connector is large (eg the AMP connector I mentioned). Of course re-reading my post, this isn't all that obvious. I shouldn't post before my morning coffee... :D
 
Jocko Homo said:
You can hear the effects of 50 ohm connectors in a 75 ohm SPDIF setup.

Type F connectors can look good.......since the centre conductor is used for the pin in the male connector. A technique used for a lot of microwave stuff.

RCAs measure in the 25-30 ohm range.

Jocko
Really... I haven't done any calculations on this (maybe I should), but the impedance mismatch caused by a 1 inch/50 ohm connector at 6-7MHz will be *very* small. I'd expect a much greater audible effect from the tolerance of the 75 ohm terminating resistor in the receiver than from the impedance of the connector.

Which brings up an interesting point; would replacing the resistor on the input of a DAC create an audible improvement? And has anyone ever hooked up a vector impedance meter to the SPDIF input on their DAC? (which someone really should!)
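The "very small" claim above can be backed with a rough calculation of how electrically short a 1 inch connector is at SPDIF frequencies. The 6 MHz frequency and 0.66 velocity factor below are illustrative assumptions, not figures from the post:

```python
# How many wavelengths long is a 1 inch connector at SPDIF frequencies?
# Assumed illustrative numbers: 6 MHz fundamental, velocity factor 0.66.

C = 299_792_458.0          # speed of light, m/s
vf = 0.66                  # typical coax velocity factor (assumption)
f = 6e6                    # SPDIF-band frequency, Hz (assumption)

wavelength = vf * C / f    # wavelength inside the dielectric, metres
connector = 0.0254         # 1 inch in metres

fraction = connector / wavelength
print(f"wavelength ~ {wavelength:.1f} m, connector = {fraction:.5f} wavelengths")
```

At about 33 m per wavelength, a 1 inch connector is under a thousandth of a wavelength long, so even a badly mismatched connector barely perturbs the line at these frequencies.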
 
Yes, I know......

My training as an engineer says that at frequencies below 300 MHz or so, it should not matter that much.

But......my ears.....those pesky things that can confuse us, tell me that you can hear it.

Vector meter?.......yeah, you could. But that is what TDRs do best.

And, yes......I have.......countless times over the last 13 years.

Most inputs look nowhere near 75 ohms....even the ones with 75 ohm terminations.

Jocko
 
Keep it simple

You don't need to do any very esoteric measurements or calculations to see the reflections from a 50 ohm BNC, or even worse a "75 ohm RCA" (there is no such thing, though the new WBT RCA connector for video and digital coax looks like a pretty good shot at it).

I have often used a method I call "real-life TDR." Make your 75 ohm coax about 10 feet or more and measure the reflection coming back from the load at the source. The longer cable gives the reflection enough delay to clearly separate it from the incident waveform. Drive this setup with a circuit that uses a similar logic family and rise times. You can easily see enough detail with a 100 MHz scope to evaluate termination, connectors, impedance mismatch and wiring effects. There is a lot to be said for testing with the waveforms the circuit will see in actual use rather than interpolating from a test signal that is much faster than the one it will actually see. I have seen Jocko do TDR, but I lack his familiarity with the test setup, and correlating the resulting waveform with the logic rise time seen in the actual interface is not something I possess. In this light I often feel like I am trying to tell what type of tree I am looking at by placing my eyeball about an inch from the bark, without the benefit of being a tree surgeon.
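As a sanity check on the "real-life TDR" method, a quick calculation of the round-trip delay shows why the longer cable helps. The 10 feet of cable and 0.66 velocity factor here are illustrative assumptions:

```python
# "Real-life TDR": round-trip delay of a reflection on a length of coax,
# to check that a 100 MHz scope can separate it from the incident edge.
# Assumed illustrative numbers: 10 ft (3.048 m) of cable, velocity factor 0.66.

C = 299_792_458.0
vf = 0.66
length_m = 3.048                     # 10 feet

one_way = length_m / (vf * C)        # propagation delay, seconds
round_trip = 2 * one_way             # source -> load -> source
print(f"round trip ~ {round_trip * 1e9:.1f} ns")
```

Roughly 31 ns of separation is comfortably resolvable on a 100 MHz scope (whose own rise time is a few nanoseconds), so the reflected step shows up as a distinct feature after the incident edge.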

Call me MacGyver, but I like to make measurements with stuff that is pretty easy to get hold of or build: a real handy approach for the typical DIYer who has a good sound card, spectrum analyzer software from the web, and a scope, and has never even seen an Audio Precision test setup in real life. With much of the test equipment out there today, I can build a test setup from stuff around the lab faster than I can figure out the user interface software for the test equipment. I still have a theory that HP test equipment was so difficult to use because they felt that if you couldn't figure out the user interface, you were not worthy of their equipment. The knuckleheads just can't get it through their heads that you might only use the equipment every few months, and don't want to spend four times as long reading the manual as making the measurement.
 
I always believed that our ears lie to us all the time, and as such are not to be trusted as a "final authority" on much.

Case in point: I have a friend who fancies himself a real audiophile. Just to see, I gave him a spiel about some cables, spewing pop Quantum Theory and other such nonsense. I then did a "blind test" (he couldn't see) where I played a CD track using 3 different sets of cables. The first set was a regular, reasonably high quality set used from the CD to the DAC, and a nice set of RCAs from the DAC to the pre-amp. He was told that this was his "reference cable" -- i.e. the one that he used normally.

We did not tell him what the other two sets were, but for the next set we let "slip" a comment about cheap connectors while installing a $3 set of cables from Radio Shack. His comments were about the lack of detail and harshness of the sound. Okay, maybe these cables WERE inferior...

The third set was YET ANOTHER $3 SET. Here I muttered something about "careful with that one" while connecting it. My friend couldn't gush enough about the "openness" of the sound, how superior in detail this was, and how these cables brought his system to life. Was he EVER ****ed at us when we removed the blindfold and he saw those cheap things attached to his DAC. My partner in crime videotaped the entire "listening test".

Just an illustration of how our ears can lie to us. The placebo effect is alive and well in the audio world. Many a testimonial can be found insisting that some crap which has absolutely zero measurable effect can make a dramatic improvement in sound quality. Many more can be easily found stating that some product which has a distinct and measurable negative effect makes huge improvements. Subjective improvements seem to be directly proportional to the cost of the product.

For the subject at hand...

Cables should have as low an impedance as possible. I used to deal with coax network cable all the time, and the "good" cable always had very low impedance. Lower impedance cable could be used to make longer runs, and had lower instances of packet errors. Connectors had as close to zero impedance as possible, regardless, and for the on-board stuff, that wasn't something that could necessarily be "matched." But for the record, I've disassembled a number of old network cards, and there was no kind of matching on the BNC mounts, it just got the wire to the PCB, where the signal was routed to a decoder.

They were called "50 Ohm" cabling because there were 50 Ohm terminating resistors on either end of the chain to prevent reflections from causing more packet errors. I was so glad to rip out my last client's coax network years ago, and replace it all with cat5. :D
 
HUH?????????

How can a connector have "as close to zero impedance as possible"? There is no such thing.

And why does cable TV use 75 ohms, instead of 50 ohms??? More loss, according to your logic......................

But you are right about the ears. Back when I built stuff full-time, I needed to find testers who actually could hear, but knew nothing about what I was doing to prevent bias.

The number of candidates who thought that they could hear a difference when there wasn't one was annoying.

Just because they heard me throw a switch didn't mean that I was actually switching the signal.

(I wasn't. It was on an idle piece of equipment. Yeah, I am a PITA!)

Jocko
 
Maybe a dumb but related question: if I have a DAC that is small enough to fit into a transport chassis, what would happen if I hardwired SPDIF out to SPDIF in - no connectors, just a short 6" coax cable maybe?

I'm sure there are better ways to connect DAC to transport if it's in the same box, but since I have no idea how to tap into the signals and how to mod the DAC to take direct I2S or whatever it's called, I am really limited to using SPDIF.

Somewhere I read that 1.5 meters is the ideal coax cable length for SPDIF due to reflection decay, etc. How would the short direct link without connectors do in that respect? Or is it really the connectors that are the issue, and can I bypass the whole problem by hardwiring the DAC to the transport, short or long cable?

Peter
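For what it's worth, the timing argument behind a "magic" cable length can be sketched numerically: it comes down to when the reflection arrives back relative to the transmitted edge. The 0.66 velocity factor is an illustrative assumption; whether a given round trip lands inside or after the edge depends on the transmitter's rise time, which varies by unit:

```python
# When does a cable reflection return relative to the transmitted edge?
# Round-trip delay for a few cable lengths, including the oft-quoted 1.5 m.
# Velocity factor 0.66 is an assumed typical value for coax.

C = 299_792_458.0
vf = 0.66

for length_m in (0.15, 1.5, 5.0):            # ~6", 1.5 m, 5 m
    round_trip_ns = 2 * length_m / (vf * C) * 1e9
    print(f"{length_m:>4} m: reflection returns after {round_trip_ns:5.1f} ns")
```

A 1.5 m run gives a round trip of about 15 ns; whether that falls within the edge's transition (and therefore gets "absorbed" by it) or arrives afterwards depends entirely on how fast the transmitter's output actually switches.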
 
Jocko Homo said:
Who came up with 1.5 m??? The connectors are not the problem, and the worst sounding D/A box that I made had a 6" coax into it........

If they are in the same box...........there are ways around using a PLL to screw up the clock. That is the problem with SPDIF.

Jocko

so if 6" was bad, maybe there's something about the 1.5 meters?

this is where I saw it :

http://www.positive-feedback.com/Issue14/spdif.htm
 
TV uses 75R cable for historical reasons. Antenna co-ax cable is 75R simply because that's a good match to the impedance at the centre of a resonant half-wave dipole (similarly, 300R twin-feeder matches a resonant folded half-wave dipole). They simply carried on using antenna cable even for the baseband video links.

You can detect the difference between 50R and 75R co-ax, even over short lengths using a TDR, but it is hardly possible to see the impedance mismatch between 75R and 50R BNC connectors.

Like many other people, I hate the idea of using RCA (phono) connectors for digital connections, but they're here to stay for a while by the look of it. My choice would be a gold-plated screw-lock SMA, but who would be willing to pay the price?

Chris Morriss.
 
Re: HUH?????????

Jocko Homo said:
How can a connector have "as close to zero impedance as possible"? There is no such thing.

And why does cable TV use 75 ohms, instead of 50 ohms??? More loss, according to your logic......................

----------------------------------------------------------------------------------

All TV and video in Europe use 75 Ohm. Pity they use crap ultra cheap connectors though.

Koko, thought you were abandoning this forum?



:bigeyes: :bigeyes:
 
I'm also currently messing with digital interconnect problem.

At the moment my setup is following

Sony CDP-XB930 SPDIF tapped from transport PCB -> divided down to the proper amplitude by a resistor network -> LM6171 video opamp as unity gain buffer with 75 ohm resistor in series feeding the 75 ohm BNC connector -> 75 ohm coax cable -> straight to Behringer DCX2496 DSP board -> Schott transformer -> 75 ohm resistor between secondary windings -> Crystal CS8420 receiver IC.

First I tested the setup with the digital cable not soldered to the DAC, but instead to a BNC connector with a 75 ohm resistor to ground. This connector went straight to the oscilloscope input. When proper synchronisation was found, the signal looked very good with minimal overshoot and no visible reflections.

I tried the setup with a different transport also (a cheap Technics player). A BNC->RCA plug was used and the SPDIF output on the Technics was in its original form. The sound of the two systems was very clearly different. My own setup had much more air and a deeper soundstage - so I must be on the right track.

Thank you goes to Jocko also. I was bombing his e-mail box a while ago picking his brain :) for possible implementations.

I guess there would be room for improvement especially in the RX end, and I intend to do it as soon as I get the analog stage on par with the current level of the digital end.

Regards,
Ergo
 