TOSLink Cables All the same? Any degradation over Length?

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
Hi,

I'm more than willing to admit I don't know what I'm talking about... but that's the reason for my questions:

Why would I spend more $$$ for a "premium" TOSLink cable over a budget cable? Is there really going to be a difference? I mean, this is *digital* information over *fiber optic* cable... is there really going to be any signal loss whatsoever over this medium?

Why are TOSLink cables advertised as "shielded?" Shielded from what? Some even say they are RF shielded... what? How could radio signals interfere with *light*?

Someone please set me straight, because if my understanding is correct there are a lot of marketing gimmicks and shams going on with these cables, and I'm inclined to buy the cheapest cable I can find. It seems the manufacturers are exploiting consumers' expectations that these cables will suffer from the same problems affecting traditional copper wire cables that carry analog electrical signals.

This is a Home Theater/Audio issue that has always bugged me, and I've come here for an explanation.

Oh - on a related note... would the signal be adversely affected when it's running over, say 50' of TOSlink instead of 3'? I mean, wouldn't the same lossless signal make it that far undegraded? (Or am I incorrect to assume that digital TOSlink information is lossless?)

-Schmanthony
 
Toslink

The only plausible thing I've read that could make a difference with Toslink is that bad ones might exacerbate jitter. Jitter apparently can be measured. The only question is whether it is audible. I'll leave that for others.

RF shielding is IMHO like de-magnetizing CDs: manifestly silly.

Barring jitter, unless an optic cable is inducing more errors than the error-correcting algorithms can deal with, they ought to all sound alike. Bits is bits.

What may be a more realistic concern is fragility. You can break the suckers. At least if this happens the effects won't be debatable; you just won't get sound. Nice thing about digital: it works or it doesn't (barring jitter).

BTW, I bar jitter because I'm not techie enough to talk intelligently about it. I'll grant that it's been measured, but whether it is audible is still an open issue with me. I doubt it, but I'm willing to consider the possibility.
 
Shielding = B***********(X) as long as it is not optical!

Every digital circuit creates noise = jitter.

How much BETTER is one particular transmission technique? I don't know.

I use TOSLINK with a very cheap nylon fiber (3 mm) and it sounds splendid together with my DAC.

BTW: TOSLINK works over at least 25 meters with normal operating currents and a simple nylon fiber. The light is noticeably weaker than with a short fiber.

Don't forget that many DACs "eat" jitter, like mine (CS8402, CS4328).
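The point about the light getting weaker with length can be put in rough numbers with a link budget. A minimal sketch, assuming illustrative figures for launch power, receiver sensitivity, connector loss, and plastic-fibre attenuation (none of these are TOSLINK datasheet values):

```python
# Rough optical link-budget sketch. All figures here (launch power,
# receiver sensitivity, fibre loss) are illustrative assumptions,
# not TOSLINK datasheet values.

def max_length_m(tx_dbm, rx_sens_dbm, loss_db_per_m, connector_loss_db=2.0):
    """Longest run before received power drops below what the receiver
    can slice cleanly - the point where the link just stops working."""
    margin_db = tx_dbm - rx_sens_dbm - connector_loss_db
    return margin_db / loss_db_per_m

# e.g. -6 dBm launched, -13 dBm sensitivity, 0.2 dB/m plastic fibre
print(f"approx. max run: {max_length_m(-6.0, -13.0, 0.2):.0f} m")  # ~25 m
```

With these made-up numbers the link dies around 25 m, consistent with the 25-35 m ballpark quoted in this thread; real limits depend entirely on the actual transmitter, receiver, and fibre.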

(X) I tried to write "tjurskit" without the "ullshi", but even that is filtered out! OK, my mouth is filthy. I'll change to "dumheter" (Swedish for "nonsense") instead. Since we discussed language elsewhere, everyone understands when I write in Swedish ;) .
 
jitter

Hmmmm.... jitter,

All I know about that (very little) has to do with ripping audio CDs. When your CD-ROM is reading, bit by bit, data that is supposed to be streamed (I think...), a dirty, scratched, or warped disc is likely to cause significant distortions in the transfer process - much more than what you would hear if the data were actually streamed, since the laser in a CD player would just skim through it. Your CD-ROM drive in read mode, on the other hand, tries to read those "dirty" bits one by one and loses track or something.

I'd be willing to bet that jitter in CDs is usually inaudible... and the same probably goes for DVD audio as well. If you ever hear anything wrong at all, then it's probably *really* bad and no cable will save you from that.

It seems like the first thing to "go" on a less-than-perfect DVD is usually the picture. I've never heard the audio start crumbling before the picture does.

So you might have a point about better TOSlink cables inducing fewer errors, which could be additive with the errors coming from the reader, like there might be from jitter. I kind of doubt it, though. I also have a feeling it wouldn't be economical for manufacturers to engineer different "grades" of fiber for consumer home audio. However, engineering marketing gimmicks has the potential to be highly economical! I think they'd just make one product that works. (Like you said, either it works or it doesn't.) I'll bet that any dressing up the premium product gets is for sheer profit margin and nothing else.

I really want someone to tear me apart and tell me why I'm wrong. Until then I'll be buying the dirt-cheapest TOSLink cable I can, and be happy knowing I'm doing no worse than the guy who paid 3-4 times as much for that monster-gold-super-shielded-premium-deluxe malarky.

-Schmanthony
 
Re: 25m

Schmanthony said:
peranders,

Yeah, now that I think about it the light would get weaker further away and there would be more error in the signal.

But you think up to 25m would be OK? If so that's good for me because I only need about half that for my current project.

-Schmanthony

You don't get more errors, but if you attenuate the light even more it simply stops working. It's very clear when the signal is too weak: very audible, and very easy to see on an oscilloscope. You don't lose ones and zeros in the same manner as from a bad CD. The transmission link is very "digital": it works perfectly or not at all.

10 meters is the guaranteed length with TOSLINK. I think the limit is around 25-35 meters with nylon. Glass fiber is better, but I don't know how much.
 
Schmanthony said:
Hi,

Why would I spend more $$$ for a "premium" TOSLink cable over a budget cable? Is there really going to be a difference? I mean, this is *digital* information over *fiber optic* cable... is there really going to be any signal loss whatsoever over this medium?

Why are TOSLink cables advertized as "shielded?" Shielded from what? Some even say they are RF shielded.... what? How could radio signals interfere with *light*?
------------------------------------------------------------------------
You should certainly try different Toslink cables, as there is a marked sonic difference, partially because of the optical interface. Some respected cables don't sound good to me at all!

Shielded can refer to shielding from microphony. There is also the claim that attention to minimising boundary reflections helps.

You can always convert your Toslink connection to SPDIF with a BNC connector.

 
sam9

Setting aside the question of "jitter" for a moment: if two data streams, after having error correction applied, are the same (identical), they will sound identical when fed to the same device. There is no room at all for subtle differences. The cable carrying one can be a $1,000 marvel and the other a piece of crap - only the bits count.

The only case I know of where someone reported feeding a digital signal into a device with an error rate monitor, the reported error rate was zero. The cable was a length of coat hanger!

Taking up "jitter" again, I was surprised to learn not too long ago that most DACs make no attempt to buffer and re-clock the incoming data stream. I would have thought this would be a fairly basic precaution, at least for $1k+ preamps and receivers, but I guess I'm a bit naive. So anyway, it seems it is possible for variations in the arrival time of the individual bits to induce distortion. I'm not aware that anyone has ever unambiguously demonstrated that the distortion is audible. What little reading I've done on "jitter" suggests that finding a way to measure or even confirm its existence was a real bear. This difficulty itself suggests the effect is quite small. While I can easily see how jitter could be introduced in the transmitter due to slight instability in the clock crystal, I've not seen anything plausible about how a toslink cable might speed up one bit and slow down the following one. I would expect each bit to be subject to the same interaction (if any) that every other bit is. (Maybe there is some sort of bunching interaction analogous to what happens in a linear beam tube, i.e., TWTs, klystrons. If so, I'd be fascinated to read the explanation.)
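The buffer-and-re-clock idea mentioned above can be sketched in a few lines: samples arrive with irregular timing, go into a FIFO, and are released at the steady rate of a clean local clock. This is only a toy illustration of the concept, not any real DAC's implementation:

```python
from collections import deque

# Toy sketch of buffering and re-clocking: incoming samples may arrive
# with jittery timing, but playback reads them out at a steady local rate.
class Reclocker:
    def __init__(self, depth=64):
        self.fifo = deque(maxlen=depth)  # bounded buffer; oldest drops on overflow

    def push(self, sample):
        """Called whenever a sample arrives from the (jittery) link."""
        self.fifo.append(sample)

    def pop(self):
        """Called on each tick of the clean local clock."""
        return self.fifo.popleft() if self.fifo else 0  # underrun -> silence

r = Reclocker()
for s in [10, 20, 30]:
    r.push(s)
print([r.pop() for _ in range(4)])  # [10, 20, 30, 0]
```

The buffer depth trades latency against tolerance for arrival-time variation, which may be one reason real-time audio gear is reluctant to use deep buffers.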

Conclusion (mine anyway): The chances that there are audible but non-catastrophic differences between toslink cables seem to me so small as to be safely disregarded. (Catastrophic means error correction fails and you get either silence or a horrendous blaat.) The chances of there being audible differences between toslink and high-grade glass fibre or coax seem slightly higher, only because there are more variables between different cable types, but "slightly higher" is still not likely to be significant, as in audible.

For the curious: I have swapped out toslink cables, and I thought I heard some difference. Price didn't have anything to do with it, but I noticed the brightly colored ones seemed to be the ones that sounded better! Being not without some sense of humour about my own human frailties, I've concluded that I was listening with my eyes. Once tucked out of sight and forgotten, everything sounds fine. I only worry when I look behind the cabinet!
 
What little reading I've done on "jitter"

"The chances of there being audible differences between toslink and high grade glass fibre or coax seem slightly higher only because there are more variables between different type cables, but "slightly higher" is still not likely to be significant, as in audible."

Possibly the most ridiculous thing I have read from a poster of your intellect. There have been dozens of articles on jitter published. I think I might read a few before making such sweeping statements.

http://www.google.com/search?hl=en&ie=UTF-8&oe=UTF-8&q=clock+jitter&btnG=Google+Search

http://www.stereophile.com/showarchives.cgi?368



H.H.
 
The definitive answer (hopefully)...

OK, I'll see what I can do about answering your question.

The two methods of transferring data each have their own advantages and disadvantages. On the one hand, coax can, if not properly shielded or if of low quality (not necessarily low cost), be troubled by interference, as the wire can act as an aerial; there are also capacitive effects between the cable and the shield that affect the signal. On the other hand, optical cables do not suffer this problem, although distortion can be imposed on the signal by both the transmitter and receiver (where the electrical signal is transduced to an optical signal or vice versa).

The other main thing to remember about an optical signal is that the "light" can take many paths down the optical fibre, as the fibre is much wider than a single "photon". Therefore, rather than receiving what would be a variation of a square-wave signal (as for a coax cable), the received signal is actually a series of curves (rather than squares) that follow an approximately normal distribution (a symmetrical curve). If the cable is longer, the difference between the longest path down the cable and the shortest path is much greater, and therefore the curve is more spread out. It is this spreading out that can cause an effect like jitter - the signal pulses are less defined and less easy to reference, and therefore the DAC may have problems differentiating between pulses.


Although, for this to happen on a significant enough level, the cable would have to be quite long!! The optical cables running between the USA and the rest of the world under the Atlantic use booster stations every 15 miles. Although this is primarily to boost the signal, as the "light" will have decreased in intensity due to minor flaws in the cable, one must assume that the signal is still recognisably accurate at this point - maintaining enough integrity to be duplicated and re-transmitted.
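The path-spread argument can be put in rough numbers with the textbook step-index formula for modal dispersion, dt = (L * n1 / c) * (n1 / n2 - 1). The refractive-index values here are illustrative assumptions, not TOSLINK specs:

```python
# Back-of-envelope modal-dispersion sketch for a step-index multimode
# fibre. Index values are illustrative assumptions, not TOSLINK specs.
C = 299_792_458.0  # speed of light in vacuum, m/s

def modal_spread_s(length_m, n_core=1.49, n_clad=1.41):
    """Transit-time difference between the straight-through ray and the
    steepest guided ray in a step-index fibre (classic textbook formula)."""
    return (length_m * n_core / C) * (n_core / n_clad - 1.0)

for length in (1, 10, 100):
    print(f"{length:>3} m -> {modal_spread_s(length) * 1e9:.2f} ns of spreading")
```

Even at 100 m, the assumed spread (roughly 28 ns here) is comfortably below the shortest biphase-mark pulse of S/PDIF at 44.1 kHz (on the order of 177 ns), which supports the point that the cable would have to be very long for this to matter.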


I hope this helped?

I was going to draw pictures to attach to this post, but unfortunately I don't have any webspace to host them - if anyone wants a picture to help explain, leave your email address and I'll see what I can do.
 
Ridiculous intellect ?

At least you concede I have an intellect. I'll pass that on to my wife, who has her doubts.

Your citations seem to confirm some things I said.
A: Jitter is measurable.
B: It is a bear to do that measurement - somewhat less so these days, I'm sure, now that the means have been worked out.

Some citations didn't seem especially germane, as they address strictly data-processing and computation concerns where gigahertz is the word of the day.


One of the Google references directly addressed audibility and found that comparing no cable to a 100-meter cable showed audible differences. This would be something I would keep in mind if I were setting up a rock concert or other large public event. I might even think about it if I were in charge of wiring the sound through Bill Gates' house. (I don't mean this facetiously; those would be real issues if I were engaged in those activities. Similarly, I don't concern myself that I use unbalanced rather than balanced interconnects, for the reason that the maximum length is under 10 feet.) However, this says nothing about the choice between two shortish toslink cables of equal length.

Note that at no time did I make a flat statement that I thought there were never, no way, no how, any audible differences between toslink cables. Rather, I said I felt that what differences there may be could be safely disregarded. This means that the differences are not likely to be discerned under normal listening. Maybe the dreaded ABT could draw them out, but I confess to not having the patience to mess with one of those.
 
Although for this to happen on a significant enough level the cable would have to be quite long!!

NO!!!!!!!!!!!!!!

It is worse on a short cable.


I'm not aware that anyone has ever unambiguously demonstrated that the distortion is audible.

Yet more disinformation.

For the curious, I have swapped out toslink cables, and I thought I heard some difference.

So.......if you heard a difference, and noting the types you heard, then why do you cling to the belief that jitter is not audible?

Jocko
 
Hmm... I think annex666 and Jocko are thinking about two slightly different things here...

annex666:

Ever hear of single-mode fibre? If you've ever done anything with microwave or RF waveguides, you'll understand what modes are... basically they represent the number of possible pathways a given EM wave can take down the waveguide. In the case of optical fibre, the EM frequencies are very high, and the waveguide consists of a very thin piece of glass, but the basic principles are identical. Single-mode fibres are built extremely thin, and will only support a single "path" through the cable. Any time dispersion of the signal which occurs in the cable is due only to imperfections in the fibre. I'm no telecom expert, but I believe that single-mode fibres are used for all the high data rate long-haul underwater links and so on, specifically with the intent to eliminate or at least reduce this multipath time dispersion.

Now, I really haven't bothered studying the TOSLINK interface specifically, as I believe it is inferior to a properly implemented coax link... a belief which I base purely on my technical understanding of the two technologies. So, I really don't know if it uses single-mode fibre (I suspect not). The effect of multipath time dispersion in the fibre is therefore debatable... even more so given the extremely short lengths used to connect home audio equipment.

<hr>

For digital audio interfaces, there is a different source for most jitter, which I alluded to earlier and which Jocko is referring to... and the same basic rules apply to both glass and coax: any impedance irregularities (aka index of refraction changes) will introduce a certain amount of energy reflection at the impedance change. Generally, these impedance changes are localized at the source and receiver, and so some energy bounces back and forth between the two, messing slightly with the voltage or light levels - potentially adding to uncertainty in the timing of the logic level transition... jitter. This is basic engineering stuff here folks - nothing magical.
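The boundary-reflection point can be quantified for the coax case with the standard reflection coefficient, gamma = (Z_load - Z0) / (Z_load + Z0). The mismatched load value below is just an example, not a measurement of any real receiver:

```python
# Reflection at an impedance discontinuity (the coax side of the argument).
# S/PDIF over coax is nominally 75 ohm; the load value here is an assumed
# mismatch chosen purely for illustration.

def reflection_coeff(z_load, z0=75.0):
    """Voltage reflection coefficient at a load on a line of impedance z0."""
    return (z_load - z0) / (z_load + z0)

gamma = reflection_coeff(110.0)  # e.g. a sloppy ~110-ohm termination
print(f"|gamma| = {abs(gamma):.3f}, reflected power = {gamma**2 * 100:.1f} %")
```

A perfect 75-ohm termination gives gamma = 0 and no reflected energy, which is why careful impedance matching at the receiving end matters for this mechanism.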

In an optical system there will be additional jitter added by the intrinsic noise of the optical transducers at each end, not to mention the possibility of a low "slew rate", etc. Compared with a coax-only transmission line, there are more potential variables to introduce jitter. Also, it seems to me that a coax system has the <i>potential</i> for more accurate impedance matching at each end, or for the DIYer, at least at the receiving end, where it is arguably most important.

For both methods, one must also think about the jitter produced by the receiver's PLL... yet another variable to consider.

sam9:

Jitter effects are easily measurable and quantifiable. I don't know where you read that it is a bear to measure jitter... you can do it with a PC sound card! Although I really don't care to dig everything up, the calculations to show the acceptable level of sample-to-sample jitter in an LPCM system are relatively simple, and the timing margins are very narrow indeed. If memory serves me, something on the order of a mere 200 ps of jitter is all it takes to represent a single LSB change in a 16/44.1 LPCM sample. Since jitter in the SPDIF interface is often data-correlated, it is very easy to see how much impact it can have on the signal. Consider the well-supported fact that 24-bit PCM sounds considerably superior to 16-bit or even 20-bit, or that mysterious, as-yet unexplained and unmeasurable effects of "high feedback" amplifiers are purportedly responsible for audible degeneration of the sound (my stance on this matter is another subject entirely). These represent some very minuscule differences in the effective "resolution" of a musical signal. At present, jitter is one of the primary limiting factors in digital audio reproduction. To me, it is by no means a stretch of the imagination to understand how jitter can be audible. Food for thought...
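The ~200 ps figure quoted above is easy to sanity-check: take the steepest slope of a full-scale sine and ask how much timing error shifts the sample by one 16-bit LSB. This is the standard worst-case estimate, not a claim about any particular interface:

```python
import math

# Worst-case check of the "~200 ps" figure: how much timing error does it
# take to move a 16-bit sample of a full-scale sine by one LSB?

def jitter_for_one_lsb(f_hz, bits=16):
    """Timing error that shifts the steepest point of a unit-amplitude
    sine of frequency f_hz by one LSB of a `bits`-bit quantiser."""
    lsb = 2.0 / (2 ** bits)          # full scale spans -1..+1
    max_slope = 2 * math.pi * f_hz   # steepest slope of a unit sine, 1/s
    return lsb / max_slope

print(f"{jitter_for_one_lsb(20_000) * 1e12:.0f} ps")  # ~243 ps at 20 kHz
```

A full-scale 20 kHz tone gives roughly 243 ps per LSB, which is in the same ballpark as the quoted 200 ps; lower frequencies or lower amplitudes relax the margin considerably.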
 
Error correction? What error correction?

The S/PDIF format has, at best, a parity bit. This is *NOT* an error-correcting code! The big difference between S/PDIF and other "reliable" digital protocols is that S/PDIF is unidirectional with no flow control.

In other words, even if the receiver knew that a frame was bad, there's really nothing it can do to reliably fix it. This is in comparison to say Ethernet, where you have a 32-bit CRC (plus any additional checking in the payload) and enough bandwidth to retransmit any bad frames.
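The parity point is easy to demonstrate: a single even-parity bit flags any odd number of flipped bits but cannot locate them, and an even number of flips passes unnoticed. A toy sketch (not the actual S/PDIF subframe layout):

```python
# S/PDIF-style parity: one even-parity bit per subframe can flag a single
# flipped bit, but gives the receiver no way to locate or repair it.

def even_parity(bits):
    """True if the frame (payload + parity bit) has an even number of ones."""
    return sum(bits) % 2 == 0

frame = [1, 0, 1, 1, 0, 1]  # payload plus parity chosen so the sum is even
assert even_parity(frame)   # arrives clean: parity checks out

frame[2] ^= 1               # one bit flipped in transit
print("error detected:", not even_parity(frame))  # True: flagged...

frame[4] ^= 1               # ...but a second flip cancels the first
print("error detected:", not even_parity(frame))  # False: missed entirely
```

Even when an error is flagged, a unidirectional link with no retransmission can do little more than mute or interpolate, which is exactly the limitation described above.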

S/PDIF is based on AES/EBU which is a real-time digital transmission protocol for use in studios, often over high quality balanced cabling. With that in mind, the protocol designers decided they could sacrifice error detection / recovery for better real-time response. After all, studios could better afford the expensive cabling and equipment.

When the RIAA and MPAA can finally get off their money-grabbing pedestals, we'll start seeing S/PDIF phased out in favour of IEEE 1394 (aka FireWire or iLink). IIRC, this is a bi-directional connection with error detection, retransmission, and enough bandwidth to make good use of flow control (i.e. send data faster than it is played, stop when the buffers are full, send again when the buffers reach a low watermark). At that point, short of really shoddy cables, jitter will no longer be an issue.
 