Audiophile Ethernet Switch

Status
Not open for further replies.
You guys do realise that the player software pre-fetches and buffers the music, right?

You don't say :p

It also loads a decoded file in memory. And the dac has a Fifo and independent clocks in front. So, yes, on this front we are most likely all set.

But what has any of this to do with improving the switches for audio? Nothing at all? Getting a bit tiresome to read platitudes and "bits are bits" revelations.

Nobody is thinking there are bit or timing errors; this was acknowledged many times in this thread and you continue going back to the same lame arguments.
 
I can understand that some switches might inject less rubbish into the ground of the DAC - although the DAC ought to cope with this.

I can't comment on network-connected DACs because I don't have one. As noted, the LAN I was discussing connects a server to a NAA, not to a DAC.

Whatever, drivers for 1,000 Mb/s Ethernet are, as you know, orders of magnitude more processor-intensive than those for 10/100 Mb/s. I presume it's the reduced load that improves the sound (IOW, improves the integrity of the outgoing, real-time data stream) at the slower speeds.

Even at 10 Mb/s, the likes of 88.2 kHz/16-bit stereo needs only about a quarter of the available bandwidth; I see no point in running faster.
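As a sanity check on that arithmetic, here is a trivial calculation (my own sketch, assuming stereo and ignoring packet framing overhead):

```python
def pcm_bitrate(sample_rate_hz, bits_per_sample, channels=2):
    """Raw PCM payload bit rate in bits/s (ignores Ethernet framing overhead)."""
    return sample_rate_hz * bits_per_sample * channels

rate = pcm_bitrate(88_200, 16)   # 88.2 kHz / 16-bit stereo
link = 10_000_000                # 10 Mb/s Ethernet
print(rate, rate / link)         # 2822400 bits/s, about 28% of the link
```

So even the slowest common Ethernet speed leaves the link mostly idle for this material.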

I cannot see how changing a 3.3V regulator can have any effect on this, unless the old one is oscillating or something similarly pathological.

I'm sorry you can't see it, but I can assure you that the change is as marked as it is repeatable. Nor was I greatly surprised by it after someone with significant LAN-related expertise suggested I try it. (I should have thought of it for myself.)

Long experience in a non-audio field taught me that computers driving real-time processes give rise to issues well outside the comfort zone of many otherwise competent technicians.

Not for the first time, even this thread shows that many prefer to snort and sneer rather than stop and think. Instead of saying "anyone who reports stuff like this is deluded, an idiot, or not neeeeeearly as smart as I am", why do folk not wonder whether there might be something in the reports and that - Heaven forfend - one might just learn something?

They could, e.g., usefully offer a critique of the two reports I cited above. They even include measurements. (Sadly, the measurements were made by engineers with pertinent skills and fancy kit. Seems they didn't know that all they needed was a low-grade PC soundcard and that "Audio is sent the same way as any other data sent across an IP network".)

D
 
Ryelands said:
Long experience in a non-audio field taught me that computers driving real-time processes give rise to issues well outside the comfort zone of many otherwise competent technicians.
Now that I can agree with, having similar experience myself. As a result, I would never trust a general operating system to do anything time-specific - including sending audio data at some particular time. Hence it cannot matter when the data arrives because in a good system it does not matter when it was sent.
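The point about arrival time being irrelevant can be sketched as a toy buffer model (entirely my own construction, with made-up numbers): packets arrive in irregular bursts, the player drains a FIFO at a fixed rate, and arrival timing only matters at all if the buffer runs dry.

```python
def play(arrivals, drain_per_tick, buffer_start):
    """Count underruns when draining a FIFO at a constant rate."""
    level = buffer_start
    underruns = 0
    for burst in arrivals:           # bytes delivered this tick (0 or a burst)
        level += burst
        if level >= drain_per_tick:
            level -= drain_per_tick  # the DAC side consumes at a constant rate
        else:
            underruns += 1           # only here does arrival timing matter
            level = 0
    return underruns

# Same average rate as the drain, but delivered in lumpy 8-tick bursts:
bursty = [8000 if t % 8 == 0 else 0 for t in range(1000)]
print(play(bursty, 1000, 50_000))    # prints 0: no underruns, jitter absorbed
```

With any sensible headroom the output cadence is set purely by the local playback clock, never by when the packets happened to arrive.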

My experience in industry also taught me that people are very good at fooling themselves, especially when they believe they are immune from this.
 
You don't say :p

It also loads a decoded file in memory. And the dac has a Fifo and independent clocks in front. So, yes, on this front we are most likely all set.

But what has any of this to do with improving the switches for audio? Nothing at all? Getting a bit tiresome to read platitudes and "bits are bits" revelations.

Nobody is thinking there are bit or timing errors; this was acknowledged many times in this thread and you continue going back to the same lame arguments.

It has everything to do with it. It makes the alleged effects of a switch irrelevant.

Edit: I'm deleting this thread because I fear I may post something that will get me banned for calling some people what they are.
 
Finally did a little research.

1. The majority of modded switches are based on the D-Link DGS-108. For some reason (cheapness?) it makes a good base for experiments.

2. Apart from Aqvox, who use black and white goo, everyone else is addressing power supplies, clocking and shielding.

3. A ground terminal seems to be a good idea. Presumably it goes to a grounding box where the streamer and dac are also grounded.

4. The use of an expensive OCXO really surprises me. I cannot see the point of nailing the frequency so accurately. OTOH, none of the modders are likely to be throwing money away. If anyone has a wild idea why a precise frequency is beneficial, please share.
 

Attachments

  • The Linear Solution OXCO Switch.jpg
The OCXO serves no purpose for clocking in an ethernet switch because the devices synchronise their receiver clocks from the preamble and start frame delimiter at the start of data transmission.

The same PLL clock recovery is used in S/PDIF receivers. However, I still remember the sound improved on my old DAC when I upgraded the stock XO to a TCXO. That 11.2896 MHz TCXO was feeding into the YM3436. The connection between them is very interesting.
 
Ethernet data is always delivered clean i.e. unpolluted by data errors. The timing doesn't matter; if it causes problems then your buffers are too small. Now you may be saying that this switch somehow reduces ground loops or other electrical problems which may affect the recipient of that clean uncorrupted Ethernet data, but it certainly does not improve the data in any way.

Correct. However, there is little chance of a ground loop, as ethernet is magnetically coupled.
 
I still remember the sound improved on my old DAC when I upgraded the stock XO to a TCXO.

Perhaps not a good analogy. The TCXO in your dac was directly responsible for the data clocking. Why a TCXO? I guess because at the time there was no choice of low-phase-noise clocks, and coincidentally TCXOs were better sounding than the cheap crystals.

That ethernet clock has no connection whatsoever to the dac clock, so whatever mechanism is at play is likely something different.

Pink Faun are using exactly the same type of VCXO (Connor-Winfield) as the audio-frequency clock. At a 10 Hz offset it has about the same phase noise as the popular and cheaper Crystek 957. Not sure I understand the usefulness of a VCXO in that case either.
 
People often 'upgrade' clocks because they can. Whether the 'upgrade' has any useful electrical effect is a matter of debate in some circumstances, and known not to make any difference in other circumstances. In many cases the 'upgrade' may be from an oscillator with good short-term stability (i.e. jitter) but slightly poor long-term stability (and accuracy) to an oscillator with slightly poorer short-term stability but better long-term stability; thus they have swapped things in the wrong direction, as it is short-term stability which matters for audio. Anything which makes a clock adjustable, including voltage control and temperature stabilisation, is likely to degrade short-term stability.
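That trade-off can be illustrated numerically (toy figures of my own choosing, not measurements of any real oscillator): one clock with an accurate mean frequency but noticeable cycle-to-cycle jitter, another with a 100 ppm frequency offset but a very quiet period.

```python
import random
import statistics

def periods(ppm_offset, jitter_rms, n=100_000, seed=0):
    """Simulated clock periods: a fixed frequency error plus white period jitter."""
    rng = random.Random(seed)
    mean = 1.0 * (1 + ppm_offset * 1e-6)   # normalised nominal period = 1.0
    return [rng.gauss(mean, jitter_rms) for _ in range(n)]

accurate_but_jittery = periods(ppm_offset=0,   jitter_rms=1e-3)
offset_but_quiet     = periods(ppm_offset=100, jitter_rms=1e-6)

# Period-to-period spread is what smears sample timing (short-term stability):
print(statistics.stdev(accurate_but_jittery))  # ~1e-3
print(statistics.stdev(offset_but_quiet))      # ~1e-6
# The 100 ppm error only shifts the mean period, i.e. a fixed, tiny pitch offset:
print(statistics.mean(offset_but_quiet))       # ~1.0001
```

The "accurate" clock is a thousand times worse on the figure that matters for audio, while the frequency offset is merely a constant, inaudible stretch of the timebase.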
 
OK, finally I have a hypothesis :)

It seems many of these vendors also sell ethernet cards and streamers, all of which have also been modified with OCXOs. It is perhaps pointless to have a very accurate frequency reference in a switch, but once it works alongside other modified equipment it starts making sense. Not sure how the synchronisation mechanism works, but tightly controlled clocking means this mechanism will have to work less hard. In the case of a PLL this means less hunting.
 
I am not sure it needs precise frequency, or gets any benefit from precise frequency. It is a long time since I studied Ethernet, but I seem to recall that the clocks at the sender and receiver are not locked together in frequency but merely close enough in frequency so that if they start out together then by the end of the packet they are still close enough to get correct data slicing. The specification will say what this frequency requirement is. The packet preamble gets the two data clocks aligned; that is, it is more a matter of aligning phases than aligning frequencies.

It could even be as simple as having, say, 8 versions of the clock with different phases and choosing the one which recognises a preamble. There may be no PLL and no hunting; nothing to adjust. The clock is either within spec and will always work, or out of spec and will sometimes fail. Hence there may be no advantage at all in having a 'better clock', apart from a nice warm glow inside the user and an even warmer glow in the bank account of the seller.
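The "pick a phase" idea above can be demonstrated with a toy 8x-oversampled receiver (my own sketch, not how any real PHY is implemented): try all eight sampling phases and keep the one that decodes the known alternating preamble. Note that because the preamble alternates, shifts of an even number of bits are indistinguishable, which is one reason the real start frame delimiter ends in 11, to mark where the frame proper begins.

```python
PREAMBLE = [1, 0] * 28      # Ethernet preamble: 56 alternating bits
OVERSAMPLE = 8              # receiver samples the line 8x per bit (toy figure)

def line_signal(bits, phase_offset):
    """Each bit held for OVERSAMPLE samples; the receiver starts mid-bit."""
    samples = [b for b in bits for _ in range(OVERSAMPLE)]
    return samples[phase_offset:]

def pick_phase(samples, nbits=48):
    """Return the first sampling phase that decodes the preamble correctly."""
    for phase in range(OVERSAMPLE):
        decoded = [samples[phase + i * OVERSAMPLE] for i in range(nbits)]
        if decoded == PREAMBLE[:nbits]:
            return phase
    return None             # clock out of spec: no phase works

print(pick_phase(line_signal(PREAMBLE, phase_offset=5)))   # prints 0
print(pick_phase(line_signal(PREAMBLE, phase_offset=11)))  # prints 5
```

Nothing here is adjusted or "locked"; a phase is simply chosen once per packet, so there is no loop for an OCXO to quieten.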
 