Data quality: streaming via wi-fi vs internal makes any difference?

Status
Not open for further replies.
That is a subjective opinion. I don't mean that it is not true, but CD players drop a hell of a lot of bits when playing back audio CDs, and many people don't seem to mind or notice.

I have a friend with a $2000 tube amp that hums like a motor through one channel, and he doesn't mind that. He will buy the highest specification recording and play it back through a system that has 2V of hum and 15% distortion.

I also have friends who point out subtle differences between left and right channels caused by room reflections, when playing mono recordings. I cannot agree that bits dropped *will* be audible, though I agree that they *may* be.
 
I owned only a NAD c521 before switching out of players completely.

Anyway, the same CD, when recorded via the digital output, could not produce CRC-matched files in three tries.

I gave up after the third try.
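That CRC comparison is easy to reproduce in a few lines of Python. This is only a sketch: two temporary files stand in for the captured .wav files, and `crc32_of` is a hypothetical helper name, not anything from the posts above.

```python
import os
import tempfile
import zlib

def crc32_of(path, chunk=1 << 20):
    """CRC32 of a file, read in 1 MiB chunks so large captures fit in memory."""
    crc = 0
    with open(path, "rb") as f:
        while block := f.read(chunk):
            crc = zlib.crc32(block, crc)
    return crc & 0xFFFFFFFF

payload = bytes(range(256)) * 1000          # stand-in for captured audio data
paths = []
for _ in range(2):
    fd, p = tempfile.mkstemp()
    with os.fdopen(fd, "wb") as f:
        f.write(payload)
    paths.append(p)

crcs = [crc32_of(p) for p in paths]
print("bit-identical" if crcs[0] == crcs[1] else "captures differ")
for p in paths:
    os.remove(p)
```

Successive captures with matching CRCs would indicate a bit-exact chain; in the test described above, they never matched.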

The OP is using asynchronous USB. How does your 100 µs glitch foil that?

Over async USB, the possibility of errors is lower if the driver controls the stream and is well coded.

However, glitches are not guaranteed never to foil the stream.

My Benchmark DAC2 also uses async USB, and in my recent correspondence with tech support over a glitch in the XMOS recognition, they noted that USB errors can and do occur, and can and do interrupt the audio stream.

This is Benchmark, not some audiophool company.
 
Bill, like I said, what is audible or not is moot. The fact is there are people who will look at that one bit and make a fuss about it, while for others it won't matter if it is half the CD.

The point I'm trying to make is that this debate will go on, but the fact is that differences that do not exist in a theoretical framework may exist in a real system.

And one shouldn't think a person chasing down that last bit is a fool.
 
How does comparing a capture off S/PDIF for CRC relate to USB-DAC performance out of JRiver? No one is claiming that you can get a bit-exact stream off a CD player (note: not streaming off a CD) every time. If someone said CDs sound better when ripped and played back on a USB DAC, I would not have a problem seeing how that might work. NAS vs. internal hard drive is what gives me problems with the mechanism.

Chasing down that last bit is fine, but understanding where it might get lost is a good starting point.
 
However, since Wi-Fi uses resources on less capable parts of the PC subsystem (such as the USB PHY or an internal one), it is possible for an active Wi-Fi device to introduce glitches in the playback stream.
Yes, I had an old laptop that was terrible about this. Easy to hear, easy to test for. A question of system interrupts.

We have to remember that we are talking about two different things: the data itself (easy to transport) and the playback. Different playback methods may affect the system in different ways: PSU noise, RFI, etc. That should also be easy to measure.
 
I did a few loopback tests (burn CD -> CD player -> S/PDIF -> PC); once the original and recorded waves are aligned, you basically get two types of CD players:

- Those that work: zero bit errors unless the CD has lots of scratches and damage.
- Those that don't: usually all samples are wrong due to a buggy volume control that can't reach 100% (e.g., the CD723), or other software bugs.

This has no relation to the price or "quality" of the transport mechanism. Even a $1 pickup in a cheap junk CDP can read a CD perfectly as long as it isn't covered in scratches.

Doing a loopback test is easy, and even on a PC setup it will tell you if some piece of software somewhere is misbehaving without your consent, perhaps applying resampling or volume control with not-so-good algorithms.
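The core of that loopback comparison can be sketched in Python, assuming both signals are already available as sample lists (real use would decode the two .wav files first). `align_offset` is a hypothetical helper that brute-forces the capture delay; the synthetic "recording" below has 123 samples of leading silence.

```python
def align_offset(original, recorded, search=1000):
    """Find the delay (in samples) at which `recorded` best matches `original`."""
    best, best_err = 0, None
    for off in range(min(search, len(recorded))):
        n = min(len(original), len(recorded) - off)
        err = sum(a != b for a, b in zip(original[:n], recorded[off:off + n]))
        if best_err is None or err < best_err:
            best, best_err = off, err
    return best

original = [i % 97 for i in range(5000)]     # stand-in for the burned samples
recorded = [0] * 123 + original              # capture with 123 samples of silence

off = align_offset(original, recorded)
n = min(len(original), len(recorded) - off)
errors = sum(a != b for a, b in zip(original[:n], recorded[off:off + n]))
print(off, errors)    # a "working" player: offset recovered, zero bit errors
```

A player from the second category would show errors on nearly every sample, even after the best alignment.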

I have no problems believing the guys who report "mystical effects" like stuff sounding different on wifi, ethernet or HDD even if the bits are the same. I've got a laptop where the internal power supply is so bad that every time the HDD head actuator moves, the headphone output makes a very audible noise. And sticking a 2.4GHz radio transmitter near RF-sensitive audio opamps "may" affect the results...

I've even got a small PC where the MLCCs in the power section sing according to load current. If you plug in a USB DAC, which makes the chipset wake up and do some processing at 8 kHz, drawing some current... then the PC emits an 8 kHz tone without being connected to a speaker! (If it's a USB 1 DAC, it emits a 1 kHz tone.) Makes a nice sonic CPU load indicator! What a P.O.S...
 
If you are transferring data over a TCP network, there will not be dropped packets visible to the user, by definition of the protocol. There may be plenty of behind the scenes drops, but the whole design intent of the protocol is to hide that from the user.

Of course there can be delays that, in an application like audio streaming, have the same effect as data loss. But a CD-quality stream (16/44) is nothing for modern networks, wired or wireless. If you are getting delays/stutters/skips in your stream, your network is not set up correctly, or is overloaded. I suppose if multiple people are streaming ultra-HD video over the same wireless network you could get into trouble. But again, this would be blatant degradation of the sound, not some nuanced effect.
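That delivery guarantee is easy to demonstrate: push a stand-in stream through a localhost TCP connection and hash both ends. This is only a sketch of the principle; any retransmissions happen invisibly inside the kernel, which is exactly the point.

```python
import hashlib
import socket
import threading

payload = bytes(range(256)) * 4096        # ~1 MiB stand-in for a 16/44 stream

srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)

def sender(dest):
    s = socket.socket()
    s.connect(dest)
    s.sendall(payload)                    # TCP handles any retransmission itself
    s.close()

t = threading.Thread(target=sender, args=(srv.getsockname(),))
t.start()
conn, _ = srv.accept()
received = bytearray()
while chunk := conn.recv(65536):
    received += chunk
t.join(); conn.close(); srv.close()

intact = hashlib.sha256(received).digest() == hashlib.sha256(payload).digest()
print("bit-exact:", intact)
```

Whatever happens at the packet level, the byte stream handed to the application is identical to the one that was sent, or the connection errors out; there is no middle ground where the user silently receives altered samples.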

It's all buffered in the computer's RAM anyway. Main memory can of course have errors, but failing memory usually manifests itself in much more obvious ways; i.e., your system will be very unstable. There's no 100% perfect test for RAM, but running something like prime95 or memtest86 for 48 hours will provide as much confidence as you can reasonably have.

If you burn two CDs and believe they are not bit-perfect copies of each other, there are easy ways to test. If your system has two optical drives, you can treat both discs as data blobs and do a byte-by-byte compare. Only one drive? Treat each disc as a data blob and compute a checksum, like SHA-512, on each disc. If the checksums match, the data is the same (to within statistical probability; the odds of a collision are about 1 in some ludicrously huge number).
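The two-drive case can be sketched as a chunked byte-by-byte compare. In this sketch, temporary files stand in for the ripped disc images, and a single flipped bit demonstrates the mismatch case; `identical` and `write_tmp` are hypothetical helper names.

```python
import os
import tempfile

def identical(path_a, path_b, chunk=1 << 20):
    """Chunked byte-by-byte comparison of two files."""
    if os.path.getsize(path_a) != os.path.getsize(path_b):
        return False
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        while True:
            a, b = fa.read(chunk), fb.read(chunk)
            if a != b:
                return False
            if not a:                      # both files exhausted together
                return True

def write_tmp(data):
    fd, path = tempfile.mkstemp()
    with os.fdopen(fd, "wb") as f:
        f.write(data)
    return path

good = bytes(2048) * 64
bad = bytearray(good)
bad[100] ^= 1                              # flip a single bit

p1, p2, p3 = write_tmp(good), write_tmp(good), write_tmp(bytes(bad))
same_result = identical(p1, p2)            # byte-exact copies
diff_result = identical(p1, p3)            # one bit differs
print(same_result, diff_result)
for p in (p1, p2, p3):
    os.remove(p)
```

The single-drive checksum variant is the same loop feeding `hashlib.sha512().update()` instead of comparing chunks directly.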

I could conceive of bitwise identical CDs sounding different in a very contrived situation. Say one burn was done on poor quality media - the data is there, but the reader has to do an awful lot of extra re-reads to compensate for the marginal media quality. If there is no buffering anywhere downstream, and the DAC is synchronous, then I suppose the stars could align just right such that the timing degrades sound quality. But in this scenario, I think there's a fine line between obvious degradation (skips/pauses/repeats) and nuanced "audiophile effects".

Back to networks - now that I think about it, there are at least three buffers for incoming data: the network adapter itself, the operating system's network stack, and the application itself. At an abstract level, it's the same for a local disk: disk cache, OS read cache, application buffer. So unless either device is already saturated, a 16/44 stream is a trivial load for reasonably recent, correctly functioning hardware.
 
Some time ago I tested a DIY ethernet/FPGA board. I put two 10 m Cat5 cables in series with some crap adapter, the board at one end and the PC at the other end with a €5 Realtek NIC.

Loopback test, 100% link utilization, transmitting UDP data. Unlike TCP, UDP does not automatically retransmit lost packets, which makes it better for realtime data, since you can handle the ACKs/retransmissions yourself.

Number of bit errors / lost packets : zero.

It ran for a week, that's a lot of bits...
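A localhost version of that UDP loopback can be sketched in Python: number each packet, then count sequence gaps (drops) and payload mismatches (corruption). This is only the counting logic; the real test would put the cable and NIC under test between a separate sender and receiver, and run for far more than 100 packets.

```python
import socket
import struct

recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 1 << 20)
recv_sock.bind(("127.0.0.1", 0))
recv_sock.settimeout(2.0)
addr = recv_sock.getsockname()

send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

N, PAYLOAD = 100, b"x" * 60
for seq in range(N):
    send_sock.sendto(struct.pack("!I", seq) + PAYLOAD, addr)

lost = corrupt = expected = 0
try:
    while expected < N:
        data, _ = recv_sock.recvfrom(2048)
        seq, = struct.unpack("!I", data[:4])
        if data[4:] != PAYLOAD:
            corrupt += 1
        lost += seq - expected        # gap in sequence numbers = drops
        expected = seq + 1
except socket.timeout:
    lost += N - expected              # tail of the stream never arrived
send_sock.close(); recv_sock.close()
print(f"lost={lost} corrupt={corrupt}")
```

Over a healthy link, a week of this at full utilization with zero errors, as reported above, is a lot of bits indeed.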
 
 
It's all buffered in the computer's RAM anyway. Main memory can of course have errors, but failing memory usually manifests itself in much more obvious ways; i.e., your system will be very unstable. There's no 100% perfect test for RAM, but running something like prime95 or memtest86 for 48 hours will provide as much confidence as you can reasonably have.

Back to networks - now that I think about it, there are at least three buffers for incoming data: the network adapter itself, the operating system's network stack, and the application itself. At an abstract level, it's the same for a local disk: disk cache, OS read cache, application buffer. So unless either device is already saturated, a 16/44 stream is a trivial load for reasonably recent, correctly functioning hardware.

Definitely agree regarding the RAM, and it depends on the rest of the architecture as well! An OS running from ROM is not the same as one running from a hard disk or SSD! Windows is not Linux, etc.

If streaming 16/44 data is trivial as far as the bytes go, it seems not to be trivial with regard to jitter, even between different pieces of hardware in the same PC, measured at the outputs: USB, for example!

It could be interesting to compare the best compact SD-card units seen in some projects against Linux on a recent PC with well-managed partitions on SSDs, configured for audio streaming!

No fragmented data disks, and of course bytes verified (checksum of the recorded data), versus buffered TCP from outside into that same PC.

And the same comparison with two PCs: data already inside the PC (not talking about reception from outside) to its output! Interesting what Peufeu measured himself: things can be heard with the same DAC output and different PCs!

Now with the same PC and different asynchronous cards using the same crystals (e.g. the DIYINKH and the Wave I/O), people hear subjective differences with the same PC and DAC! XMOS programming, interconnects between the boards, etc.?

The problem is knowing where to measure in order to solve the problems afterwards! When I read that one of the engineers of the SB Touch said that the little embedded SMPS of the S/PDIF output is the guilty jitter generator, despite the crystal clocking it, nothing can surprise me anymore. Identifying the faulty parts and the causes is a difficult and subtle job, and it generates phrases like those reported in the first post! It is very difficult to know what we are talking about without heavy investigation.

Is it all about jitter and how it looks between the data and the input of the DAC chip? I.e., the shape of the square wave relative to an ideal, perfectly timed reference signal? Or a few bytes that disappear between the RAM, the hardware and the DAC chip, which could affect the sound?

I didn't really understand what Pano explained about glitches and streaming, as I have a poor understanding of jitter (or any other cause) and of audio streaming inside the hardware (I'm not talking about wide networks, but about inside a PC to the DAC chip, or a CD player to the DAC chip).

Interesting to read that some have already made tests and measurements. Does a FIFO with a clock generator glued onto the DAC chip, plus isolators for ground loops and noise, solve everything?

Are layout, hardware and power-supply management before the DAC chip among the guilty parties?

Not to mention whether the checksum of our data libraries is still the same after some hundreds of hours (lost sectors, etc.)!
 
Definitely agree regarding the RAM, and it depends on the rest of the architecture as well! An OS running from ROM is not the same as one running from a hard disk or SSD! Windows is not Linux, etc.

Well, they are. On a typical PC architecture computer, the output is handled by a controller that handles the actual output of the data independently of the main CPU. All the CPU needs to do is occasionally fill up the buffer, but that is not time-critical.

The problem is knowing where to measure in order to solve the problems afterwards! When I read that one of the engineers of the SB Touch said that the little embedded SMPS of the S/PDIF output is the guilty jitter generator, despite the crystal clocking it, nothing can surprise me anymore. Identifying the faulty parts and the causes is a difficult and subtle job, and it generates phrases like those reported in the first post! It is very difficult to know what we are talking about without heavy investigation.
And that has of course been done, over and over again, by professionals.

There are two basic DAC clocking designs - you either derive your audio clock from the incoming data clock, or you generate your clock independently in the DAC, and use a FIFO or asynchronous sample rate converter to bridge the two clock domains.

The former, simpler design can be affected by incoming data jitter. That jitter is caused by the data clock of the output device, and to some degree affected by the actual interface electronics (USB, SPDIF or whatever) and thus possibly even their power supply noise.
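The trade-off in the second design can be illustrated with a toy model: a plain FIFO bridging two free-running clocks only postpones the problem, because any rate mismatch eventually drains or fills it. The function name and the 100 ppm mismatch figure below are illustrative assumptions, not from the posts.

```python
def dac_samples_until_underrun(fifo_size=512, ppm=100):
    """DAC clock runs `ppm` parts-per-million faster than the source clock.

    Returns how many DAC output samples pass before the half-full FIFO
    underruns. (With ppm=0 the clocks match and the FIFO never drains.)
    """
    fill = fifo_size / 2.0                   # start half full
    per_sample = 1.0 / (1.0 + ppm * 1e-6)    # source samples per DAC sample
    n = 0
    while 0.0 < fill < fifo_size:
        fill += per_sample - 1.0             # net drain per DAC output sample
        n += 1
    return n

n = dac_samples_until_underrun()
print(f"underrun after {n} samples (~{n / 44100:.0f} s at 44.1 kHz)")
```

This is why such designs pair the FIFO with an asynchronous sample rate converter, or gently trim one clock toward the other, rather than relying on the buffer alone.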

Is it all about jitter and how it looks between the data and the input of the DAC chip? I.e., the shape of the square wave relative to an ideal, perfectly timed reference signal?
Not so much the shape of the signal as the stability of it.

A few bytes that disappear between the RAM, the hardware and the DAC chip, which could affect the sound?
Bytes don't disappear.

Interesting to read that some have already made tests and measurements. Does a FIFO with a clock generator glued onto the DAC chip, plus isolators for ground loops and noise, solve everything?
As related to incoming data, yes. The ultimate output jitter of the DAC is still mostly determined by the stability of the DAC clock.

Not to mention whether the checksum of our data libraries is still the same after some hundreds of hours (lost sectors, etc.)!
If you have data corruption/loss on your disks, then you have bigger problems than just possible jitter in your audio. Do your spreadsheets occasionally lose some numbers?
 

You really don't need to; you won't find a ghost in the machine. Data transfer is solid, and you really have to hammer a PC to cause problems; just having one running, say, Squeezebox is a breeze for it.
PC memory, parity check bits, etc.

As to the SB Touch comments, I would like more elaboration; it sounds fishy to me...

As to jitter, I have my own views; quite honestly, I think a lot of the figures people worry about are in the realm of fantasy. Like many other things in audio, it gives people some demon to worry about, and gives the audiophile gurus another area to receive worship for solving a non-problem with some crazy ideas (external clocks with wires down to the board). These add-ons probably add more jitter than they remove, but I do believe the added noise is perceived as a sound improvement.
Another thread has multiple DACs with their outputs summed, yet there are different start-up times for each DAC (a few clock cycles apart), and the resultant sound is perceived as better. My question: is this real hi-fi, or just people playing who are way out of their depth in understanding?
Sorry to rant, but I have just spent a few weeks working on near-space avionics, and we had fewer concerns about jitter, cables, etc. than there seem to be on audio forums; yet we could solve any problem with engineering.
Data read from an SSD or an HD will be exactly the same; no difference, none whatsoever...
Memory problems will be catered for in the data transfer...
Arghhhhhhhhhhh:tilt::tilt:

So glad to be back in the real world of electronics (audio), much more critical than life/mission-critical design 😀😀
 
As to jitter, I have my own views; quite honestly, I think a lot of the figures people worry about are in the realm of fantasy.

Yes, it is like worrying about 0.01% vs. 0.02% harmonic distortion. I guess modern audio systems are just too good, so people have to resort to voodoo to feel "in control".

Sorry to rant, but I have just spent a few weeks working on near-space avionics, and we had fewer concerns about jitter, cables, etc. than there seem to be on audio forums; yet we could solve any problem with engineering.
Ah, but that is just rocket science, not something *really* hard and mysterious, like audio... 🙂
 
Near rocket science...🙂
Just going to start redoing a router/switch design; again, if your audio is streamed through one of these, the data is buffered and re-clocked anyway...
I would love to do an integrated design: DAC, preamp and active crossover (digital or analogue) as one module. It could be made very small; in fact a whole system would be possible these days. You could then remove the long connections between units, control ground loops properly, minimise signal distances and maximise signal integrity... even have the storage in there, possibly only omitting the power amps so active speakers could be used. One PCB, designed and optimised for signal integrity from beginning to end, with data from storage to DAC travelling a few mm, doing everything to minimise the causes of jitter.
 